Tech

Bias in AI: What can blockchains do to ensure fairness?

Experts believe decentralized systems can help secure the integrity and objectivity of the data fed to AI systems, but clear limitations remain.

Projects rooted in artificial intelligence (AI) are fast becoming an integral part of the modern technological paradigm, aiding decision-making processes across sectors from finance to healthcare. However, despite significant progress, AI systems are not without flaws. One of the most critical issues facing AI today is data bias: systematic errors in a data set that skew the results of the machine learning models trained on it.

Because AI systems rely so heavily on data, the quality of the input is of utmost importance: skewed information can embed prejudice in the system, which can in turn perpetuate discrimination and inequality in society. Ensuring the integrity and objectivity of data is therefore essential.

For example, a recent article explores how AI-generated images, specifically those created from data sets dominated by American-influenced sources, can misrepresent and homogenize the cultural context of facial expressions. It cites several examples of soldiers or warriors from various historical periods, all with the same American-style smile.

An AI-generated image of Native Americans. Source: Medium

Moreover, this pervasive bias not only fails to capture the diversity and nuance of human expression but also risks erasing vital cultural histories and meanings, potentially affecting global mental health, well-being and the richness of human experience. To mitigate such partiality, it is essential to incorporate diverse and representative data sets into AI training processes.

Several factors contribute to biased data in AI systems. First, the collection process itself may be flawed, with samples that are not representative of the target population, leading to the under- or overrepresentation of certain groups. Second, historical biases can seep into training data and perpetuate existing societal prejudices; for instance, AI systems trained on biased historical data may continue to reinforce gender or racial stereotypes.

Lastly, human biases can be introduced inadvertently during data labeling, as labelers may harbor unconscious prejudices. The choice of features or variables used in an AI model can also produce biased outcomes, since some features correlate more strongly with certain groups, leading to unfair treatment. To mitigate these issues, researchers and practitioners need to be aware of potential sources of bias and actively work to eliminate them.
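
The sampling problem described above can be made concrete with a quick representativeness check: compare group proportions in a training sample against a reference population. The sketch below is a minimal illustration; the group labels, reference shares and tolerance threshold are all hypothetical, not drawn from any real audit.

```python
from collections import Counter

def representation_gaps(sample_labels, reference_shares, tolerance=0.05):
    """Flag groups whose share in the sample deviates from the reference
    population by more than `tolerance` (absolute proportion difference)."""
    counts = Counter(sample_labels)
    total = sum(counts.values())
    gaps = {}
    for group, expected in reference_shares.items():
        observed = counts.get(group, 0) / total
        if abs(observed - expected) > tolerance:
            gaps[group] = round(observed - expected, 3)
    return gaps

# Hypothetical example: group B is underrepresented relative to a
# population where it makes up 40% of people.
sample = ["A"] * 80 + ["B"] * 20
reference = {"A": 0.6, "B": 0.4}
print(representation_gaps(sample, reference))  # {'A': 0.2, 'B': -0.2}
```

A check like this only catches crude sampling skew; it says nothing about label quality or feature correlations, which need separate auditing.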

Can blockchain make unbiased AI possible?

While blockchain technology can help keep AI systems neutral in certain respects, it is by no means a panacea for eliminating bias altogether. Machine learning models develop discriminatory tendencies based on the data they are trained on: if the training data contains predispositions, the system will likely learn and reproduce them in its outputs.

That said, blockchain technology can contribute to addressing AI biases in its own unique ways. For example, it can help to ensure data provenance and transparency. Decentralized systems can track the origin of the data used to train AI systems, ensuring transparency in the information collection and aggregation process. This can help stakeholders identify potential sources of bias and address them.
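
The provenance idea can be sketched as a minimal hash-chained log: each data contribution is hashed together with the previous entry, so any later tampering with an earlier record is detectable. This is an illustrative toy, not how any particular blockchain works; a real system would add digital signatures, consensus and distributed storage.

```python
import hashlib
import json

def append_record(chain, record):
    """Append a provenance record, linking it to the previous entry's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    chain.append({"record": record, "prev": prev_hash,
                  "hash": hashlib.sha256(payload.encode()).hexdigest()})
    return chain

def verify(chain):
    """Recompute every link; returns False if any entry was altered."""
    prev_hash = "0" * 64
    for entry in chain:
        payload = json.dumps({"record": entry["record"], "prev": prev_hash},
                             sort_keys=True)
        if entry["prev"] != prev_hash or \
           hashlib.sha256(payload.encode()).hexdigest() != entry["hash"]:
            return False
        prev_hash = entry["hash"]
    return True

chain = []
append_record(chain, {"source": "survey-2023", "rows": 10_000})  # hypothetical sources
append_record(chain, {"source": "web-crawl", "rows": 250_000})
assert verify(chain)

chain[0]["record"]["rows"] = 99  # tamper with an earlier contribution
assert not verify(chain)
```

The point of the hash chain is that auditors can verify where training data came from without trusting the party that assembled it.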


Similarly, blockchains can facilitate secure and efficient data sharing among multiple parties, enabling the development of more diverse and representative data sets.

Also, by decentralizing the training process, blockchain can enable multiple parties to contribute their own information and expertise, which can help mitigate the influence of any single biased perspective.
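
One common way to decentralize training along these lines is federated averaging: each party trains on its own data and shares only model parameters, which are then averaged into a global model. The sketch below is deliberately simplified (plain lists, equal weighting per party) and is not the protocol of any specific project mentioned here.

```python
def federated_average(party_weights):
    """Average model parameters contributed by multiple parties.

    `party_weights` is a list of parameter vectors, one per party;
    no party's raw training data ever leaves its owner.
    """
    n_parties = len(party_weights)
    n_params = len(party_weights[0])
    return [sum(w[i] for w in party_weights) / n_parties
            for i in range(n_params)]

# Three hypothetical parties with locally trained parameters.
updates = [
    [1.0, 4.0, -2.0],
    [3.0, 2.0,  0.0],
    [2.0, 6.0, -1.0],
]
print(federated_average(updates))  # [2.0, 4.0, -1.0]
```

Real federated learning systems weight each party's update by its data set size and add secure aggregation, but the core mitigation is the same: no single participant's skew dominates the shared model.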

Maintaining objective neutrality requires careful attention to the various stages of AI development, including data collection, model training and evaluation. Additionally, ongoing monitoring and updating of AI systems are crucial to addressing potential prejudices that may arise over time.

To gain a deeper understanding of whether blockchain tech can make AI systems completely neutral, Cointelegraph reached out to Ben Goertzel, founder and CEO of SingularityNET — a project combining artificial intelligence and blockchain.

In his view, the concept of “complete objectivity” is not really helpful in the context of finite intelligence systems analyzing finite data sets.

“What blockchain and Web3 systems can offer is not complete objectivity or lack of bias but rather transparency so that users can clearly see what bias an AI system has. It also offers open configurability so that a user community can tweak an AI model to have the sort of bias it prefers and transparently see what sort of bias it is reflecting,” he said.

He further stated that in the field of AI research, “bias” is not a dirty word. Instead, it is simply indicative of the orientation of an AI system looking for certain patterns in data. That said, Goertzel conceded that opaque skews imposed by centralized organizations on users who are not aware of them — yet are guided and influenced by them — are something that people need to be wary of. He said:

“Most popular AI algorithms, such as ChatGPT, are poor in terms of transparency and disclosure of their own biases. So, part of what’s needed to properly handle the AI-bias issue is decentralized participatory networks and open models: not just open-source, but open-weight matrices that are trained and adapted models with open content.”

Similarly, Dan Peterson, chief operating officer for Tenet — an AI-focused blockchain network — told Cointelegraph that it’s tough to quantify neutrality and that some AI metrics cannot be unbiased because there is no quantifiable line for when a data set loses neutrality. In his view, it eventually boils down to the perspective of where the engineer draws the line, and that line can vary from person to person.

“The concept of anything being truly ‘unbiased’ has historically been a difficult challenge to overcome. Although absolute truth in any data set being fed into generative AI systems may be hard to pin down, what we can do is leverage the tools made more readily available to us through the use of blockchain and Web3 technology,” he said.

Peterson stated that techniques built around distributed systems, verifiability and even social proofing can help us devise AI systems that come as close as possible to absolute truth. “However, it is not yet a turn-key solution; these developing technologies help us move the needle forward at breakneck speed as we continue to build out the systems of tomorrow,” he said.

Looking toward an AI-driven future

Scalability remains a significant concern for blockchain technology. As the number of users and transactions increases, it may limit the ability of blockchain solutions to handle the massive amounts of data generated and processed by AI systems. Moreover, even the adoption and integration of blockchain-based solutions into existing AIs pose significant challenges.


First, there is a lack of understanding and expertise in both AI and blockchain technologies, which may hinder the development and deployment of solutions that combine both paradigms effectively. Second, convincing stakeholders of the benefits of blockchain platforms, particularly when it comes to ensuring unbiased AI data transmission, may be challenging, at least in the beginning.

Despite these challenges, blockchain tech holds immense potential for leveling the rapidly evolving AI landscape. By leveraging key features of blockchain, such as decentralization, transparency and immutability, it is possible to reduce bias in data collection, management and labeling, ultimately leading to more equitable AI systems. It will be interesting to see how this plays out from here.

In crypto winter, DeFi needs an overhaul to mature and grow

For DeFi to make a major comeback, projects must focus on enhancing consumer experience while utilizing sustainable incentive schemes, experts believe.

For several months now, the decentralized finance (DeFi) sector has been in the grip of a major bear market, so much so that the total value locked within the space has slipped from its all-time high of $150 billion (achieved back in May 2022) to its current level of just over $50 billion.

Despite this, the amount of capital flowing into the space from “centralized avenues” has grown, largely due to the collapse of FTX and other prominent entities such as Celsius, Genesis and Vauld, even doubling trading volumes on many platforms over the course of November 2022 alone. Moreover, amid the recent market volatility, several decentralized exchanges and lending platforms continued to function smoothly, especially compared with their centralized counterparts.

Still, for DeFi to truly reach its full potential, the sector needs a significant transformation: a large number of protocols operating within the space have been offering users unsustainable returns for far too long. Moreover, with the recent surge in interest rates and inflation, and the so-called “risk-free” rate of return on six-month Treasury bills surpassing 5%, investor interest in decentralized options appears to be diminishing.

In fact, even the rapidly changing macroeconomic environment has affected DeFi, with various established projects implementing significant changes to their reward structures just to remain competitive. For instance, MakerDAO recently voted to increase its Dai (DAI) savings rate tenfold to 1%.

How can DeFi regain consumer confidence?

According to Rachid Ajaja, founder and CEO of AllianceBlock — a decentralized infrastructure platform connecting traditional financial institutions to Web3 applications — DeFi, like all global markets, is going through a cycle right now. And while what happened with Terra, Celsius, Three Arrows Capital and FTX most definitely shook investor confidence, the problem lies with the players operating within the market and not the technology itself. He told Cointelegraph:

“To bolster and maintain consumer confidence, DeFi needs to focus on solutions that put users first and protect them. This means working towards compliant DeFi solutions that focus on identity management, data encryption, data ownership by users, and trustless KYC procedures.” 

“These can pave the way for the tokenization of real-world assets and financial instruments, thereby attracting more cash flow into DeFi, including from traditional players and institutions who place a high value on compliance and sustainability,” he added.

Similarly, Varun Kumar, founder and CEO of the decentralized exchange Hashflow, told Cointelegraph that, at present, this niche industry needs stronger products that are capable of solving real-world problems. “The DeFi ecosystem is still in an exploration phase, with lots of projects still identifying their respective market fits,” he said.

However, Kumar claimed that, while there is a direct correlation between consumer confidence and declining dollar volumes, it’s important to consider other factors as well. For example, the DeFi boom of 2021 happened amid a strong macroeconomic environment, which had a significant impact on the sector:

“This quick growth was a great kickstarter for the space and created a lot of opportunity. However, now that conditions are different and volumes are much lower, business models and value propositions are being reshaped. Superior products will always win, from which consumer confidence will follow.”

Juana Attieh, co-founder and chief product officer for Fluus, an aggregator of fiat-to-crypto gateways with a crypto ramping network, told Cointelegraph that DeFi’s decline and loss of trust have been due to centralized entities abusing their power and exploiting their consumers time and again.


To restore market confidence, she believes DeFi participants must prioritize enhancing transparency and creating standards for sharing information about underlying assets, protocols, governance mechanisms and more.

“Security measures must be significantly improved to protect user assets and information. This could include conducting regular audits, implementing bug bounties, and other measures to ensure the safety and security of DeFi protocols,” she said.

Attieh further believes that it is crucial for the sector to work closely with legislators so as to obtain regulatory clarity and devise governance frameworks that can reduce volatility and uncertainty while restoring confidence.

Not everything looks bad

Even though the market is going through a bit of a lull at the moment, Robert Miller, vice president of growth for Fuse, a blockchain-based Web3 payments ecosystem, told Cointelegraph that DeFi (specifically automated market maker-based applications) seems to have found an enormously successful product-market fit during the last innovation cycle. He said:

“Despite the drop, the fact that $50 billion in liquidity is still currently deployed to DeFi protocols is exciting and unprecedented in the world of finance, where we would typically need to rely on institutional market makers and lenders as the catalyst to get the economy moving again.”

Miller conceded that heightened consumer confidence and demand will only come with improved user experiences. “Even as a seasoned crypto professional, I still struggle with using well-known DeFi apps, so I can’t imagine how difficult it must be for the layman,” he added.

Andy Ku, CEO of Altava Group, a digital content Web3 ecosystem, believes that sometimes things need to get really bad in order for them to eventually become stable. He told Cointelegraph that, in the past, bad actors have loosely used the word DeFi to promote platforms that were more or less fully centralized.

However, in his view, most quality DeFi projects today are firmly rooted in the ethos of transparency, with a growing list of these offerings now undergoing smart contract audits and publishing proof-of-reserve reports to help restore confidence in this space.

“The growing distrust in traditional financial institutions is what has given birth to DeFi. The balancing act now is how to evolve DeFi into something that has more transparency, oversight and accountability,” he said. 

Wherein lies the future of DeFi?

Learning from the various high-profile scandals of 2022, Ajaja believes that the next wave of DeFi will put a stronger emphasis on compliance and customer experience. In this regard, he noted that we are already seeing the rise of projects focused on providing compliant DeFi solutions that integrate trustless Know Your Customer and Know Your Transaction protocols, which are key for long-term adoption by traditional industries.

Moreover, self-custody is fast becoming important to many users, with more and more DeFi projects working on self-custodial wallet solutions that give users full control and ownership of their assets and data. These wallets make it easy to manage and recover assets, store encrypted digital identities and verifiable credentials, and let users decide exactly how that information is shared.

Attieh believes that, while the bear market may have caused a decline in the usage of some DeFi projects, particularly as investors become more risk-averse, it is likely that the most robust projects with strong fundamentals and real-world use cases will continue to flourish and gain traction, even in challenging economic conditions.


In a somewhat similar vein, Daniel Fogg, president and chief operating officer for IOVLabs, the firm behind Rootstock — a smart contract platform secured by the Bitcoin Network — told Cointelegraph that the one positive outcome to emerge from the ongoing crypto winter is that it has reduced the white noise surrounding the ecosystem, adding:

“We’re seeing more builders and lesser buzzwords. For the DeFi sector to cross the chasm, teams building crypto projects must focus on accessibility, usability and utility. We need to be building products that solve real problems for real people — paying bills, sending money to family members overseas, getting protection from runaway inflation, finding safe places to save their money.”

Therefore, as we head into a future driven by decentralized technologies, it will be interesting to see how the rapidly evolving decentralized finance paradigm continues to mature, especially with more people looking for avenues that do not use intermediaries.

The importance of open-source in computer science and software development

Open-source software development promotes collaboration, innovation and accessibility in the tech industry.

Open-source refers to the practice of making source code freely available to the public, allowing anyone to view, modify and distribute the code. In computer science and software development, open source is important for several reasons, as explained in the sections below.

Collaboration and innovation

Open source enables global collaboration on software projects, with contributions from anywhere in the world, leading to faster innovation and more advanced, reliable software.

The creation of the Linux operating system is a prime illustration of how open source promotes cooperation and innovation. Linus Torvalds founded the open-source Linux project in 1991. It is one of the most popular open-source projects in history and is widely used in servers, smartphones and other devices today.


Thousands of programmers from all over the world work together on the Linux project to develop the operating system by correcting problems, adding new features and enhancing performance. Anyone can contribute to the project because the source code is openly available for developers to inspect, alter and share.

The collaborative spirit of the Linux project has sparked rapid innovation and produced an extremely sophisticated and dependable operating system. There are numerous other instances where open source has fostered collaboration and creativity, including the Python programming language, the MySQL database and the Apache web server, to name a few.

Cost savings

Since open-source software is frequently free to use and distribute, both enterprises and individuals can significantly cut the cost of software creation and deployment.

The LibreOffice productivity suite is one example of how open source aids cost savings. LibreOffice is a free alternative to expensive, closed-source office suites such as Microsoft Office, so businesses and individuals can avoid paying high software license fees by adopting it.

Increased transparency and security

By allowing anybody to access, evaluate and alter the source code, open source encourages greater transparency and security. This increases the software’s overall security and stability by allowing developers and security professionals to find and repair bugs and security vulnerabilities more rapidly.

For instance, when a security flaw is found in an open-source project, the community of developers working on it can promptly identify a fix and produce a patch that can be widely applied, enhancing the software’s security for all users.

Proprietary software, in contrast, is created behind closed doors, with the vendor being the only party with access to the source code. When a security flaw is found, it is the vendor’s responsibility to address the problem and release a patch; if the vendor is not motivated to do so, the fix may take a long time or never arrive at all.

Community support

Open-source software often has a large and active community of users and developers who provide support and help to improve the software. This can result in faster and more efficient problem resolution.


The creation of the WordPress content management system is one instance of how open-source fosters community support. Since its initial release in 2003, WordPress has grown to become one of the most widely used content management systems in the world, powering millions of websites.

A sizable and vibrant community of users and developers work together on the WordPress project to advance the platform. Through online forums, documentation and tutorials, this community helps to make WordPress more approachable and user-friendly by offering assistance to other users.

Education and training

Open-source software gives students and professionals access to real-world software projects, offering a chance to learn and develop their skills. Additionally, open-source programming languages, such as Python, Java and Ruby, are frequently used in education and training courses because they are free, relatively easy to learn, and backed by large user and developer communities that can offer assistance and resources.

For instance, many colleges and universities teach computer science and software development using open-source programming languages because they let students use tools and technologies that are currently in use and help them build skills that are relevant to the job market.

Additionally, many open-source development tools and platforms, such as GitHub, are widely used in the industry, making it possible for students to gain experience with tools and technologies that are used in real-world development projects. This can help to bridge the gap between education and employment, making it easier for students to transition into software development careers.

Top 10 most famous computer programmers of all time

Computer programming has made the impossible possible. Read about the top 10 computer programmers to date.

Programmers write the code behind computer programs and mobile applications. They are also involved in maintaining, debugging and troubleshooting software and systems to keep everything running properly.

Here is a brief overview of the top 10 most famous computer programmers of all time.

Alan Turing

Alan Turing was a British mathematician and computer scientist who contributed significantly to the development of artificial intelligence, cryptography and computer science. He helped decipher the Enigma code during World War II and introduced the idea of the Turing machine, a theoretical model of computation.

Turing later worked on early stored-program machines at the University of Manchester, home of the Manchester Baby, the first stored-program computer. He is widely regarded as the father of theoretical computer science and artificial intelligence.

Ada Lovelace

Many people consider Ada Lovelace, an English mathematician and writer, to be the first ever computer programmer. She understood the creative potential of computing and realized that computers could do more than just crunch numbers, creating the first published algorithm designed to be processed by a machine, written for Charles Babbage’s proposed Analytical Engine.

Lovelace has motivated countless generations of women to work in the fields of science and technology and is honored today for her contributions to the history of computing.

Bill Gates

Bill Gates is a software developer, businessman and philanthropist best known for co-founding Microsoft, the world’s largest personal computer software company. He was crucial to the development of the PC and transformed the computer software market.

Under his direction, Microsoft created several successful lines of software, including the well-known Windows operating system, which eventually overtook other PC platforms. In addition, Gates founded the Bill and Melinda Gates Foundation to help improve global health and education.

Steve Jobs

Steve Jobs co-founded Apple and played a crucial role in developing the Macintosh, iPod, iPhone and iPad. With his groundbreaking innovations and striking design aesthetic, he transformed the PC, music and mobile phone industries and popularized the graphical user interface. Jobs was a dynamic, forward-thinking leader who motivated his teams to develop and launch successful products.

Jobs’ technical know-how and love of design and marketing contributed to Apple’s rise as one of the world’s most innovative and successful technology companies. His influence on technology is widely acknowledged, and his legacy continues to motivate future generations of entrepreneurs and tech enthusiasts.

Linus Torvalds

Linus Torvalds developed the Linux operating system, which is frequently found running servers, supercomputers and mobile devices. He began Linux as a side project, but it has since expanded into an extensive global development collaboration.

In addition, he is the principal architect of the Linux kernel, the foundational element of the Linux operating system. Torvalds has won numerous honors for his contributions to the open-source software movement, and Linux has grown to be one of the most significant, well-known software projects in history.

Mark Zuckerberg

Mark Zuckerberg co-founded Facebook, one of the world’s most widely used social networking sites. He played a crucial role in building its infrastructure and turning the startup into a multibillion-dollar corporation now known as Meta. He has been instrumental in connecting people across the world through the platform, enabling them to share information, news and personal experiences.

Meta is currently working on several projects and initiatives to make its vision of the metaverse a reality, including the Meta Quest (formerly Oculus Quest) virtual reality headsets, Horizon Worlds and Meta Horizon. In addition to Meta, Zuckerberg has worked on charitable projects, including the Chan Zuckerberg Initiative, which aims to advance human potential and promote equal opportunity.


Guido van Rossum

Computer programmer Guido van Rossum created the Python programming language in 1989. In addition to being the language’s original implementer, he actively participated in its growth and made numerous significant contributions to its functionality, community of users and design.

In July 2018, he left his post as the Python community’s “benevolent dictator for life” (BDFL).

Bjarne Stroustrup

Danish computer scientist and professor Bjarne Stroustrup developed the C++ programming language in the early 1980s. He created C++ to add object-oriented capabilities to the C language, and it has become one of the most popular programming languages in the world.

Stroustrup has made numerous key contributions to the design and features of the C++ language and is still actively involved in its development and progress.

Tim Berners-Lee

British computer scientist Tim Berners-Lee is widely recognized as the creator of the World Wide Web. In the early 1990s, he created the first web browser and server software and expanded on the idea of hypertext, which made it possible to create connected documents and the modern web.

Berners-Lee, who serves as director of the World Wide Web Consortium, the leading international standards body for the web, has been a significant proponent of the open web and continues to work on its advancement and accessibility.


Dennis Ritchie

American computer scientist Dennis Ritchie was instrumental in creating the Unix operating system and the C programming language. While working at Bell Labs in the late 1960s and early 1970s, he co-created Unix with Ken Thompson, and his work on the C programming language helped make it one of the world’s most widely used languages.

Ritchie is widely considered a pioneer of modern computing, and his work has had a significant impact on the computer science industry.

What is necessary for Web3 to fully replace Web2?

Can Web3 replace Web2… and what stands in the way of this happening? We look at some of the biggest challenges facing this burgeoning sector right now.

Web3 is the buzzword that’s on everyone’s lips — but when you put the mania aside for a moment, there’s a burning question that needs to be asked: Can these projects fully replace Web2… and what stands in the way of this happening? 

The likes of Google and Facebook have made a killing during the Web2 era, amassing billions of dollars in profits and a profound influence over the shape of the internet. But their continued influence is far from guaranteed. The 30-year history of the web is littered with the collapses of once-indestructible companies… MySpace being a notable example.

Amid countless concerns over how the data of users is harvested and used, plus fears that content creators aren’t being properly compensated for their hard work, Web3 is positioning itself as a democratizing force that puts power back in the hands of the public. Even the Web2 giants themselves see the potential of this new approach — it’s been almost a year since Facebook changed its name to Meta and declared plans to focus on the Metaverse. 

While the vision and ambition of Web3 startups is to be applauded, there are challenges that must be tackled. Critics rightly point to the vast energy consumption of some blockchains — especially those based on a Proof-of-Work consensus mechanism. They argue that creating a level playing field online can’t be at the expense of the environment. And with a dizzying number of DeFi protocols and cross-chain bridges falling victim to eye-watering hacks, with billions of dollars lost, there are safety issues to take into account as well. 

For Web3 projects to achieve their full potential, the infrastructure they rely on needs fully decentralized data management, and that means eliminating reliance on centralized cloud providers such as Amazon Web Services. Data owners need to be in the driving seat, too, and blockchains have to be immutable, affordable and more eco-aware. Ticking all of these boxes is no mean feat.

Big ideas, worrying teething troubles

The metaverse has been touted as a $1 trillion opportunity by JPMorgan, a silver bullet that could revitalize the music industry and reinvent the way we work and play. But before virtual worlds truly go mainstream, tricky security and privacy challenges must be overcome. A lack of interoperability risks standing in the way of adoption, too. And while the internet was pretty clunky in its early days, metaverses have a long way to go before they’re usable and intuitive. The aspiration of people using blockchain technology without even realizing it is some way off yet.

And that brings us to some of the other use cases that have been proposed for blockchains. A number of entrepreneurs firmly believe these immutable ledgers could drag the healthcare sector into the 21st century, ensuring medical records are properly digitized and easily transferred between facilities. Here’s the problem: this is an industry that handles copious amounts of data, and patient confidentiality is sacrosanct. Big opportunities lie ahead for networks that can deliver interoperability, immutability, security, transaction transparency and medical data sovereignty. Blockchain could also be nothing short of revolutionary if it can tackle the sheer volume of counterfeit medication in circulation, with some estimates suggesting 10% of the drugs in circulation are fake.

So… what’s the answer?

Inery is a Layer 1 blockchain that aims to tackle some of these burning issues — seamlessly connecting systems, applications and a plethora of networks. Its database management solution, IneryDB, champions high throughput, low latency and complex query search — all while ensuring data assets remain fully controlled by their owners.

The team behind this Proof-of-Stake network say it’s scalable, resistant to Sybil attacks, energy efficient, tamperproof and speedy — capable of achieving 5,000 transactions per second, with new blocks created every half a second. All of this is achieved without compromising on security.

Dr Naveen Singh, the CEO of Inery, told Cointelegraph: “With Inery, our efforts are focused on envisioning a decentralized, secure and environmentally sustainable architecture for database management. Inery enables an affordable and scalable solution that allows people to issue and control data assets to activate a new paradigm for data accessibility.”

Inery says it’s already achieved a number of big milestones, and has been listed on Huobi. The network’s testnet has now been launched, and it has secured a $50 million investment commitment from GEM — as well as other contributions from the likes of Metavest and Truth Ventures. It’s also attracted some big-name talent. The founder of Orange Telecom now serves as chairman, and the ex-VP of global marketing at Apple is joining as a principal advisor.

Looking ahead, the project wants to enter into strategic partnerships that will unlock compelling use cases for its systems in more industries. It’s hoped that the mainnet will launch in the first quarter of 2023 — paving the way for developers and users alike to properly discover what the future of Web3 should look like.

Learn more about Inery

Disclaimer. Cointelegraph does not endorse any content or product on this page. While we aim to provide you with all the important information we could obtain, readers should do their own research before taking any actions related to the company and carry full responsibility for their decisions. This article cannot be considered as investment advice.

Snapchat’s parent company shutters Web3 division amid layoffs

The company said it will let go of its Web3 division to make way for restructuring after falling behind financial targets.

Snap Inc.'s CEO Evan Spiegel announced in a note on Friday that the company had made the difficult decision to reduce the size of its workforce by approximately 20%.

The note said that this round of layoffs comes after the company experienced slow revenue growth, a slump in stock prices and a general lag behind its financial targets. Spiegel shared:

“Our forward-looking revenue visibility remains limited, and our current year-over-year QTD revenue growth of 8% is well below what we were expecting earlier this year.”

Snap Inc. will now undertake the task of restructuring in an attempt to ensure the company's success in a highly competitive space currently dominated by Instagram and TikTok. As part of its restructuring process, the company has axed its entire Web3 team. Jake Sheinman, head of Snap's Web3 team, announced his exit from the company on Wednesday in a series of posts on Twitter, stating:

“As a result of the company restructure, decisions were made to sunset our web 3 team.”

CEO Spiegel shared that the restructuring is part of an effort to focus on three strategic priorities: community growth, revenue growth and augmented reality (AR). Projects that are not in alignment with these areas will be discontinued or have their budgets slashed significantly.

At the moment, it appears that Snap will not be prioritizing the budding Web3 and Metaverse space as much as its competition, such as Meta. Although many tech innovators seem to share the opinion that Web3 is going to be the next iteration of the internet, Snap does not appear interested in positioning itself within the blockchain industry.

Snap's layoffs come after other tech companies such as Coinbase, LinkedIn, Meta, Apple, Google and Netflix have had to cut their workforces amid rising interest rates in an inflationary economy.