

Areas of Activity
From Web 1.0 to Web 3.0
Before exploring «Web 3.0» and «Web 4.0», it is crucial to understand the foundations laid by Web 1.0 and Web 2.0. Web 1.0, initiated in 1989 by British computer scientist Tim Berners-Lee, introduced the concept of linking digital text based on Ted Nelson's 1963 hypertext proposals. Berners-Lee not only developed the first browser but also authored «hypertext markup language» («HTML») for content display and «hypertext transfer protocol» («HTTP») for file transmission between web servers and browsers. Despite envisioning a «semantic web» to interconnect data across pages, technical limitations hindered its immediate implementation. The web gained substantial public awareness in 1993 with Mosaic, whose developers went on to create Netscape Navigator. This era ushered in user-friendly graphical browsers, including Microsoft Internet Explorer and Apple Safari. Search engines like Yahoo! Search, Lycos, and AltaVista dominated this period, but Google's entry in 1998 quickly eclipsed its rivals, leading to the decline of many search engines by 2004. This transformative shift marked a pivotal moment in the web's evolution, laying the groundwork for subsequent advancements. In other words, «Web 1.0» was the first draft of the internet. A significant portion of Web 1.0 was constructed using open protocols, i.e., methods of exchanging information accessible to anyone rather than controlled by a specific entity. During this era, the internet was primarily used for reading web pages and engaging in online conversations. As Web 1.0 evolved, there was a gradual shift towards increased e-commerce activity and the use of the internet for academic and scientific research.
Around the turn of the millennium, experts advocated for an enhanced and more interactive web, coining it «Web 2.0» to distinguish it from the basic connectivity and mostly static websites of Web 1.0. Tim Berners-Lee further developed his semantic web concept, and Tim O'Reilly played a pivotal role in promoting Web 2.0 through a dedicated conference. The interactive web materialized with the surge of «social networks» like Facebook, which became increasingly popular. Simultaneously, the World Wide Web Consortium introduced semantic web standards. Web 2.0, emerging in the mid-2000s, empowered users to generate their own content through platforms such as Facebook, Twitter (now X), and Wikipedia. However, this came at a cost: users unwittingly contributed to monetization strategies through user «data» sold to advertisers.
In 2009, two foundational «Web 3.0» technologies emerged: cryptocurrency and blockchain. Prominent figures, including Ethereum co-founder Gavin Wood, began popularizing the terms Web 3.0 and «Web3» to signify a decentralized, semantically aware web. Web 3.0 envisions a decentralized internet built on blockchain technology, disrupting the centralized model of Web 2.0. It is expected to emphasize decentralized applications, extensive use of blockchain, and «machine learning» («ML») and AI for a more intelligent and adaptive web. Since 2018, momentum around Web3 has surged across various sectors, including equity investment, online searches, patent filings, scientific publications, job vacancies, and press reports. The transformative potential of Web 3.0 lies in its capacity to alter web interaction dynamics and redefine how companies profit.
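To make the Web 1.0 building blocks described above more concrete, the following minimal sketch (in Python, using only the standard library) performs the exchange that HTTP and HTML enable: a client requests a file from a web server over HTTP and receives an HTML document for display. The host example.com is a public test page chosen purely for illustration.

    # Minimal browser-server exchange: HTTP carries the request/response,
    # HTML is the content the browser would render.
    import http.client

    conn = http.client.HTTPConnection("example.com", 80)
    conn.request("GET", "/")                 # HTTP: ask the server for a file
    response = conn.getresponse()
    print(response.status, response.reason)  # e.g. "200 OK"
    html = response.read().decode("utf-8")   # HTML: the content to display
    print(html[:80])                         # first characters of the page
    conn.close()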
As we will see below, the use of immutable blockchain ledgers in Web 3.0 has the potential to enhance customer service, improve supply chain monitoring, and serve as essential infrastructure for the emerging metaverse. While Web 3.0 and the metaverse share some commonalities, they differ in focus: Web 3.0 pertains to decentralized databases and systems architecture, while the metaverse represents a new computing and networking paradigm.
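Before turning to Web 4.0, the «immutable ledger» property just mentioned can be illustrated with a minimal sketch. This is not any particular blockchain's implementation, only the core idea: each block stores a cryptographic hash of its predecessor, so altering any past record (for example, a hypothetical supply chain event) invalidates every later link and is immediately detectable. Real systems add consensus, digital signatures, and replication across many nodes.

    import hashlib
    import json

    def block_hash(block):
        # Hash the block's contents deterministically with SHA-256.
        payload = json.dumps(block, sort_keys=True).encode("utf-8")
        return hashlib.sha256(payload).hexdigest()

    def append_block(chain, data):
        # Each new block commits to the previous block's hash.
        prev = chain[-1]
        chain.append({"index": prev["index"] + 1, "data": data,
                      "prev_hash": block_hash(prev)})

    def is_valid(chain):
        # The ledger is intact only if every stored prev_hash still matches.
        return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
                   for i in range(1, len(chain)))

    chain = [{"index": 0, "data": "genesis", "prev_hash": ""}]
    append_block(chain, "order #1 shipped")    # hypothetical supply chain event
    append_block(chain, "order #1 delivered")

    print(is_valid(chain))                     # True
    chain[1]["data"] = "order #1 lost"         # tamper with history...
    print(is_valid(chain))                     # False: tampering detected

The point of the sketch is the detection step: because each block commits to the hash of the one before it, rewriting history requires recomputing every subsequent block, which distributed consensus is designed to make infeasible.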
Web 4.0
«Web 4.0» represents the next phase in the Internet's evolution, introducing a novel paradigm based on diverse models, technologies, and social interactions. Although still in its early developmental stages, some initial concepts have emerged. The specific definition of Web 4.0 remains somewhat unclear and lacks unanimity in the literature, given its multifaceted nature. The terms «pervasive computing» and «ubiquitous computing» are commonly employed in the literature to describe this emerging paradigm, alternatively referred to as the «intelligent web». In a nutshell, Web 4.0 signifies a transformative shift in internet functionality, leveraging advanced technologies like AI, XR, blockchain, decentralized networks, and the IoT to establish a more intelligent, autonomous, and secure digital ecosystem. While Web 2.0 introduced user-generated content and social networking and Web 3.0 brought decentralization, Web 4.0 goes further by enabling machine-to-machine communication, self-learning algorithms, and «decentralized governance» («DeGov»). Collaboration and collective intelligence are central themes, replacing individual isolation with collaborative problem-solving and content creation. Web 4.0 aims to create a web that is not only more intelligent but also intuitive, user-friendly, and easily navigable. Despite being at the conceptual stage, with no consensus on its precise features, Web 4.0 is envisioned as a future evolution of the World Wide Web. Some experts foresee an era of true artificial intelligence, in which machines understand and interpret human language for more natural interactions through advanced chatbots and virtual assistants. Others anticipate a shift towards a decentralized web, facilitated by blockchain and «decentralized applications» («DApps»), fostering greater user control and data ownership in platforms like decentralized social networks and marketplaces.
Digital & Artificial Era
It is commonly accepted that the information age is a historical period that began between the middle and the end of the 20th century, more specifically between the 1950s and the 1970s. This era is characterized by a rapid shift from the traditional industries established during the Industrial Revolution to an economy centered on information technology. Now, with the beginning of the 21st century and the economy's turn towards digital and artificial technologies, everything indicates that we are entering a new historical period: the digital and artificial era.
With the evolution from «Web 1.0» to «Web 4.0» and the emergence of new disruptive technologies, the use of computer systems has expanded at an even faster pace, driven by recent technological advances. In the «financial technology» («FinTech») and «decentralized finance» («DeFi») sectors, advances in «distributed ledger technologies» («DLTs») stand out. In the sector of «immersive» and «sensorial» technologies, «digital reality», and the «metaverse», advances stand out essentially in terms of «digital sensory interaction» and «extended reality» («XR»), i.e., the combination of «virtual» («VR»), «augmented» («AR»), and «mixed» («MR») reality. In the «artificial intelligence» («AI») sector, they stand out mainly in terms of «general-purpose AI» («GAI» or «GPAI»), «foundation models» («FMs»), «large language models» («LLMs»), «natural language processing» («NLP»), «generative AI» («GenAI»), computer vision, and predictive analytics. All of these sectors promise to revolutionize the way society interacts with technology (and vice versa). Beyond improving the user experience in terms of efficiency, these technologies also present functionalities and characteristics capable of boosting multiple sectors at a dizzying speed and at a rate that exceeds all predictions, including sectors associated with other disruptive concepts or technologies, such as «data science», «data» & «big data», «cloud, edge, & spatial computing», the «internet of things» («IoT»), «smart cities», «brain-computer interfaces» («BCIs»), «quantum technologies», «robotics», and «autonomous systems» & «intelligent automation», among others. On the other hand, and inevitably, the «dizzying speed» of development in all these sectors could result in an equally «dazzling» increase in the scale and scope of «cybercrime» & «neurocrime» hacking methods.
Imagine that these technologies reach their widely predicted potential. In the next 15 to 20 years, everything will be directly connected. New virtual worlds and even new countries will be (re)created in digital form in the multiverse of metaverses. «Central bank digital currency» («CBDC») will be the only legal tender. Smart cities will be connected through the IoT, with CCTV on every corner, highly equipped with facial recognition and intelligent sensory systems. Drones and AI-based humanoid police robots will patrol streets and buildings. AI-based industrial robots will produce any and all types of goods. Universities' teaching processes will be remote and mostly based on AI. Most organizations will not even have workers or physical spaces, being autonomous, digital, and operated through AI systems. Businesses will be digital, within the scope of the metaverse, and developed by autonomous, digital, AI-based companies. Workplaces will be virtually all digital. People will spend practically their entire day (and night) in the metaverse, which will be increasingly developed and immersive.
People will be able to choose to go out into the physical environment in the form of holograms, making it possible to be present in meetings and workspaces and even to move freely anywhere on our planet, all without leaving home. The line separating VR, AR, and MR will become increasingly thin, as will the line separating physical reality from digital reality. In daily tasks, people will be assisted by portable and wearable technologies (in many cases invisible) or even by AI-based humanoid domestic robots. In the decisions they make, they will be assisted by AI through voice assistants or even through neural interface technology, with a view to merging human consciousness with AI. In general, every time people establish a connection to the metaverse, a neural connection will be initiated. Digital sensory interaction will make people feel increasingly comfortable connected to the metaverse, which is the reality that new generations will know best. As a consequence of this digital life, people will always be surrounded by cameras, microphones, and interface systems, even from birth, as embryonic development will occur exclusively through artificial incubators. In fact, many people will choose to create and cultivate affective or loving relationships with their AI-based humanoid domestic robots. Domestic animals themselves will be replaced by artificial animals. Everything will be digital or artificial. As such, our own thoughts will not be safe: whenever we connect to the metaverse and/or activate any type of neural interface technology, we will be an open book. In this scenario, the use of computer systems is not limited to the social, professional, and economic domains, as it also extends to the psychological and biological domains. The «stunning» increase in the scale and scope of hacking will become increasingly noticeable. Ultimately, with the evolution of cybercrime, this activity will reach the human brain and mind itself, in the form of «neurocrime» and «neurohacking». The EU seems to be paying attention to this digital and artificial transition.