Photo above: NASA’s Perseverance Mars rover took a selfie with several of the 10 sample tubes it deposited at a sample depot it is creating within an area of Jezero Crater nicknamed “Three Forks.” The image was taken by the WATSON (Wide Angle Topographic Sensor for Operations and eNgineering) camera on the end of the rover’s robotic arm on Jan. 20, 2023, the 682nd Martian day, or sol, of the mission.
NASA is hoping to land the first humans on Mars in the 2030s, and achieving this goal will take a thorough investigation and analysis of the Martian environment. According to NASA, several obstacles still need to be overcome before sending a human mission to Mars, including technological innovation and a better understanding of the human body and mind, and of how we might adapt to life on another planet.
To gather this vital information, NASA sent robots to Mars with embedded AI that powers their perception and driving capabilities. This required asking key questions, such as: could NASA trust that the robot’s AI would not drive it off a cliff and destroy it? NASA was able to have that trust not just because AI and software have improved, but because of its holistic approach to AI and robotics.
Previously, NASA had used autonomous driving on the Curiosity rover, but not very often: while the AI would navigate the rover around big hazards, it wasn’t avoiding the small sharp rocks that started to tear up the rover’s wheels. Adjusting the algorithm to say, “Avoid these small sharp rocks,” is a complex operation; a better solution was a full-systems approach that integrated several emerging technologies.
Before we explain how they achieved their holistic solution and which technologies were utilized to facilitate it, we’ll cover key emerging technologies – what they mean, how they are implemented and how they could be integrated in a full-systems solution.
Metaverse vs. Immersive Technologies
The word “Metaverse” was coined by Neal Stephenson in his 1992 science fiction book, Snow Crash. In the book, he presented a kind of virtual reality where every virtual interaction could also have a direct impact on the physical world. The metaverse offers a future of infinite possibilities without boundaries on who you can be, what you can do, and who you can interact with.
Therefore, the term “Metaverse” represents a virtual space where users engage in an immersive experience; it is not a technology. For this virtual space to provide a full immersive experience, it needs both hardware and software:
- Hardware: The tools that will create the virtual space and will enable us to experience virtual reality (VR), augmented reality (AR), mixed reality (MR)
- Software: The technologies which enable and support these immersive experiences, such as blockchain technology, AI, IoT, and others
Think of it like this: A computer has hardware, whether it’s a desktop or a laptop. It is the device you interact with for your daily needs. But in order for it to operate, it needs an operating system – that’s the software – think macOS or Windows. Likewise, the metaverse has hardware to create the virtual space (such as VR glasses), and software to enable the experience within that space.
An immersive experience refers to one facilitated by digital technology which attempts to imitate a physical or imaginary world by creating a surrounding sensory feeling, thereby creating a sense of immersion. But it’s not quite that simple, because it relies on a multitude of emerging technologies to create that sense of immersion. What technologies do we need, and how do they contribute to that immersion? Here are a few of the key technologies to consider:
Artificial Intelligence (AI)
Artificial intelligence is pretty much just what it sounds like – the practice of getting machines to mimic human intelligence to perform tasks. You’ve probably interacted with AI even if you don’t realize it – voice assistants like Siri and Alexa are founded on AI technology, as are customer service chatbots that pop up to help you navigate websites.
Machine Learning (ML)
Machine learning is a type of artificial intelligence. Through machine learning, practitioners develop artificial intelligence through models that can learn from data patterns without human direction. The unmanageably huge volume and complexity of data (unmanageable by humans, anyway) that is now being generated has increased the potential of machine learning, as well as the need for it.
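As a minimal sketch of the idea, the toy classifier below “learns” simply by storing labeled data points and predicts a label for a new point from patterns in that data (a nearest-neighbor rule; all names and data here are illustrative, not any production ML system):

```python
# A toy 1-nearest-neighbor classifier: it "learns" by storing labeled
# examples and predicts by finding the most similar known example.
def train(examples):
    """examples: list of ((x, y), label) pairs -- here, the model IS the data."""
    return list(examples)

def predict(model, point):
    """Label a new point by its closest training example (squared Euclidean distance)."""
    def dist(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    nearest = min(model, key=lambda ex: dist(ex[0], point))
    return nearest[1]

model = train([((1, 1), "cold"), ((1, 2), "cold"), ((8, 9), "hot"), ((9, 8), "hot")])
print(predict(model, (2, 2)))  # a point near the "cold" cluster -> "cold"
```

Real machine learning models generalize far beyond memorized examples, but the core loop is the same: fit a model to data patterns, then use it to make predictions without explicit human-written rules.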
Internet of Things (IoT)
The internet of things (IoT) is a catch-all term for the growing number of electronics that aren’t traditional computing devices, but are connected to the internet to send data, receive instructions or both. The IoT brings internet connectivity, data processing and analytics to the world of physical objects. For consumers, this means interacting with the global information network without the intermediary of a keyboard and screen (such as Alexa, for example).
There’s an incredibly broad range of ‘things’ that fall under the IoT umbrella:
- Internet-connected ‘smart’ versions of traditional appliances such as refrigerators and light bulbs;
- gadgets that could only exist in an internet-enabled world such as Alexa-style digital assistants;
- and internet-enabled sensors that are transforming factories, healthcare, transportation, distribution centers and farms.
After data is gathered from IoT devices, data processing and analytics take place. This can happen in data centers or the cloud, but sometimes that’s not an option. For critical devices, such as shutoffs in industrial settings, the delay of sending data from the device to a remote data center is too great: the round trip of sending data, processing it, analyzing it, and returning instructions (e.g., close the valve before the pipes burst) can take too long. In such cases edge computing is implemented, where a smart edge device can aggregate data, analyze it, and create responses if necessary, all within relatively close physical distance, thereby reducing delay.
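The shutoff scenario above can be sketched in a few lines. This is a hedged illustration, not a real industrial controller: the pressure limit, function names, and the "forward to cloud" action are all hypothetical.

```python
# Sketch of an edge controller: aggregate local sensor readings and act
# immediately when a critical threshold is crossed, instead of waiting
# for a round trip to a remote data center. The limit is illustrative.
PRESSURE_LIMIT = 100.0  # hypothetical safe limit (psi)

def edge_decision(readings):
    """Average recent readings; act locally if critical, otherwise
    hand the data off for non-urgent cloud analytics."""
    avg = sum(readings) / len(readings)
    if avg > PRESSURE_LIMIT:
        return ("close_valve", avg)   # act locally: no network delay
    return ("forward_to_cloud", avg)  # non-critical: batch upstream

action, avg = edge_decision([98.0, 103.5, 107.2])
print(action)  # the average (102.9) exceeds the limit, so the edge device acts
```

The design point is that the time-critical branch never leaves the device; only the non-urgent path depends on the network.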
Edge Computing
Edge computing is a distributed computing paradigm that brings computation and data storage closer to the sources of data. This is expected to improve response times and save bandwidth. By deploying data centers and servers near the point of transmission – where people live and work – edge solutions make it easier for businesses to support customers and accommodate new technologies without increasing network congestion and latency.
Bringing the above technologies together looks like this: the volume of data IoT devices can gather is far larger than any human can deal with in a useful way, and certainly not in real time. Edge computing devices are needed not only to make sense of the raw data coming in from the IoT endpoints, but also to assess the accuracy and usefulness of the data those devices gather.
Many IoT providers are offering machine learning and artificial intelligence capabilities to make sense of that data. IBM’s Watson platform can be trained on IoT data sets to produce useful results in the field of predictive maintenance; for example, analyzing data from drones to distinguish between trivial damage to a bridge versus cracks that need attention.
In 2022, Arm launched low-power chips that can provide AI capabilities on the IoT endpoints themselves, along with new IoT processors, such as the Cortex-M85 and the Corstone-1000, that support AI at the edge.
Blockchain
Blockchain is a shared, immutable ledger that facilitates the process of recording transactions and tracking assets in a business network. An asset can be tangible (e.g., house, car, land) or intangible (e.g., intellectual property, patents, copyrights, branding). Anything can be tracked and traded on a blockchain, while increasing efficiency, reducing risk, and cutting costs.
In 2018, the Commonwealth Bank ran a pilot to trace an almond supply chain via blockchain. The platform, underpinned by distributed ledger technology, smart contracts, and the Internet of Things (IoT), facilitated a trade experiment that saw 17 tons of almonds shipped from Australia to Germany. The focus wasn’t purely on trade finance: partners could track the shipment, including the temperature and humidity inside the container as reported by IoT devices. That level of detail was useful for providing insurance. The test involved digitizing the operations, documentation, and finance.
The system used blockchain technology to track container information, task completion, and shipping documentation. Those documents included bills of lading, certificates of origin and other customs documents. Such a supply chain could be further enhanced by integrating AI, which can interpret and analyze the information gathered from the IoT sensors before it is recorded on the blockchain, providing alerts for any unusual or irregular events.
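To make the combination concrete, the sketch below chains sensor readings into a tamper-evident log – the core idea behind a blockchain ledger – and flags out-of-range readings, standing in for the AI-driven alerts described above. The temperature range, field names, and sensor ID are all illustrative, not details of the Commonwealth Bank pilot.

```python
import hashlib
import json

TEMP_RANGE = (0.0, 8.0)  # hypothetical safe range inside the container (deg C)

def add_block(chain, reading):
    """Append a sensor reading, linking it to the previous block's hash
    so later tampering with any earlier entry becomes detectable."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps({"reading": reading, "prev": prev_hash}, sort_keys=True)
    block = {
        "reading": reading,
        "prev": prev_hash,
        "hash": hashlib.sha256(payload.encode()).hexdigest(),
        # stand-in for an AI/analytics check on the incoming data:
        "alert": not (TEMP_RANGE[0] <= reading["temp_c"] <= TEMP_RANGE[1]),
    }
    chain.append(block)
    return block

chain = []
add_block(chain, {"sensor": "container-17", "temp_c": 4.2})
warm = add_block(chain, {"sensor": "container-17", "temp_c": 12.8})
print(warm["alert"])  # the out-of-range reading raises an alert
```

Because each block embeds the previous block’s hash, changing one historical reading would invalidate every hash after it, which is what makes the log useful for insurance and customs evidence.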
In addition, it’s also possible to authenticate the certificate of origin and other documents through the use of NFTs.
Immersive Technology
Immersive technology is an integration of virtual content with the physical environment in a way that allows the user to engage naturally with blended reality. In an immersive experience, the user accepts virtual elements of their environment as part of the whole, potentially becoming less conscious that those elements are not part of physical reality. An immersive digital environment can be a model of reality but it can also be a fantasy or digital abstraction, as long as the user is immersed within it.
Immersive technology has become widely applied in many sectors, including retail and e-commerce, healthcare, video gaming, art, entertainment, and interactive storytelling, and has been getting more momentum with the evolution of Web3 and blockchain technology, especially with the utilization of non-fungible tokens.
Metaverse and Blockchain – examples of the use of NFTs and DAOs
To facilitate any transaction and the transfer of ownership, authentication is key. A transfer of ownership will not occur without authenticating the assets being transferred and the people (or entities), who are engaging in the transaction. This is the true power of NFTs, providing authentication and facilitating the transfer of ownership.
An NFT can authenticate a physical asset, like a $200,000 bottle of single-malt scotch, or a virtual parcel within the virtual real estate environment of Decentraland. The Decentraland metaverse is organized as a Decentralized Autonomous Organization (DAO), meaning that each constituent of Decentraland has a measure of voting power that is calculated based on their holdings within the metaverse, and all of these assets have associated tokens.
What the community votes on runs the gamut: requests for grants for property development, addition of new wearables for users’ avatars, organizing land auctions, and setting sales fees. As the name DAO implies, there is no centralized entity making the decisions for Decentraland.
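The voting mechanism can be sketched as follows. This is an illustrative, simplified model of holdings-weighted voting, not Decentraland’s actual voting-power formula; the member names and token counts are made up.

```python
# Illustrative holdings-weighted DAO vote: each member's voting power is
# proportional to the tokens they hold, and a proposal passes by
# weighted majority (a simplification of real DAO governance rules).
def tally(holdings, votes):
    """holdings: {member: token_count}; votes: {member: True/False}.
    Returns True if the token-weighted 'yes' side outweighs 'no'."""
    yes = sum(holdings[m] for m, v in votes.items() if v)
    no = sum(holdings[m] for m, v in votes.items() if not v)
    return yes > no

holdings = {"alice": 500, "bob": 300, "carol": 250}
# bob and carol (550 tokens combined) outweigh alice's larger single stake (500)
print(tally(holdings, {"alice": False, "bob": True, "carol": True}))
```

The example shows the defining property of this scheme: outcomes follow token weight, not head count, so a coalition of smaller holders can still outvote the largest one.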
The metaverse and blockchain technology – what the future holds when decentralized finance is utilized
Decentralized Finance (DeFi) applications will allow the transfer of funds from wallets created in the physical world to any virtual space or universe, and between different virtual spaces. But DeFi goes beyond payments. With DeFi applications you’ll be able to take out a mortgage on a parcel purchased in Decentraland, or use that virtual parcel as collateral for your business in the real world or in a parallel virtual space.
MyndVR: An example of the integration of immersive technology and other technologies
Virtual reality-based digital therapeutics are rapidly being adopted by the largest healthcare organizations in the world. This new form of treatment is an update to the quality of care and overall quality of life for our booming population of older adults. MyndVR, a VR-based digital therapeutics platform, is offering a vast library of immersive experiences designed to address senior health across the continuum of care, improving the lives of older adults and those who care for them.
The platform uses a combination of advanced hardware, software, and a user-focused experience design. MyndVR assembled an advisory board and research coalition of university and private-sector researchers to study the impact of virtual reality on age-related conditions.
The study at Stanford’s Virtual Human Interaction Lab focused on how VR could reduce seniors’ feeling of isolation from the outside world and has shown promising results. “It really felt like you were traveling – and not alone either. In some of the videos, there are people,” said one user.
By utilizing IoT and blockchain technology, MyndVR could further enhance its research and development capabilities. For example, IoT could connect the VR hardware and/or monitoring devices that track the user’s behavior and physical health. This information could then be transferred and shared globally via blockchain technology, which would trace and track all of it while securing the data and the privacy of the users. With this enhanced research, MyndVR could modify and update its product more efficiently and reliably.
NASA: An example of a full-systems approach
At the beginning of this article, we talked about how NASA was not able to make much use of the Curiosity rover’s autonomous driving: while the AI would navigate the rover around big hazards, it wasn’t avoiding the small sharp rocks that started to tear up the rover’s wheels. As mentioned before, overhauling the algorithm to tell the rover to avoid those small sharp rocks is a complex operation. A full-systems approach looks not just at the individual components of any given technology but at the whole, which also includes the hardware.
Therefore, rather than focusing on the AI, NASA took a much simpler solution at the hardware level: they upgraded the wheels to be more resilient. They also made the cameras faster and upgraded the computing. It takes a full-systems approach to say, “Can we make this AI so it’s actually usable to solve the problem we want it to?” In this case, by looking at other solutions, NASA was able to work around the problem.
Planning often starts years before implementation
In 2016, NASA used VR with the Oculus Rift to share with the public the experience of being on board several different space vessels. Attendees got the chance to see what an astronaut does, such as climbing into the new Orion capsule, which was scheduled for completion in 2018.
In 2016, researchers at NASA’s Jet Propulsion Laboratory devised a method for controlling a robotic arm through the operator’s motions and gestures by combining the Oculus Rift with motion-sensing hardware from the Xbox One’s Kinect 2 sensor. The idea was that the approach could one day be used to manipulate rovers or other instruments located millions of miles from Earth.
And that is exactly what happened.
After years of innovative research integrating immersive technologies with advanced AI and improved hardware and robotics, the Perseverance rover launched in July 2020 and landed on Mars in February 2021, where it researches and investigates the planet while sending information back to Earth for further analysis and exploration.
We don’t need to be astronauts to understand that we should approach problems holistically by using a full systems approach. When designing a solution for any use case, consider:
- Which technologies could solve (or are needed to solve) the problem?
- Are the technologies, in their current state, capable of fully resolving the problem?
- If not, assess the associated drawbacks, risks and vulnerabilities, and consider ways to mitigate them as much as possible.
We are at a pivotal time for humankind: different emerging technologies are evolving exponentially and will soon shape our global economy and society, provided that we understand how to appropriately integrate them and unleash their full potential.
The views and opinions expressed herein are the views and opinions of the author and do not necessarily reflect those of Nasdaq, Inc.