The Rise of the Machines, or At Least the Machine Learning

Artificial Intelligence (AI) and machine learning (ML) are almost ubiquitous in today’s engineering conversations. Engineers are using AI to research, design, create and improve products and systems all over the world. Artificial intelligence operating as a partner in engineering research and development is so commonplace that an ISO committee already exists to build the framework and rules for the use of AI as a tool. Thirty-five members sit on the committee; fourteen standards are currently published, with twenty-five more on the way.

The engineering goals of being faster, smarter, more efficient, more cost-effective and more sustainable are all gaining ground with the help of AI advances.

AI Dog outfitted with a machine gun in Russia (Image courtesy of New York Post)

Many of us grew up as science fiction fans, watching possible visions of the future play out through books and movies. One of the best parts of consuming science fiction as an engineer is wondering how the technological advances shown might be made real. We get to watch androids that can act as personal assistants, factory workers, doctors and surgeons. Computer systems can model traffic, design suits of armor and create environments that allow us to be completely different versions of ourselves.

The darker side of technology exists in science fiction, as well. In my teens we watched WarGames, where the world narrowly avoided a nuclear war after an artificial intelligence system decided that, like tic-tac-toe, the only way to win is not to play. Dark visions of the near future are plentiful, from humanity living as farm animals in a Matrix simulation to societies where an AI entity takes over all machines and computers before enslaving humanity.

But it does raise the question: How close is current AI technology to the amazing stories we see in science fiction? Well, here are five examples.

1.    Baymax and AI in Healthcare

In the movie Big Hero 6, Baymax is a personal healthcare companion. With a simple scan, he can detect vital stats, and when given a patient’s level of pain, he can treat nearly any ailment. Current AI technology is widely used in the medical industry, and while it might not be able to treat every ailment, it can help with many issues.

Viz.ai uses its intelligent care coordination (ICC) to process patient images and flag patients who may be susceptible to large-vessel occlusion (LVO) strokes or intracerebral hemorrhaging. The system can alert healthcare professionals to potential issues early and start treatment before the problem becomes debilitating or deadly. TBS iNsight is a similar program that analyzes patients’ bone mineral density scans and predicts when a patient is at risk of bone fracture.

Medtronic developed the InPen, a smart insulin pen that takes the user’s insulin data, sends it to a companion app via Bluetooth, and calculates the insulin dose that will best suit the patient at any given time. The software makes adjustments on the fly, and the more data pulled from a specific user, the better the personalized recommendations become.
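To make that arithmetic concrete, here is a minimal sketch of the standard bolus-calculation formula that smart insulin pens automate. Everything here is illustrative: the function name, parameter values and simplifications are my own, not Medtronic’s actual algorithm, and none of this is clinical guidance.

```python
# Toy bolus calculator: meal dose plus correction dose, minus insulin still
# active from earlier doses. Illustrative only -- not Medtronic's algorithm.

def recommend_bolus(carbs_g: float,
                    current_bg: float,          # blood glucose, mg/dL
                    target_bg: float,           # target glucose, mg/dL
                    carb_ratio: float,          # grams of carbs per unit of insulin
                    correction_factor: float,   # mg/dL drop per unit of insulin
                    insulin_on_board: float = 0.0) -> float:
    """Return a suggested insulin dose in units (toy example)."""
    meal_dose = carbs_g / carb_ratio
    correction_dose = max(0.0, (current_bg - target_bg) / correction_factor)
    return max(0.0, round(meal_dose + correction_dose - insulin_on_board, 1))

# Example: 60 g of carbs with glucose at 180 mg/dL against a 110 mg/dL target
print(recommend_bolus(carbs_g=60, current_bg=180, target_bg=110,
                      carb_ratio=10, correction_factor=50,
                      insulin_on_board=1.0))  # -> 6.4
```

The personalization described above amounts to tuning values like carb_ratio and correction_factor for a specific user as dosing history accumulates.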

Beyond apps and devices, there are several AI companion robots. Mabu is one of the best known and has been in the news since 2015. It was used in clinical trials in 2020, and it relies on an AI-driven chatbot system to encourage the user to talk to the robot. These conversations give insight into patient wellness and changes over time, while sensor information from Fitbit devices or smart scales is brought in to build a total health picture.

A Baymax that can fly and fight might not be available in the next five years, but it’s a good bet that current AI technology will make robotic healthcare companions a possibility.

2.    Terminator Robots Are Far Away, Right? Right?

Boston Dynamics has built mobility robots for thirty years, because the “dull, dirty and dangerous” tasks don’t happen in a sterilized clean room. The Waltham, Massachusetts engineering company wants robots to go where people go, and the Spot robot is the most well-known representation of this mobility goal.

Boston Dynamics says that it “will not authorize nor partner with those who wish to use our robots as weapons or autonomous targeting systems.” The Terms and Conditions of Sale for these robots also state that they cannot be weaponized, or Boston Dynamics will “mitigate that misuse.” What could go wrong?

Manufacturers who make knockoffs of the popular robots, however, don’t need to follow the Boston Dynamics rules. Several robots that look like family members of the Spot robots are available for purchase, such as the Unitree (Yushu Technology) robot dog listed on AliExpress.

Recently, videos of machine gun-retrofitted robot dogs, one appropriately enough called ‘Skynet,’ have popped up around the internet. One video shows the bot moving around a set of targets and successfully hitting them with a volley of bullets. If there is ever an AI sentient enough to have a motive for mass destruction, this mobile machine gun could be its opportunity.

Remember to walk this dog if you want to live.

At this point the technology for a Terminator-like robot and the AI required to control it seems more than five years off, but the scenario feels a little more plausible after watching this video.

3.    JARVIS and Product Design

In the Marvel Cinematic Universe, JARVIS started as a language-input computer system and its capabilities grew over time. Viewers watched the system help Tony Stark analyze and control his Iron Man suits—but more importantly, JARVIS was a design interface.

Iron Man 2 showed us holographically enhanced displays coming from a smartphone and coffee table. The design space existed above the tabletop, where Stark could rotate, pan or zoom components and then create assemblies or focus on one individual design. Eventually in Avengers: Endgame he used the design assist tools to solve time travel, too, but that’s outside the scope of current engineering.

Most of these functions are already possible in the realm of product design. Everybody has their preferred software, but Fusion 360 has my favorite generative design package right now, taking user inputs and creating different options for casting, injection molding and CNC machining path designs. The simulation world is no different, and Ansys has fully committed to the ideas of machine learning and AI in product development.

AI can take a few different paths as a product design aid: either creating a wide array of designs and evaluating them against specific criteria, or taking an engineer’s preferences and constraints and running a small number of trials with already-vetted parameters. The first path looks something like the sketch below.
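Here is a minimal sketch of that generate-and-evaluate loop, with a toy parametric bracket standing in for a real part. The Bracket class, the mass and stiffness models, and the scoring weights are all invented for illustration; this is not how Fusion 360 or Ansys implement it.

```python
# Generate a wide array of candidate designs, filter by a constraint,
# then score the survivors. All models and weights are invented.
import random
from dataclasses import dataclass

@dataclass
class Bracket:
    thickness_mm: float
    rib_count: int

    @property
    def mass_g(self) -> float:
        # Toy mass model: thicker walls and more ribs add material.
        return 40 * self.thickness_mm + 12 * self.rib_count

    @property
    def stiffness(self) -> float:
        # Toy stiffness model: both parameters add rigidity.
        return self.thickness_mm ** 2 * (1 + 0.3 * self.rib_count)

def score(d: Bracket) -> float:
    """Reward stiffness, penalize mass (weights are arbitrary)."""
    return d.stiffness - 0.1 * d.mass_g

candidates = [Bracket(thickness_mm=random.uniform(2, 8),
                      rib_count=random.randint(0, 6))
              for _ in range(500)]
feasible = [d for d in candidates if d.mass_g <= 250]  # mass budget constraint
best = max(feasible, key=score)
print(f"best: {best}, mass={best.mass_g:.0f} g, score={score(best):.1f}")
```

The second path simply flips the loop: instead of sampling wide, it starts from vetted parameters and runs a handful of targeted trials.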

Can we do this with holograms that float in the air and allow user manipulation? MIT has us covered, and some of the holograms even provide haptic feedback using bursts of air against the user’s fingertips or hands. The technology around AI-powered design and simulation will no doubt advance even further in the next five years.

But as for AI systems that interact with the engineer directly like JARVIS does with Tony, check this out: Coming Soon to a Workplace Near You: An AI-based Engineer.

4.    Star Trek’s LCARS Could Transform an Organization

Personnel on the USS Enterprise worked with the Library Computer Access/Retrieval System (LCARS) during the Star Trek: The Next Generation era. When crew members were not seated at the large terminal screens, they could access data through Personal Access Display Devices (PADDs). Looking outward, the computer could scan a planet and understand what other ships or celestial bodies were in the area. Looking inward, the bridge could pull information about any part of the ship, understanding power reserves and damage reports while judging future goals against the current state.

The Star Trek LCARS image (Image courtesy of Raspberry Pi)

This definitely sounds like a manufacturing environment to me. Understanding inventories, production goals, quality defects and personnel issues is a great indicator that a manufacturer is headed in the right direction. Using AI, companies are working through digital transformation and creating a digital thread that can run all the way through an organization. A digital thread is “a description of the optimal flow of data within and between people, tools and systems associated with the lifecycle of a product.”

Connecting the goals of the development, manufacturing and quality functions into one hyper-complex picture is tough, but it is easier when AI does the big-picture thinking for us. This paper showcases the ways that AI is used to make intelligent searches that pull a digital thread through a company’s full infrastructure.
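As a toy illustration of what such a search looks like, here is a sketch that indexes records from hypothetical design (PLM), production (MES) and quality (QMS) systems together, so one query pulls related items from across the lifecycle. Production tools use learned embeddings and far richer data models; the bag-of-words cosine similarity below is just a stand-in, and every record is invented.

```python
# Toy "digital thread" search: records from different lifecycle systems are
# indexed together and ranked against one query. Real tools use ML embeddings;
# this bag-of-words cosine similarity is a simple stand-in.
import math
from collections import Counter

records = [  # (source system, text) -- all invented examples
    ("PLM", "bracket rev C wall thickness increased to 4 mm"),
    ("MES", "line 2 scrap rate spike on bracket injection molding"),
    ("QMS", "customer complaint: bracket cracking near mounting rib"),
    ("PLM", "housing rev A released for CNC machining"),
]

def vectorize(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def search(query: str, top_k: int = 3):
    q = vectorize(query)
    return sorted(records, key=lambda r: cosine(q, vectorize(r[1])),
                  reverse=True)[:top_k]

# One question surfaces related design, production and quality records.
for system, text in search("bracket cracking"):
    print(f"[{system}] {text}")
```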

Even if being beamed up by a transporter isn’t a possibility yet, there are plenty of engineers using AI-enhanced searches right now to perform the same functions that we watched Star Trek computers accomplish.

5.    WALL-E, AUTO and NASA’s DSA

In the movie WALL-E, an autopilot named AUTO runs most of the operating systems on the starship Axiom, following Directive A-113. As an AI-based system, AUTO follows the requirements programmed by its creators and autonomously works to find the best operating parameters to meet its goals.

AUTO from WALL-E (Image courtesy of Pixar)

There is a mind-boggling number of folks arguing online about whether AUTO was a villain in the movie or merely following orders that ran counter to the goals of the Axiom’s crew, but everyone agrees that the autopilot was a futuristic AI that could effectively manage a spacecraft full of crew and resources.

Today, we have more self-driving features on vehicles with every new model year. Engineers are developing machine learning systems that push us ever closer to a fully autonomous vehicle grid. The hope is that five years of progress on self-driving cars will take us farther than we can imagine today.

NASA has us covered on the space side as well. Distributed Spacecraft Autonomy (DSA) is a new approach that the space agency is investigating for future missions. DSA will let a spacecraft build an itinerary of locations and tasks during the mission, instead of sending status data to a crew on the ground and then awaiting orders before acting. The system will also be used to coordinate activities between multiple spacecraft on the same mission. Deciding on tasks and paths at the location where the activity is happening will eliminate the time delay caused by beaming information back and forth between Earth and space, and using AI tools should ensure that the right decisions are made.
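To make the idea of onboard itinerary-building concrete, here is a toy greedy planner: the spacecraft ranks its own candidate tasks by a priority-versus-travel-cost trade-off instead of waiting for ground commands. The Task fields and the scoring rule are invented for illustration; NASA has not published DSA’s planner in this form.

```python
# Toy onboard itinerary planner in the spirit of DSA: greedily pick the task
# with the best priority-to-travel-distance trade-off from the current
# position. The scoring rule is invented, not NASA's actual algorithm.
import math
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    position: tuple[float, float]  # simplified 2-D coordinates, km
    priority: float                # higher = more valuable observation

def plan_itinerary(start: tuple[float, float], tasks: list[Task]) -> list[Task]:
    itinerary, here, remaining = [], start, list(tasks)
    while remaining:
        nxt = max(remaining,
                  key=lambda t: t.priority / (1.0 + math.dist(here, t.position)))
        itinerary.append(nxt)
        here = nxt.position
        remaining.remove(nxt)
    return itinerary

tasks = [Task("image crater A", (10, 0), 5.0),
         Task("sample site B", (2, 3), 3.0),
         Task("relay pass C", (8, 8), 4.0)]
for t in plan_itinerary((0, 0), tasks):
    print(t.name)
```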

How Wide Is the Gap Between Fiction and Reality?

Fortune Business Insights valued the AI market at $328.34 billion in 2021 with the potential to grow to $1,394.3 billion by 2029. So, it’s safe to say that the tools of artificial intelligence are likely to stay in use and be even more prominent in the near future. The technological utility of AI five years from now is probably beyond what we can imagine—both the innovations that the most brilliant scientists will be working on, and the everyday technology used by consumers and manufacturers.
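Those two figures imply a compound annual growth rate of roughly 20 percent; as a quick back-of-the-envelope check (my arithmetic, not a number quoted from the report):

```latex
\[
\text{CAGR} = \left(\frac{1394.3}{328.34}\right)^{1/8} - 1 \approx 0.198,
\]
```

or about 19.8 percent per year over the eight years from 2021 to 2029.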

Deciding exactly where artificial intelligence sits on a graph from a 2000-era Cleverbot to C-3PO is difficult. There are hundreds of AI engineers working on hundreds of projects across the world, all in varying degrees of technological advancement. Most of the AI technology in these five science fiction concepts is already in use, just in a different form than we’ve seen on the screen.