Autonomous Mobility for Everyone

Autonomous vehicles are becoming a growing part of many people’s day-to-day lives. Beyond the well-known autopilot features in some high-end cars, most consumer vehicles produced over the past few years include elements of autonomous driving, such as parking assist systems. The industry’s ongoing challenge, however, is a largely fragmented development environment for autonomous driving applications: in most vehicles, the software is developed in isolation and delivered closed source to the car manufacturer. To improve autonomous mobility and software interoperability, Apex.AI developed Apex.OS, an end-to-end software development kit (SDK) with broad utility.

(Image courtesy of Apex.AI.)

The goal of Apex.OS is to accelerate autonomous mobility development for any hardware system, from cars to robots, by providing a unified software platform that can be used for applications across what the company describes as “mobility megatrends”: autonomous driving, connectivity, shared mobility and electrification. The objective is to help companies move beyond prototyping and develop solutions that can truly scale without sacrificing reliability or safety.

Creating a Software-Defined Autonomous Vehicle Industry

In conversation with engineering.com, Jan Becker, the president, CEO, and cofounder of Apex.AI, explained the company’s software by using an analogy to the smartphone industry. For example, let’s consider an Android phone. When a company develops a new app, it can be used on any hardware device running Android software, whether it be a Samsung or an LG device, a phone, or a tablet. Engineers can use the development environment to make applications that access a device’s microphone, camera and other elements without needing to know the device’s specifications. This makes it much easier to scale applications and manufacture products without redeveloping a solution for every unique device.

Becker describes the Apex.AI SDK as a similar solution for mobility applications: the package contains everything needed for autonomous mobility, including autonomous driving. “It’s not an operating system like Windows or macOS, but it’s a meta operating system that is independent of hardware,” explained Becker. Continuing the analogy, Becker noted that phones used to be hardware-defined, like the old Nokia devices, whereas today’s Android and iOS phones are software-defined: the software shapes the device far more than the hardware does. The company wants to see the same shift in the autonomous vehicle industry. Instead of hardware-defined vehicles, Apex.AI envisions software-defined vehicles running applications developed in its SDK.

To develop Apex.OS, the company started from the popular open-source Robot Operating System (ROS), a collection of software libraries and tools for building robot applications. Currently, most companies use ROS for proofs of concept and prototyping. “Although it shortens initial development times, [the software] still can’t reach scale,” explained Becker. “[ROS] is not real time or safety certified, but now Apex.OS is real time, reliable and safety certified.” The company wanted to preserve the familiarity and utility of ROS that most engineers rely on while building on the platform to create an operating system where applications can realistically reach scale.
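
To make the relationship to ROS concrete, the sketch below shows what a minimal ROS 2-style node looks like in C++. This is an illustrative example rather than Apex.OS’s actual certified API: the node name, topic and timing values are assumptions, but applications built with a ROS-derived SDK follow this same publish/subscribe pattern.

```cpp
// Minimal ROS 2-style publisher node (illustrative sketch, not Apex.OS's exact API).
#include <chrono>
#include <memory>

#include "rclcpp/rclcpp.hpp"
#include "std_msgs/msg/string.hpp"

using namespace std::chrono_literals;

class StatusPublisher : public rclcpp::Node
{
public:
  StatusPublisher() : Node("status_publisher")
  {
    // Topic name "vehicle/status" is a placeholder for this example.
    publisher_ = create_publisher<std_msgs::msg::String>("vehicle/status", 10);
    // Publish a heartbeat message ten times per second.
    timer_ = create_wall_timer(100ms, [this]() {
      std_msgs::msg::String msg;
      msg.data = "heartbeat";
      publisher_->publish(msg);
    });
  }

private:
  rclcpp::Publisher<std_msgs::msg::String>::SharedPtr publisher_;
  rclcpp::TimerBase::SharedPtr timer_;
};

int main(int argc, char ** argv)
{
  rclcpp::init(argc, argv);
  rclcpp::spin(std::make_shared<StatusPublisher>());
  rclcpp::shutdown();
  return 0;
}
```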

Overview of where Apex.AI solutions lie within a software stack. (Image courtesy of Apex.AI.)

One of the goals of developing the Apex.OS SDK was to deliver a safety-certified solution for autonomous mobility applications. To that end, Apex.AI extended ROS from 2019 to 2021 and then worked with the International Organization for Standardization (ISO) to certify the software under ISO 26262, the functional safety standard for road vehicles. The company has now achieved ASIL D, the highest automotive safety integrity level defined by ISO 26262.

Make Any Robot Autonomous

A key feature of Apex.OS is the ability to develop applications for any hardware system. The SDK can integrate with existing sensors and act as a hardware-agnostic solution for developing mobility applications. An engineering company can bring any existing device or robotic solution to the table; the hardware and software don’t need to be developed at the same time. Apex.OS acts as a hardware abstraction layer that lets engineers build solutions across existing devices and keep deploying them even as the hardware evolves.
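
As a hedged illustration of that abstraction (not Apex.AI’s or any customer’s actual code): in a ROS-style system, a lidar driver publishes a standard sensor_msgs/PointCloud2 message regardless of the vendor, and application nodes subscribe to the topic without knowing which device produced the data. Swapping the sensor means swapping the driver node, not the application. The topic name below is assumed.

```cpp
// Hardware-agnostic consumer: subscribes to a standard point cloud topic,
// regardless of which lidar driver publishes it. Topic name is illustrative.
#include <memory>

#include "rclcpp/rclcpp.hpp"
#include "sensor_msgs/msg/point_cloud2.hpp"

class CloudConsumer : public rclcpp::Node
{
public:
  CloudConsumer() : Node("cloud_consumer")
  {
    subscription_ = create_subscription<sensor_msgs::msg::PointCloud2>(
      "lidar/points", rclcpp::SensorDataQoS(),
      [this](const sensor_msgs::msg::PointCloud2 & cloud) {
        // Application logic sees only the standard message type,
        // never the vendor-specific driver underneath.
        RCLCPP_INFO(get_logger(), "Received cloud with %u points",
                    cloud.width * cloud.height);
      });
  }

private:
  rclcpp::Subscription<sensor_msgs::msg::PointCloud2>::SharedPtr subscription_;
};

int main(int argc, char ** argv)
{
  rclcpp::init(argc, argv);
  rclcpp::spin(std::make_shared<CloudConsumer>());
  rclcpp::shutdown();
  return 0;
}
```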

Most companies that start using Apex.OS are already running their hardware on ROS. From there, in conversation with Apex.AI, they can try the SDK while retaining all the benefits of ROS. Beyond initial testing, a software license is required to reach scale.

Realizing the Potential of Autonomous Driving, in a Farmer’s Field

It’s important to realize that Apex.OS isn’t just for autonomous cars; it’s for autonomous anything. For example, in early June, Apex.AI announced that Apex.OS is being incorporated into an autonomous farming robot developed by the agricultural machinery company AGCO. Using the Apex.OS SDK, the AGCO engineering team integrated advanced autonomous driving features such as LiDAR object detection, as well as collision detection and prevention, into its autonomous farming vehicle, Xaver.
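
AGCO has not published the Xaver software, but the collision-prevention idea can be sketched in ROS-style terms: a node watches the distance to the nearest detected obstacle and commands the robot to stop when that distance drops below a safety threshold. The topic names, message types and threshold below are assumptions made for illustration only.

```cpp
// Illustrative collision-prevention node: commands zero velocity when the
// nearest detected obstacle is closer than a safety threshold. All names
// and values here are assumed for the example.
#include <memory>

#include "geometry_msgs/msg/twist.hpp"
#include "rclcpp/rclcpp.hpp"
#include "std_msgs/msg/float32.hpp"

class CollisionGuard : public rclcpp::Node
{
public:
  CollisionGuard() : Node("collision_guard")
  {
    cmd_pub_ = create_publisher<geometry_msgs::msg::Twist>("cmd_vel", 10);
    distance_sub_ = create_subscription<std_msgs::msg::Float32>(
      "perception/nearest_obstacle_distance", 10,
      [this](const std_msgs::msg::Float32 & distance) {
        if (distance.data < kStopDistanceMeters) {
          // Command zero velocity; a production system would also latch the
          // stop state and report the fault to the operator.
          cmd_pub_->publish(geometry_msgs::msg::Twist{});
        }
      });
  }

private:
  static constexpr float kStopDistanceMeters = 1.5f;  // assumed threshold
  rclcpp::Publisher<geometry_msgs::msg::Twist>::SharedPtr cmd_pub_;
  rclcpp::Subscription<std_msgs::msg::Float32>::SharedPtr distance_sub_;
};

int main(int argc, char ** argv)
{
  rclcpp::init(argc, argv);
  rclcpp::spin(std::make_shared<CollisionGuard>());
  rclcpp::shutdown();
  return 0;
}
```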

In conversation with engineering.com, Christian Kelber, director of engineering at AGCO, explained that the benefit of Apex.OS is this middle layer of software, which addresses the challenge of scaling autonomous mobility into a functional product. “At the system level, hardware, software, and mechanics can all work seamlessly together,” explained Kelber. Because the software does not need to be redesigned, the Apex.OS-based solution can easily be carried over to new models as they are developed.

Kelber and AGCO turned to Apex.AI to help them meet the specific challenges of developing an autonomous farming robot. “One challenge is that agricultural machines are not cars. Many solutions that work well with autonomous vehicles won’t necessarily work in agriculture. Just because it works on the road doesn’t mean it will work well in the field.”

In recent years, a lot of progress has been made with autonomous vehicles, but AGCO struggled to hit the same milestones in a farming environment. AGCO has worked with Apex.AI since December 2020, and the two companies have adapted their hardware and software solutions together. Kelber mentioned that Apex.AI was involved throughout the development process, providing suggestions for the robot and its autonomous driving applications.

The Fendt Xaver autonomous robot began as a research project and is now being developed by AGCO for global applications. The robot, which can plant seeds 24 hours per day with centimeter precision, is electric, produces zero emissions, and uses 90 percent less energy than conventional farming machinery. Using Apex.OS, AGCO made a software stack for the robot that meets automotive industry standards. Now, an entire fleet of Xaver machines can be controlled via an app through the cloud, and each robot provides real-time feedback to the operator.

The Xaver autonomous farming robot in the field. (Image courtesy of AGCO.)

The Apex.OS software can be used for applications beyond agriculture. “Anything that moves and requires either performance or safety guarantees can benefit from the solution,” added Becker. Many industries can benefit from autonomous mobility, including aviation, robotic surgery, driver assistance systems, factory automation and more.

Benefits of the Xaver farming robot built on Apex.OS. (Image courtesy of Fendt.)

Realizing Autonomous Mobility for All Industries

Becker added that, over the next few years, Apex.AI will focus on expanding its current customer base. The goal is for the solution to be deployed in products available on the market. He mentioned that the AGCO Xaver project is an excellent first step, but that over the next few months the company will be announcing many additional customers using its software for mobility applications across diverse industries.

Additionally, a separate Apex.AI goal is to improve the efficiency of data communication within its software solutions. The company is currently working on a solution that combines the SOME/IP communication protocol with the Data Distribution Service (DDS) protocol for a more modern product that better facilitates cloud communications. The solution will make it easier for applications to move from a device to the cloud, so companies can transition from running a few robots during testing to thousands of robots during deployment.
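
The combined SOME/IP and DDS product Becker describes is not public, but the DDS side of the picture is familiar from ROS 2, which already uses DDS as its transport. The sketch below shows how a fleet-telemetry publisher might tune DDS quality-of-service policies (history depth, reliability, durability) so that a late-joining cloud gateway still receives recent messages. The topic name and the specific settings are illustrative assumptions, not a description of Apex.AI’s product.

```cpp
// Illustrative QoS configuration for a telemetry topic bound for a cloud
// gateway. ROS 2 exposes these Data Distribution Service (DDS) policies
// directly; names and values here are assumptions for the example.
#include <memory>

#include "rclcpp/rclcpp.hpp"
#include "std_msgs/msg/string.hpp"

int main(int argc, char ** argv)
{
  rclcpp::init(argc, argv);
  auto node = std::make_shared<rclcpp::Node>("fleet_telemetry");

  // Keep the last 50 samples, resend until acknowledged, and replay them to
  // subscribers that connect late (e.g., a cloud bridge that reconnects).
  rclcpp::QoS qos(rclcpp::KeepLast(50));
  qos.reliable().transient_local();

  auto publisher = node->create_publisher<std_msgs::msg::String>(
    "fleet/telemetry", qos);

  std_msgs::msg::String status;
  status.data = "field_unit_ok";
  publisher->publish(status);

  rclcpp::shutdown();
  return 0;
}
```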

When asked about latency, Becker explained that companies can choose where applications run: safety-critical features can run on the device, while planning applications and other noncritical features can run in the cloud.

AGCO, in contrast, is continuing to test the Xaver autonomous robot in field trials around the world. Kelber explained that farming conditions vary greatly depending on geography, and that the company is continuing to robustly test the robot before the device becomes commercially available.

A hardware-agnostic approach to autonomous mobility development looks like a strong fit for the field, allowing companies to improve their autonomous capabilities without reinventing the wheel for each new device. As Apex.AI is still growing, it will be interesting to see how the customers it announces over the next few months use the company’s platform to further their own autonomous applications. It will be an exciting space to watch as more and more companies look to bring their autonomous solutions to maturity.