5 Machine Learning Trends That Will Train the IoT Market

Don’t let stagnant data lakes turn into a shade of blue-green. Machine learning helps ensure that information will flow into a river of ideas. (Image courtesy of the State of Delaware.)

The Internet of Things (IoT) offers engineering teams an innovative way to collect data and observe the status of their products, services and equipment in the field.

However, IBM reported that as much as 90 percent of the information collected by the IoT is left unused in a dead sea of data.

Fortunately, there is an emergent technology growing concurrently with the IoT that has the potential to stave off the hypoxia in these stagnant data lakes and instead turn them into a healthy ecosystem of usable information. By funneling big data into machine learning algorithms, engineers can breathe life into their development cycles, operations, manufacturing and more.

Duncan Stewart, director of research at Deloitte Canada, explains that there are five basic trends that are defining the direction of the machine learning industry. Many of these changes offer opportunities to help grow the marriage of machine learning and the IoT.

1. Mobile Machine Learning Is on the Rise

The Apple A11 Bionic system on a chip (SoC) has a built-in neural network. Adding machine learning onto mobile phone chips is a trend that will likely bleed into the IoT.

The biggest trend that will certainly affect the realm of IoT is the shrinking of machine learning chips. Stewart notes that most of the flagship phones being released this year will include chips with neural network capabilities.

These chips won’t be as powerful as their cloud-based cousins. However, they are still capable of performing many machine learning tasks.

“As the technology on the smartphone gets popular, it will migrate to the IoT in a few years,” said Stewart. “We’ve already seen some smartphone machine learning chips from Qualcomm introduced to smart routers, firewalls, drones and IoT devices.”

The biggest hurdles to bringing machine learning chips onto IoT devices are the price, power drain and size of the chips.

With respect to price, Stewart reports that the cost of performing machine learning in the cloud is falling quickly (more on this later). It is reaching the point where adding a machine learning chip to an IoT device will be a costlier endeavour than crunching the numbers in the cloud.

Stewart notes that this means that when engineers design IoT devices with onboard machine learning chips, it will be to satisfy a functionality where latency is critical. “Think of a drone flying near some powerlines,” said Stewart. “The drone will need to see the lines, assume where others might be, and actuate the motors in a few tenths of a second. This would be an application where you would need a chip onboard. Adding these machine learning chips onto IoT devices will be a conscious design decision.”
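To make that trade-off concrete, here is a minimal sketch of the design decision. All names, latency figures and return values are made up for illustration; it simply routes a request to a hypothetical onboard model when the latency budget is too tight for a cloud round trip, and to a hypothetical cloud service otherwise.

```python
import time

# Assumed latency figures (seconds); real numbers depend on the device,
# the model and the network link.
ONBOARD_LATENCY_S = 0.05   # small quantized model on a local ML chip
CLOUD_ROUND_TRIP_S = 0.40  # upload, cloud inference, download

def run_onboard_model(frame):
    """Placeholder for inference on an onboard machine learning chip."""
    time.sleep(ONBOARD_LATENCY_S)
    return {"obstacle": True, "confidence": 0.71}

def run_cloud_model(frame):
    """Placeholder for sending the frame to a cloud inference service."""
    time.sleep(CLOUD_ROUND_TRIP_S)
    return {"obstacle": True, "confidence": 0.97}

def classify(frame, latency_budget_s):
    """Choose onboard or cloud inference based on how fast an answer is needed."""
    if latency_budget_s < CLOUD_ROUND_TRIP_S:
        return run_onboard_model(frame)   # e.g. a drone dodging power lines
    return run_cloud_model(frame)         # e.g. overnight fleet analytics

if __name__ == "__main__":
    frame = b"fake-camera-frame"
    print(classify(frame, latency_budget_s=0.2))  # must stay onboard
    print(classify(frame, latency_budget_s=5.0))  # cloud is fine
```

The point of the sketch is that onboard inference buys latency at the cost of accuracy, power and unit price, which is exactly the conscious design decision Stewart describes.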

As for the power drain and size issues, research is currently underway to get these mobile machine learning chips to operate on the microwatt scale and in a smaller footprint. This capability would be far more suitable for an always-on IoT device.

Unfortunately, these ultra-low power machine learning chips are still a few years away. Additionally, their capabilities would be reduced significantly. These chips should be able to recognize when someone is talking to them and if some obstacle is in the way. However, they will not be able to decipher that speech or recognize the fine details of an obstacle. For that heavy lifting, much of the data crunching will still take place in the cloud.

For now, the only IoT devices that implement an onboard machine learning chip will require a significant battery pack, like a cell phone, or they will need to be continuously plugged in, like an appliance.

However, Stewart’s supposition is that if you were to add some machine learning to your IoT device, chances are that at this time you would be the first—or one of the first—to market.

2. Data Scientists Can Now Divert Attention to the IoT

Data scientists can now focus on IoT instead of being locked into the automation space thanks to software freeing up time spent on tedious data cleanup.

Another machine learning trend that Stewart thinks will affect the IoT market is that many tedious data science tasks will be, or already are, automated by software.

Tasks like scrubbing the data, removing trivial errors and tossing out junk data constitute a large portion of a data scientist’s day-to-day work. These tasks are repetitive, prone to errors and time consuming. They are therefore better suited to automation.
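As a rough illustration of the kind of cleanup that lends itself to automation, the sketch below uses pandas on a fabricated IoT temperature feed. The column names, thresholds and sample values are all assumptions, not a reference implementation.

```python
import numpy as np
import pandas as pd

# Fabricated sensor feed: a duplicate transmission, an impossible reading and a gap.
raw = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2018-01-01 00:00", "2018-01-01 00:05", "2018-01-01 00:05",
        "2018-01-01 00:10", "2018-01-01 00:15", "2018-01-01 00:20",
    ]),
    "temp_c": [21.3, 21.4, 21.4, 999.0, np.nan, 21.7],
})

def clean_sensor_feed(df, low=-40.0, high=85.0):
    """Automate the scrubbing a data scientist would otherwise do by hand."""
    df = df.drop_duplicates()                                     # repeated transmissions
    df.loc[~df["temp_c"].between(low, high), "temp_c"] = np.nan   # junk readings
    df = df.set_index("timestamp").sort_index()
    df["temp_c"] = df["temp_c"].interpolate(limit=2)              # fill short gaps only
    return df.reset_index()

print(clean_sensor_feed(raw))
```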

“Eighty percent of this can be automated,” argued Stewart. “This doesn’t mean to fire these practitioners; they are in too much demand. This just means that the data scientist isn’t wasting time. All of their efforts can be put into value-added results.

“A current constraint to bringing machine learning to the IoT is that much of the time these data scientists have has been eaten up by other industries like automation,” added Stewart. “Now, with software automating much of the tedium, these scientists can focus on IoT applications.”

3. Synthetic Data Develops Machine Learning IoT Systems Faster

The data requirements to set up a machine learning algorithm have been reduced thanks to synthetic data.

Another trending tool that IoT engineers should keep in their back pocket is the idea of synthetic data.

To create a machine learning algorithm, users need a large amount of data. Sure, you might have that data lake from years of collecting IoT data. However, you might just as easily be creating your IoT data collection system now.

Stewart explains that in these cases, engineers can use data that is similar to the data they need to begin the training of their machine learning algorithm.

“Synthetic data is mixed with real data; it is useful in IoT applications where you don’t have the data volume yet,” said Stewart.

This means that engineers can develop IoT systems that they want to immediately link to a machine learning algorithm. They just need to find or build synthetic data to jump-start the training of their algorithms.
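A minimal sketch of the idea follows. The sensor statistics, class labels and sample counts are invented for illustration: synthetic vibration readings are drawn from assumed distributions and mixed with the handful of real samples collected so far, so training can begin before the data lake fills up.

```python
import numpy as np

rng = np.random.default_rng(42)

# The small amount of real data collected so far (vibration in mm/s).
real_normal = rng.normal(2.0, 0.3, size=50)
real_faulty = rng.normal(5.0, 0.8, size=5)      # faults are rare early on

# Synthetic readings drawn from assumed distributions for each class.
synth_normal = rng.normal(2.0, 0.4, size=1000)
synth_faulty = rng.normal(5.0, 1.0, size=1000)

# Mix real and synthetic data into one labelled training set.
X = np.concatenate([real_normal, synth_normal, real_faulty, synth_faulty])
y = np.concatenate([
    np.zeros(len(real_normal) + len(synth_normal)),  # 0 = normal
    np.ones(len(real_faulty) + len(synth_faulty)),   # 1 = faulty
])

print(f"{len(X)} training samples, {int(y.sum())} labelled faulty")
```

As more real data arrives, the synthetic portion can be scaled back or regenerated to match the observed distributions.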

4. New Machine Learning Hardware Saves on Costs

The Stratix IV FPGA from Altera has been used in data processing applications. (Image courtesy of the Altera Corporation.)

Stewart notes that there are a handful of new machine learning chips out there that are making it much more affordable to crunch numbers on the cloud.

Traditionally, this space was dominated by graphics processing units (GPUs). After all, machine learning is much better suited to the parallel processing of GPUs than to the sequential processing of CPUs.

However, field-programmable gate arrays (FPGAs) and application-specific integrated circuits (ASICs) have grown in popularity for data centers with a machine learning focus. Stewart notes that these tools will make it easier, and more affordable, to perform machine learning.

“This might not be that big of a change for those already using machine learning like financial institutions,” argued Stewart. “When the algorithms detect irregular credit card behavior, it makes it easy to make the ROI argument for someone like Visa.

“On the other hand,” he added, “with IoT, many of the benefits seem incremental. They might aggregate into a big ROI, but with each increment improving the system by a half percent, there are many hurdles to calculate that overall ROI. By making it more affordable to perform machine learning in the cloud, you make that IoT ROI higher.”
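To see how those half-percent increments can stack up against the cost of running the algorithms, here is a back-of-the-envelope calculation. Every number in it is purely illustrative, not sourced from Stewart or Deloitte.

```python
# Purely illustrative: how many 0.5 percent improvements does it take
# to justify an assumed annual machine learning spend?
baseline_operating_cost = 1_000_000   # dollars per year (assumed)
improvement_per_step = 0.005          # each increment trims 0.5 percent
ml_cost_per_year = 20_000             # assumed cloud machine learning bill

for n_steps in (1, 5, 10, 20):
    remaining = baseline_operating_cost * (1 - improvement_per_step) ** n_steps
    savings = baseline_operating_cost - remaining
    verdict = "above" if savings > ml_cost_per_year else "below"
    print(f"{n_steps:2d} increments -> ${savings:,.0f} saved ({verdict} the ML cost)")
```

Lowering the assumed machine learning bill moves the break-even point earlier, which is Stewart's argument: cheaper cloud hardware makes the aggregate IoT ROI easier to justify.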

As the cost of machine learning gets reduced to a minimum, Stewart suggests that the ROI stumbling block for the IoT will eventually become irrelevant.

5. IoT Systems Are Largely Unaffected by Machine Learning’s Black Box Limitation

Machine learning algorithms are typically black boxes. We know what comes in, and we know that what comes out makes sense, but we don’t know what occurs in between those two steps.

The discussion about the limitations of machine learning algorithms being a black box has unfortunately prevented the technology from making a difference in many industries.

Humans don’t really understand how machine learning algorithms work. We just know how to build them. We gather data, scrub it, process it, expose it recursively to a neural network during a training cycle, and then implement whichever neural network we deem is ready for inference.
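A minimal sketch of that pipeline, using scikit-learn on fabricated sensor data, shows why the result is hard to explain: the finished "program" is just whatever weights the training loop converges on, which nobody wrote by hand.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# 1. Gather and scrub: here, fabricated readings from two sensor channels.
X = rng.normal(size=(2000, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)   # an arbitrary hidden rule to learn

# 2. Train: expose the data to a small neural network repeatedly.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=500, random_state=0)
net.fit(X_train, y_train)

# 3. Inference: the learned weights are the "program"; nobody programmed them.
print("test accuracy:", net.score(X_test, y_test))
print("first layer weights shape:", net.coefs_[0].shape)
```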

The resulting algorithm wasn’t programmed per se; it was evolved. Consequently, these algorithms are difficult to understand or to explain to others.

Stewart remarks that this can be a big hurdle for those trying to deploy machine learning in a medical device. Want to create machine learning chips that can live in the human body? Stewart says that will happen sooner than you think.

Want to convince the FDA to accept that machine learning application when it is essentially a black box, even to the inventor? Good luck.

Stewart explains that for the majority of IoT applications, the benefit is that they are not affected by this black box limitation. For many in upper management, the internal workings of their products, equipment and operations are already a black box. This can also be said about most consumers and their IoT products.

Therefore, how hard will it be for IoT practitioners to get these customers and decision-makers to buy into another black box? It shouldn’t be that hard.

Are you working on an IoT product that has machine learning characteristics? If so, why not ask the engineering community for help with the implementation on projects.engineering.com.