Fujitsu Develops a Faster SSD for Big Data

Fujitsu has announced that it has engineered a solid-state drive with flash memory that can be controlled by software running on a server. This new type of active memory control makes Fujitsu’s new SSD three times faster than traditional solid-state drives. But what’s the point?

Since their invention, computers have been tasked with crunching ever more complicated sets of data. Today, the idea that Big Data can help drive innovation in fields ranging from medicine to product design is a rallying cry for businesses around the world.

But there’s a catch.

With the accumulation of larger and larger data sets, the need to churn through them quickly is pushing computers into a bottleneck. Today, computers use dynamic random-access memory (DRAM) to build in-memory databases that pile huge amounts of data onto servers, where it can be combed through rapidly. However, when too much data is loaded into DRAM, a server can become bogged down and its processing speed sapped, an outcome that runs directly counter to today’s Big Data push.

To solve this problem, Fujitsu’s engineers have developed an SSD whose read/write commands can be directed dynamically by software installed on a host server. With this type of control, the server can strategically place data in DRAM in ways that preserve processing speed, keeping queries fast even as data sets grow.
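Fujitsu has not published the API behind this control scheme, but the general idea of host software deciding which data lives in fast DRAM and which gets pushed to the SSD can be sketched with a simple tiered store. The class below is purely illustrative: `TieredStore`, its capacity parameter, and its file-per-key SSD layout are all invented for this example, not Fujitsu’s design.

```python
# Illustrative sketch only: a host-side placement policy that keeps
# recently used keys in DRAM and evicts cold keys to SSD-backed files.
# Names and layout are hypothetical, not Fujitsu's actual interface.
import os
import pickle
from collections import OrderedDict


class TieredStore:
    """Hot entries live in a DRAM dict; cold entries are spilled to disk."""

    def __init__(self, dram_capacity, ssd_dir):
        self.dram_capacity = dram_capacity
        self.ssd_dir = ssd_dir
        self.hot = OrderedDict()  # key -> value, most recently used last

    def _ssd_path(self, key):
        return os.path.join(self.ssd_dir, f"{key}.bin")

    def put(self, key, value):
        self.hot[key] = value
        self.hot.move_to_end(key)
        self._evict_if_needed()

    def get(self, key):
        if key in self.hot:  # DRAM hit: the fast path
            self.hot.move_to_end(key)
            return self.hot[key]
        # DRAM miss: read from the SSD tier and promote back into DRAM
        with open(self._ssd_path(key), "rb") as f:
            value = pickle.load(f)
        self.put(key, value)
        return value

    def _evict_if_needed(self):
        # Spill the least recently used entries once DRAM is over budget
        while len(self.hot) > self.dram_capacity:
            key, value = self.hot.popitem(last=False)
            with open(self._ssd_path(key), "wb") as f:
                pickle.dump(value, f)
```

The point of the sketch is that eviction and promotion decisions are made by software on the host rather than by the drive itself, which is the kind of control Fujitsu’s announcement describes.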

While Big Data itself is still maturing, the era of data-driven design is only just beginning. Sensors are popping up everywhere, and most people on the planet now carry one or more at all times. Given the amount of information this paradigm can generate, Fujitsu’s breakthrough has arrived just in time. I expect more improvements like this SSD to come down the pike soon, but I do wonder: will there be a point at which hardware improvements can’t keep pace with the amount of data we’re trying to analyze?