What's the Big Deal With the Cloud?

Inspiration for this post:

Launch PLM 360
Discovery

Lately, it seems, I've had many conversations with people whose opinions I trust who are calling the whole concept of cloud computing into question. The common thread in those conversations has been how misunderstood the cloud still is, even today.

Three points came up in each of these conversations, and I think they're worth addressing here...

"Why should I trust putting my stuff on a mysterious server out there somewhere?"

I see this primarily as an issue of control. The people asking this question typically come from an IT-centric background, and their real concern is control of the infrastructure.

I've said before that it's only human nature to resist change; almost everyone feels more comfortable with the tools and techniques they already know than with those they don't. IT people are comfortable manually configuring and managing servers. What they may not realize, however, is that cloud platforms make it easier and faster to create, configure, and manage an entire infrastructure than doing so manually in your own data center. A fully provisioned virtual server can be created in a matter of minutes on almost every cloud platform available today, and scaling that infrastructure (adding or removing resources such as CPU cores, memory, or disk space) is often just a matter of making changes in the server's control panel. That's far easier than the traditional way.
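To make that concrete, here's a minimal sketch of what "provisioned in minutes" can look like in practice: a single API call describing the server you want. The endpoint and request shape here are hypothetical, not any particular provider's API, but most cloud platforms expose something along these lines.

```typescript
// Minimal sketch of provisioning a virtual server through a cloud provider's
// REST API. The endpoint and request body are hypothetical examples.
interface ServerSpec {
  name: string;
  cpuCores: number;
  memoryGb: number;
  diskGb: number;
}

async function createServer(spec: ServerSpec): Promise<string> {
  const response = await fetch("https://cloud.example.com/api/servers", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(spec),
  });
  if (!response.ok) {
    throw new Error(`Provisioning failed with status ${response.status}`);
  }
  // The provider returns an identifier for the newly created server.
  const { serverId } = (await response.json()) as { serverId: string };
  return serverId;
}

createServer({ name: "plm-app-01", cpuCores: 4, memoryGb: 16, diskGb: 200 })
  .then((id) => console.log(`Server ${id} is being provisioned`));
```

Scaling up later is typically just another call (or a control-panel change) against that same server record, rather than a trip to the data center.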

In most of these conversations, public cloud options were ruled out from the start, leaving a private cloud as the only alternative.

For those who really prefer the idea of data residing in their own meticulously maintained on-premises data centers, a private cloud is an attractive option. It carries most of the benefits of public cloud platforms while keeping the infrastructure within your own data center.

"Isn't putting everything in the cloud the same as running everything on mainframes?"

Ah, centralized computing. Yes, we have been down the mainframe road before. But that doesn't necessarily apply to cloud computing today.

Mainframes and dumb terminals were made obsolete by personal computers more than 25 years ago. That model put all the power on the server side of the equation. In the case of CAD software, it required tremendous server resources to handle memory and graphics processing for multiple users on a single machine. Distributing that load across many personal computers was far more efficient and led to the PC revolution of the 80s.

More than a decade later, we even flirted with another take on that theory. In the late 90s, after the advent of the Internet and the web, Larry Ellison (CEO of Oracle) and Scott McNealy (former CEO of Sun Microsystems) spearheaded an effort to come up with a new computing paradigm called the Network Computer. Although it was similar to what's now known as cloud computing, the idea then was for software to be distributed across the network to a browser running on a diskless computer loaded with memory and graphics processing resources. Unfortunately, the software technologies available at the time didn't lend themselves well to applications such as CAD and PLM, and Network Computing came to an end before ever really getting off the ground. The rapidly maturing software technologies of today are what make cloud computing viable now.

"Is it realistic to expect engineering tools like CAD or PLM to run as we would expect just through a browser, or on mobile devices?"

Actually the answer now is yes, if the software is designed to do so.

Software technology has evolved tremendously in the time since Network Computing.  The three points I'll focus on here are...

1. Rich, engaging user interfaces delivered over the network.

Technologies such as HTML5, AJAX, CSS3, and WebGL allow lightweight applications with rich user experiences to be delivered online, oftentimes with nothing more than a browser.
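As a small illustration of the kind of browser capability I'm talking about, here's a minimal sketch of a page checking for WebGL support before attempting any 3D rendering. The canvas id "viewer" is a hypothetical example, not taken from any particular product.

```typescript
// Minimal sketch: detect WebGL support in the browser before trying to
// render 3D geometry. Falls back gracefully when WebGL isn't available.
function initViewer(): WebGLRenderingContext | null {
  const canvas = document.getElementById("viewer") as HTMLCanvasElement | null;
  if (!canvas) {
    return null; // no viewer canvas on this page
  }
  // Ask the browser for a WebGL context; unsupported browsers return null.
  const gl = canvas.getContext("webgl");
  if (!gl) {
    console.warn("WebGL not supported; falling back to a 2D preview.");
    return null;
  }
  // Clear the canvas to a neutral background as a simple sanity check.
  gl.clearColor(0.9, 0.9, 0.9, 1.0);
  gl.clear(gl.COLOR_BUFFER_BIT);
  return gl;
}

initViewer();
```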

2. More scalable software designed to deliver apps on a wider range of devices.

Applications are becoming more responsive to the demands of both the user and the device they're running on. In the past, a CAD or PLM tool would take a significant amount of time just to initialize and open a file, because of all the overhead code needed to drive every bit of functionality the tool could possibly deliver. A web-delivered application, by contrast, can load only the functionality and user interface needed for each screen or feature the user requests, at the moment it's needed. Think of it as on-demand functionality. Desktop and laptop computers can load a traditional interface; touch devices can load a touch-enabled interface. This allows even the largest applications to be delivered on devices with very few resources, or on computers that only have browsers. It also means less application maintenance for the IT department, happy users on more devices, and increased application scalability. Overall, a better experience for everyone concerned.
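Here's a minimal sketch of what that on-demand loading might look like, assuming a hypothetical pair of UI modules for touch and desktop devices. The module paths and render functions are purely illustrative, not from any real product.

```typescript
// Minimal sketch of "on-demand" functionality: load only the UI module the
// current device needs, at the moment it's needed.
async function loadInterface(): Promise<void> {
  const isTouchDevice =
    "ontouchstart" in window || navigator.maxTouchPoints > 0;

  if (isTouchDevice) {
    // Fetched from the server only when a touch device asks for it.
    const touchUi = await import("./touch-interface"); // hypothetical module
    touchUi.render();
  } else {
    const desktopUi = await import("./desktop-interface"); // hypothetical module
    desktopUi.render();
  }
}

loadInterface();
```

The point isn't the specific check; it's that nothing is downloaded or initialized until the user's device actually calls for it.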

3. Leaner, more robust data access technologies.

Traditionally, software accesses data held in proprietary file formats, large enterprise databases, or both. The code necessary to access and interpret that data is large, and the data itself is even larger. Software development is now evolving to decouple much of that overhead, using formats such as XML and JSON to get data into the application so that only the data that's needed is exchanged, and only when it's needed.
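For example, here's a minimal sketch of an application requesting a single item's data as a small JSON payload rather than opening a proprietary file or querying an entire database. The endpoint URL and the record's shape are hypothetical, purely for illustration.

```typescript
// Minimal sketch of exchanging only the data that's needed, when it's needed.
interface Item {
  id: string;
  description: string;
  revision: string;
}

async function fetchItem(itemId: string): Promise<Item> {
  // The server returns a small JSON payload for just this one item.
  const response = await fetch(`https://example.com/api/items/${itemId}`);
  if (!response.ok) {
    throw new Error(`Request failed with status ${response.status}`);
  }
  return (await response.json()) as Item;
}

fetchItem("1001").then((item) => console.log(item.description));
```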

My Take

It's clear to me that cloud computing is here to stay. As we speak, engineering software vendors are scurrying to re-architect their tools and re-evaluate the requirements of both their users and the devices they use to access them. They are all working hard to decouple functionality from the behemoth desktop tools we've become accustomed to and deliver it in ways that will be more sustainable and functional for everyone.

Autodesk is a good example of a vendor embracing the benefits of cloud computing, restructuring its 360 line of products to take full advantage of these technologies. The new PLM 360 product also includes PLM Discovery, a tool that lets customers generate a custom report illustrating the benefits of cloud-based PLM for their organization.

To me, there is only one conclusion… Desktop applications are dying a slow death.

Long live the cloud.

What's your opinion?  I'd love to hear your comments below...


Launch PLM 360 Discovery
This is a sponsored article. All words, ideas, and opinions expressed here are 100% mine. No artificial colors, flavors, or fillers have been added.