An ‘Intelligence Era’ is Revolutionizing our Relationship with Data

Dr. Colin Parris, Senior Vice President and GE Digital Chief Technology Officer, says it's essential not to miss the opportunity brought by our new relationship with data. (Image: GE)

In a rapidly transforming digital era, Dr. Colin Parris offers perspectives from a journey defined by exploration, perseverance, and a keen grasp of the digital realm's ever-changing dynamics, a journey that has brought him both challenges and victories.

Serving as GE Senior Vice President and GE Digital Chief Technology Officer, Parris recently garnered the prestigious title of 2023 Black Engineer of the Year for his indispensable contributions to digital transformation. His foundational years in engineering and his academic pursuits at Howard University, the University of California, Berkeley, and Stanford charted a trajectory to technological powerhouses like Bell Labs, IBM, and GE. In this conversation, Parris shares his professional path, including tales of accomplishments and setbacks, and highlights the importance of balancing technology and business interests. His perspectives underscore the sentiment that we are merely at the beginning of an intelligence era: a paradigm shift set to revolutionize how we engage with information.


The following has been edited and condensed for clarity.




Engineering.com: Can you describe a defining moment in your early years that, as a person of color, inspired you and guided your career direction?


Colin Parris: That's a complex question, but I’ll try to sum up my experience. I was born in the U.S. but grew up in Trinidad. My father, a professor of engineering at the University of the West Indies, was a natural role model. In Trinidad, there were many people who looked like me, and I observed the growth of engineering in the country, particularly in oil infrastructure.


When I came to the U.S., I attended Howard University, following in my family's footsteps. At Howard, I was introduced to digital transformation through courses like systems design, which hooked me on the potential of technology.


I later joined Bell Labs and IBM, working on the transformation of voice to digital technology and, later, on banking. It was a time when the problems were large and merit could outweigh identity. However, I found it difficult to get opportunities due to my background and lack of connections.


The defining moment for me was choosing to focus on tough, new areas that nobody wanted to risk their career on. Working on the hardest topics at IBM propelled me up the ladder. I gained diverse experiences that shaped me and allowed me to stand out. There were even meetings where it seemed like people weren't looking at my color; they only saw my capabilities.


So, the defining moments were a series of challenges and decisions, picking the hardest things and new areas that others didn't want to risk. Some of them turned out to be really good, and in some sense, being who I am helped me.


ENG.COM: Tell us about some of your early work.


CP: I pursued a PhD at Berkeley focusing on digital transmission and AI. What struck me about this technology was its ability to improve itself.


At IBM, we built chips using simulators, and those chips would help us create even faster chips. This process extended to writing compilers and eventually to low-code, no-code solutions. The evolution from manually writing logic to having AI write the logic was thrilling to me. It's a technology that continually makes itself better.


I was involved in high-performance computing at IBM, and I saw firsthand how the technology progressed. A computer system we sold in 2002 for around $9 million, occupying 4,000 square feet, had the same power as a $250 PlayStation by 2016. There's no other technology where performance grows while the price drops like this.


It became clear to me over the first 15 years of my career that this self-improving technology would be essential for coping with immense problems and advancing our civilization. Unlike a material like concrete, which doesn't evolve, computer technology builds on itself, and I recognized it as the right path for facing future challenges.



ENG.COM: What's your biggest professional success?


CP: My most significant professional success is related to the digital twin technology we implemented at GE. I've worked on digital transformation in various industries, but the challenge at GE was unique because of the long-lasting nature of industrial products like jet engines, gas turbines, and the complex pricing and government regulations associated with them.


The task was to use digital technology and AI to build digital twins, essentially models of these engines and turbines, to improve productivity and create new capabilities. It's a high-stakes environment where lives and billions of dollars are on the line, not just business profits.


The real success in this endeavor wasn't just the technology; it was in managing and fostering the right environment for highly intelligent people to flourish. Having spent 20 years at IBM and 10 years at GE, I learned that my job was to create an atmosphere where those smarter than me could innovate and solve problems.


The digital twin project stands out to me because of its profound impact and the way we approached it—by focusing on people and building an environment where they could bring their best selves to the project. While the number of digital twins we built and the money we saved are essential, for me, it's all about enabling people to make changes on the planet.




ENG.COM: What's your biggest failure and what did you learn from it?


CP: Oh, there have been quite a few failures, but one that stands out to me is from my time at IBM, where I once ran a $5.5 billion unit, the Unix systems business, particularly in the context of banking expansion in China.


The mistake I made was believing that technology was its own driver. We focused on building a high-performance Unix system to compete with Sun Microsystems, concentrating solely on making it faster, more scalable and cheaper. However, we completely missed the real business problem.


Around 2007-2008, China was focused on creating a harmonious society and moving towards a more structured banking system. They were opening something like 9,000 to 11,000 branches a year, intending to have the Chinese population trust the banking system instead of hiding money at home.


My failure was in not recognizing the actual requirements. Our high-performance system needed top talent to install and maintain. It was hard to use and completely missed the point that the Chinese government wanted systems that could be installed quickly, maintained by local employees with minimal skills, and were easy to run.


I had to correct the mistake by flying to China nearly every month, understanding the ground realities, and shifting our entire requirements to align with the actual business needs. That's when I learned the essential lesson that technology itself must be subservient to what you're delivering. It has to be driven by the business needs and the actual operational limitations. In research, you might have the freedom to explore, but in actual business development, understanding the business objectives and limitations is crucial.



ENG.COM: In today's manufacturing sector, where design engineers focus on incorporating the latest technologies into products, do business and systems requirements still take precedence over mere functionality?


CP: Absolutely, and it's not just the business requirements but also the systems requirements that hold precedence now. Let me explain what I mean by this…


Previously, I might have focused solely on technology. For example, with batteries for EVs, I would have concentrated on the battery's capacity, recharging speed, and lifespan. Then I evolved to consider the business aspects, like cost, maintenance, and support.


But now, I've expanded my thinking to encompass the entire system. It's not just about engineering or business-driven aspects; it's about systems-driven considerations. I look at the whole picture, including where the materials come from, ownership of the processes, deployment, support, government regulations, investment, pricing, and even geopolitical aspects like international deals and regulation.


Take the example of a battery needing lithium. I have to consider where it comes from, who processes it, the global dynamics, supply chains, and local laws. All of these aspects influence the final design, and ignoring them can lead to expensive mistakes down the road.


So, yes, it's no longer just about creating the most innovative or high-performing product. It's about understanding the system, including business, social, government, and environmental constraints. That's where the real innovation and creativity lie. This shift to a systems-driven approach recognizes that we operate in a complex, interconnected world where understanding the broader context is essential for success.



ENG.COM: The understanding of the digital twin is still rudimentary in industries like manufacturing and energy, even though we have this finely tuned tool at our disposal. What does that say about how we're conducting business?


CP: The understanding of digital twins is shaped by two main factors: timeframes and process. In consumer spaces, timeframes change rapidly, while in industrial contexts like jet engines and turbines, they can last decades. These longer timeframes bring large impacts and constraints, including on the type and amount of data collected.


The collection of data in varying environments, such as different climates and locations, presents challenges. Data mismatch, limited sensors, and the high cost of failure make predicting and using digital twins more complex.


Adapting AI technologies like humble AI and explainable AI has helped bridge some of these gaps, allowing for more accurate models and reduced business risks. Progress is slow, but efforts are being made at various levels, including by the government, to enhance the understanding and application of digital twins.
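To make the "humble AI" idea Parris mentions a little more concrete, here is a minimal Python sketch of one common interpretation: a data-driven digital twin that checks whether an operating point is close to the conditions it was trained on and falls back to a conservative estimate when it is not. The class name, distance check, and threshold are illustrative assumptions, not GE's implementation.

import numpy as np

class HumbleTwin:
    """Wraps a data-driven model and declines to extrapolate far beyond its training data."""

    def __init__(self, model, training_inputs, max_distance, safe_default):
        self.model = model                        # any fitted predictor with a .predict() method
        self.training_inputs = np.asarray(training_inputs, dtype=float)
        self.max_distance = max_distance          # how far from known data the model is still trusted
        self.safe_default = safe_default          # conservative estimate used outside that range

    def predict(self, x):
        x = np.asarray(x, dtype=float)
        # Distance from this operating point to the nearest condition seen in training.
        nearest = np.min(np.linalg.norm(self.training_inputs - x, axis=1))
        if nearest <= self.max_distance:
            return self.model.predict(x.reshape(1, -1))[0], "model"
        # Outside known conditions: be humble and return the conservative answer instead of guessing.
        return self.safe_default, "fallback"

The point of the sketch is the reduced business risk Parris describes: when the twin is asked about conditions it has never seen, it flags that fact rather than producing a confident but unreliable prediction.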



ENG.COM: Is AI going to replace engineers?


CP: No. Let's think about AI in two ways. AI is a tool, right? So when I give an engineer a scope, do I think the scope is going to replace the engineer?


It's a tool, and how I use the tool matters. It can broaden my perspective, enhance creativity, and shorten mundane tasks, but it'll never replace us. The other fundamental limitation is that a lot of AI algorithms need a lot of data. I can do things like transfer learning and human feedback loops, as in reinforcement learning, but you need the human because we have the ability to understand and work with others. So no, AI is not going to replace us. We just need to think about the tool differently and how we use it. It can even ask me questions that make me smarter, but it's still a tool and will remain that way.
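As a rough illustration of the transfer learning Parris mentions as one answer to the data problem, the sketch below reuses a network trained on a large dataset and retrains only a small output layer on whatever limited data is at hand. It assumes PyTorch and a recent torchvision; the three-class head and learning rate are placeholders, not a reference to any specific project.

import torch
import torch.nn as nn
from torchvision import models

# Start from a network already trained on a large public dataset.
backbone = models.resnet18(weights="IMAGENET1K_V1")

# Freeze what the large dataset already taught the network.
for p in backbone.parameters():
    p.requires_grad = False

# Replace the final layer with a small head for the new task (e.g. 3 defect classes).
backbone.fc = nn.Linear(backbone.fc.in_features, 3)

# Only the new head's parameters are trained, so a few hundred labeled
# examples can be enough instead of millions.
optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)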



ENG.COM: You’ve said that society is experiencing an intelligence change. Can you explain what you mean by that?


CP: Absolutely. I grew up in a time when education was limited to the professor in front of the class and a few reference books in the library. Today, I have access to 4,000 courses from top institutions like MIT, Stanford, and Berkeley, as well as a million articles and expert opinions. And then there's AI to help summarize all that information.


In the past, I would have spent 80 percent of my time doing the base work, but now it's reduced to 20 percent. The remaining time can be spent on creative things. For example, by using ChatGPT, I can have a conversation that remembers what I've told it. Unlike a simple Google search that doesn't retain information, AI can remember, help me generate ideas, and aid in creating something brilliant out of them.


Everyone in the world is tied to this vast network of knowledge now. When a new piece of information appears, the algorithm can find it and bring it to me. I didn't have that 50 years ago. When I did my PhD, references were between two and five years old. Now, they're between two months and two years. The rate of change of data is enormous. That's what I mean when I say an intelligence era is happening. It's a significant shift in how we access and utilize information, and it's essential not to miss it.
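The conversational memory Parris contrasts with a one-off search is usually just a matter of resending the whole exchange each turn. The minimal Python sketch below shows that pattern; ask_model() is a stand-in for whatever chat API is being used, not a real library call.

def ask_model(messages):
    """Placeholder for a call to a chat model; returns the assistant's reply as a string."""
    raise NotImplementedError

def chat():
    history = []                                   # the conversation's memory
    while True:
        user_text = input("You: ")
        if not user_text:
            break
        history.append({"role": "user", "content": user_text})
        reply = ask_model(history)                 # the model sees everything said so far
        history.append({"role": "assistant", "content": reply})
        print("AI:", reply)

Because the full history accompanies every request, each new answer can build on earlier ones, which is what makes the exchange feel like a conversation rather than a series of unrelated queries.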