When you get involved in social networks, you occasionally get comments that make you think. A little while ago I received a tweet that said, “@christianve Who CARES? GET A REAL JOB INFO TECH IS DEAD.” The tweet intrigued me. What was the message? So, I tried to understand where the person who sent the tweet was coming from. To my surprise, he describes himself as “director of innovation.” That’s probably the last person I would have expected such a message to come from. Info Tech is dead? I’m not sure what innovation he directs, but when I look around, most innovation includes information technology in one way, shape or form.
The world is turning digital
Over the last 20 years the world has turned digital. Music, video, radio, TV have all migrated from analog signal processing to digital. More than 30 percent of the value of a car is electronics these days. The good old carburetor has been replaced with digitally programmed direct injection. Mobile phones use digital technologies. And I could go on like this.
Where analog signal processing used electronic logic circuits and gates, digital information processing uses programs. As a result, a lot of what used to be cast in electronic circuitry is now based on software, which can more easily be adapted in case of error or for improvement. If you write such programs, you are an IT specialist, whether you like it or not.
Consumerization of IT
The widespread availability of the Internet, and the convergence of mobile phones, portable gaming consoles, PC games and multi-media gaming consoles, have driven IT to the consumer. Where IT used to be a sort of black magic, performed by a specific department in the company, it has now become something accessible, used by all on a daily basis for work and private purposes.
Who does not have a Facebook page? Who does not share photos with DropBox, Picasa or Flickr? We all do, for personal use and even in business. So, the black magic is gone. We know enough about IT to no longer be fooled. The consumerization of IT is obviously a change for the IT department, but they also have to cope with the arrival of IT capacity that can be rented by the hour. This is the stuff we call cloud, and it may at first glance make IT redundant.
The Digital Enterprise
The enterprise of the future will use digital technology in everything it does. Be it to support business processes, to complement products, to deliver services to customers, you name it, IT will be involved. Actually, take a moment to read an interesting study from Gartner, titled “Taming the Digital Dragon: The 2014 CIO Agenda (PDF)”. Let me quote them: “CIOs are facing all the challenges they have for many years, plus a torrent of digital opportunities and threats. Digitalization raises questions about strategy, leadership, structure, talent, financing and almost everything else.” That’s the challenge of the CIO and the IT department. Either they become central to the organization or they lose the limelight and remain confined to the traditional environment. So, how should IT change?
So, is Information Technology dead?
If by “information technology” you mean the black magic I referred to earlier, the answer is probably yes. The traditional IT department that delivers a minimal service without accountability is gone. But that is not what the term information technology actually means. The Information Technology Association of America defined IT as "the study, design, development, application, implementation, support or management of computer-based information systems."
I believe that is where the issue lies. The traditional IT department needs a serious overhaul. That does not mean, however, that jobs in information technology will disappear; quite the contrary, in my mind.
The transformation of the IT department
Cloud computing is changing the role of IT. I have said this several times before. First, the private cloud requires a different organization focused on running an integrated IT environment, not silos of technologies and applications. Second, the public cloud transforms IT from a deliverer of resources to a sourcer of services. Both imply fundamental changes in the way IT operates and interacts with the business.
However, the legacy world will not disappear any time soon, as migrating large applications to the cloud can be very expensive and hard to justify from a business point of view. When introduced, enterprise resource planning (ERP) was a competitive advantage and millions were spent to implement it. Today ERP is needed for companies to operate, but it no longer differentiates enterprises from each other. So, why reinvest in ERP-type applications to make them cloud-ready? What value would that add to the business?
Cloud computing should focus on new, value-added services. The “core” ones should be delivered out of the private cloud, and the “context” ones out of managed or public clouds, to optimize the use of cloud computing for the business. IT staff with knowledge and experience in legacy systems can be retained to manage the legacy environment, while the more curious teams focus on the newer cloud-based functions and technologies. We can take advantage of the current generational transition to transform IT fundamentally.
Cloud for innovation
But cloud can and should deliver more. The digitization I talked about earlier brings with it numerous opportunities for innovation. Transforming the products and services delivered by the company, changing the ways the market is accessed, improving business processes, and linking more closely with suppliers and channel partners are all areas where cloud computing can make a difference. This, however, requires educated people: people who understand technology and how it can deliver business advantages. A new breed of IT professionals is required for these jobs. They need a good dose of technology knowledge and understanding, but also a fair basis of business acumen.
So, are there no jobs left in information technology?
With all of the above, do you believe the tweet was correct and I’d better go and find another job? I’d rather believe there is a bright future in information technology for those who are prepared to adapt. In my mind, you need two things for that. I call them the two “C’s”:
- Curiosity, to constantly question yourself, to try to understand what is coming and how it can add value to the existing environment, how it can influence the way business is done, and how it can help resolve the issues that are out there.
- Creativity, to think beyond the boundaries. It’s often by combining two or three ideas gathered in different contexts that a brand new approach pops up. That is where innovation is at its best.
The term information technology contains the word “technology,” and many people believe that is the most important part. This makes them focus on specific technologies; they become deep specialists. I believe, however, that we need a new breed of IT professionals, ones who have a good understanding of many technologies. They need to be able to articulate how those technologies work together and what business value they can provide. With cloud in particular, it’s that more global view that is critical. You can always rely on the deep technical knowledge of specialists to make things work, but it is the ability to see end-to-end, across the whole software stack, what is happening that gives a deep understanding of how things work. And that’s the type of people who will drive IT in the future.
So, be curious and creative. Try to understand how things work end-to-end and you’ll have a great future in a digital world. No need to go and find a “real” job.
Christian Verstraete is responsible for building services focused on advising HP clients in their move to cloud, particularly from a business process and application perspective. He has spent more than 30 years in the industry, working with customers all over the world, linking business and technology.
Follow him on Twitter (@ChristianVe)