These tech trends are generally accelerating the primary characteristics that have defined the digital era: granularity, speed, and scale. But it’s the magnitude of these changes—in computing power, bandwidth, and analytical sophistication—that is opening the door to new innovations, businesses, and business models.
The emergence of cloud computing and 5G, for example, exponentially increases compute power and network speeds, enabling greater innovation. Developments in augmented and virtual reality, the foundations of the metaverse, open the door to virtual R&D via digital twins and to immersive learning. Advances in AI, machine learning, and software 2.0 (machine-written code) bring a range of new services and products, from autonomous vehicles to connected homes, well within reach.
Much ink has been spilled on identifying tech trends, but less attention has been paid to the implications of those changes. To help understand how management will need to adapt in the face of these technology trends in the next three to five years, we spoke to business leaders and leading thinkers on the topic. We weren’t looking for prognostications; we wanted to explore realistic scenarios, their implications, and what senior executives might do to get ready.
The discussions pinpointed some broad, interrelated shifts. Technology’s radically increasing power is exerting a centrifugal force on the organization, pushing innovation to expert networks at the edges of the company. The pace and proliferation of these innovations call for radical new approaches to continuous learning, built around skills deployed at points of need. These democratizing forces mean that IT can no longer act as a centralized controller of technology deployment and operations; instead, it needs to become a master enabler and influencer. And these new technologies are creating more data about, and touchpoints with, customers, which is reshaping the boundaries of trust and requiring a much broader understanding of a company’s security responsibilities.
Shift: Innovation develops around personal networks of experts at the porous edge of the organization and is supported by capabilities that scale the benefits across the business.
These technologies promise access to virtually unlimited compute power and massive data sets, as well as a huge leap in bandwidth at low cost, making it cheaper and easier to test, launch, and scale innovations quickly. The resulting acceleration in innovation will mean that companies can expect more disruptions from more sources. Centralized strategic and innovation functions cannot hope to keep pace on their own. Companies will need to be much more involved in networks outside their organizations to spot, invest in, and even acquire promising opportunities.
Corporate venture-capital (VC) funds with centralized teams have looked to find and fund innovation, but their track record has been spotty, often because the teams lack the requisite skills and are simply too far removed from the constantly evolving needs of individual business units. Instead, companies will need to figure out how to tap their front lines, particularly business domain experts and technologists, to enable them to act, in effect, as the business’s VC arm. That’s because the people who are writing code and building solutions are often well plugged into strong external networks in their fields and have the expertise to evaluate new developments. One pharma company, for example, taps its own expert researchers in various fields, such as gene expression, who know well the people outside the company who are leaders in the field.
While companies will need to create incentives and opportunities for engineers to build up and engage with their networks, the key focus must be on empowering teams so they can spend their allocated budget as they see fit—for example, experimenting and failing without penalty (within boundaries) and deciding on technologies to meet their goals (within prescribed guidelines).
The IT organization of the future can play an important role in building up a scaling capability to make that innovation work for the business, something that has traditionally been a challenge. Individual developers or small teams working fast don’t tend to naturally think about how to scale an application. That issue is likely to be exacerbated as nontechnical users working in pockets across organizations use low-code/no-code (LC/NC) applications to design and build programs with point-and-click or pull-down-menu interfaces.
One pharma company has taken this idea to heart by giving local business units the flexibility to run with a nonstandard idea when it has proven to be better than what the company is already doing. In return for that flexibility, the business unit must commit to helping the rest of the organization use the new idea, and IT builds it into the company’s standards.
In considering how this scaling capability might work, companies could, for example, assign advanced developers to “productize” applications by refactoring code so they can scale. IT leadership can provide tools and platforms, reusable-code libraries that are easily accessible, and flexible, standards-based architecture so that innovations can be scaled across the business more easily.
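To make “productizing” concrete, consider the minimal sketch below, which refactors a hypothetical one-off script into a reusable, documented function fit for a shared code library. All names, files, and columns here are illustrative assumptions, as is the choice of Python with pandas as an approved dependency.

```python
# Before: a one-off script a business-unit developer might write.
# Hard-coded file name, no validation, nothing for others to reuse:
#
#   import pandas as pd
#   df = pd.read_csv("q3_sales.csv")
#   print(df[df["region"] == "EMEA"]["revenue"].sum())

# After: the same logic "productized" for a shared internal library.
from pathlib import Path

import pandas as pd


def regional_revenue(sales_file: Path, region: str) -> float:
    """Return total revenue for one region from a sales CSV.

    Expects columns 'region' (string) and 'revenue' (numeric).
    Raises ValueError on missing columns so failures are explicit
    rather than silent.
    """
    df = pd.read_csv(sales_file)
    missing = {"region", "revenue"} - set(df.columns)
    if missing:
        raise ValueError(f"{sales_file} is missing columns: {missing}")
    return float(df.loc[df["region"] == region, "revenue"].sum())
```

The typed signature, docstring, and explicit error handling are what turn a private script into a block that can be cataloged, discovered, and reused safely across business units.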
For more on how to empower workers at the edge, see “Tech companies innovate at the edge. Legacy companies can too,” in Harvard Business Review.
Shift: Tech literacy becomes core to every role, requiring learning to be continuous and built at the level of individual skills that are deployed at the point of need.
With the pace and proliferation of technologies pushing innovation to the edge of the organization, businesses need to be ready to incorporate the most promising options from across the front lines. This will create huge opportunities, but only for those companies that develop true tech intelligence through a perpetual-learning culture. The cornerstone of this effort is training personnel at all levels, from “citizen developers” working with easy-to-use LC/NC tools or in entirely new environments such as the metaverse, to full-stack developers and engineers, who will need to continually evolve their skills to keep up with changing technologies. We’re already seeing situations where poorly trained employees use LC/NC tools to churn out suboptimal products.
While there will always be a need for more formalized paths for foundational learning, we anticipate an acceleration in the shift from teaching curricula periodically to continuous learning that can deliver varying technical skills across the entire organization. In practice, that will mean orienting employee development around delivering skills. This requires breaking down a capability into its smallest sets of composite skills. One large tech company, for example, created 146,000 skills data points for the 1,200 technical skills it was assessing.
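To illustrate what breaking a capability into composite skills might look like as data, here is a minimal sketch in Python; the capability, skill names, and rating scale are hypothetical, not drawn from the company mentioned above.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Skill:
    """One assessable, deliverable unit of learning."""
    skill_id: str
    name: str
    proficiency_scale: int  # e.g., assessed on a 1-to-5 scale


# A single capability decomposed into its smallest composite skills.
# Each skill can be assessed and trained at the point of need,
# rather than taught as part of a monolithic course.
API_DESIGN_CAPABILITY = [
    Skill("api-001", "Define resource-oriented endpoints", 5),
    Skill("api-002", "Version an API without breaking clients", 5),
    Skill("api-003", "Write machine-readable API specifications", 5),
]
```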
The key point is that these skills “snippets,” such as a block of code or a video of a specific negotiating tactic, need to be integrated into the workflow so that they’re delivered when needed. This might be called a “LearnOps” approach, where learning is built into operations. This integration mentality is well established at Netflix, where data scientists partner directly with product managers, engineering teams, and other business units to design, execute, and learn from experiments (Netflix Technology Blog, “Experimentation is a major focus of data science across Netflix,” Martin Tingley et al., January 11, 2022).
Netflix, which makes broad, open, and deliberate information sharing a core value, built its experimentation platform as an internal product that acts as a repository of solutions for future teams to reuse. The platform has a product manager and an innovation road map, with the goal of making experimentation a simple and integrated part of the product life cycle (Netflix Technology Blog, “Netflix: A culture of learning,” Martin Tingley et al., January 25, 2022).
To support this kind of continuous learning and experimentation, companies will need to accept mistakes. The art will be in limiting the impact of potentially costly mistakes, such as the loss or misuse of customer data. IT will need to architect protocols, incentives, and systems to encourage good behaviors and reduce bad ones. Many companies are beginning to adopt practices such as automated testing to keep mistakes from happening in the first place; creating spaces where mistakes won’t affect other applications or systems, such as isolation zones in cloud environments; and building in resiliency protocols.
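As a small illustration of the automated-testing practice mentioned above, a team might guard a data-handling routine with tests that run automatically on every code change. The sketch below uses Python’s built-in unittest module; the mask_email function is a hypothetical guardrail for keeping raw customer identifiers out of logs, not an example from any company cited here.

```python
import unittest


def mask_email(email: str) -> str:
    """Mask the local part of an email address before it is logged."""
    local, _, domain = email.partition("@")
    if not local or not domain:
        raise ValueError("not a valid email address")
    return f"{local[0]}***@{domain}"


class MaskEmailTest(unittest.TestCase):
    def test_masks_local_part(self):
        self.assertEqual(mask_email("jane.doe@example.com"),
                         "j***@example.com")

    def test_rejects_malformed_input(self):
        # A failure here blocks the change in continuous integration
        # before it can leak unmasked data into production.
        with self.assertRaises(ValueError):
            mask_email("not-an-email")


if __name__ == "__main__":
    unittest.main()
```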
The global cloud microservices platform market is estimated to generate $4.2 billion in revenue by 2028, up from $952 million in 2020 (Cloud microservice platform market report, Research Dive, November 2021). GitHub hosts more than 200 million code repositories and expects more than 100 million software developers by 2025 (Paul Krill, “GitHub expects more than 100 million software developers by 2025,” InfoWorld, December 3, 2020). Nearly 90 percent of developers already use APIs (Christina Voskoglou, “APIs have taken over software development,” Nordic APIs, October 27, 2020). Software 2.0 creates new ways of writing software and reduces complexity. Software sourced by companies from cloud-service platforms, open repositories, and software as a service (SaaS) is growing at a CAGR of 27.5 percent from 2021 to 2028 (Software as a service (SaaS) market, 2021–2028, Fortune Business Insights, January 2022).
Shift: IT becomes the enabler of product innovation by serving small, interoperable blocks of code.
When innovation is pushed to the edge and a perpetual-learning culture permeates an organization, the role of IT shifts dramatically. IT can’t support this dynamic environment by sticking to its traditional role as a controlling entity managing technology from the center. The premium will now be on IT’s ability to enable innovation, requiring a shift from its traditional role as protector of big tech assets to purveyor of small blocks of code. The gold standard of IT effectiveness will be its ability to help people stitch together snippets of code into useful products.
These developments point toward much more of a “buffet” approach to technology, where IT builds useful blocks of reusable code, sometimes assembles them into specific products, and makes them available through a user-friendly cataloging system that the business can use to create the products it needs. IT provides guardrails, such as API standards and directives on the environments in which the code might be most useful; protects the most sensitive information, such as customer data and financial records; and tracks the adoption of these code blocks. This tracking capability will become particularly crucial as bots, AI, algorithms, and APIs proliferate. Transparency alone isn’t sufficient. IT will need to make sense of all this activity through advanced tech performance and management capabilities and through new roles, such as data diagnosticians and bot managers.
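One way to picture the cataloging and tracking idea is a lightweight internal registry that stores metadata about each reusable block and records every checkout. The Python sketch below is a simplified, hypothetical illustration; in practice this capability would more likely live in an internal developer portal or API gateway.

```python
from collections import Counter
from dataclasses import dataclass, field


@dataclass
class CodeBlock:
    """Catalog entry for one reusable block of code."""
    name: str
    version: str
    owner_team: str
    description: str
    tags: list[str] = field(default_factory=list)


class BlockCatalog:
    """A minimal 'buffet' catalog: register blocks, search them,
    and track adoption so IT can see what the business reuses."""

    def __init__(self) -> None:
        self._blocks: dict[str, CodeBlock] = {}
        self._checkouts: Counter = Counter()

    def register(self, block: CodeBlock) -> None:
        self._blocks[block.name] = block

    def search(self, tag: str) -> list[CodeBlock]:
        return [b for b in self._blocks.values() if tag in b.tags]

    def check_out(self, name: str) -> CodeBlock:
        # Every checkout is recorded; this is the adoption signal
        # IT needs as blocks, bots, and APIs proliferate.
        self._checkouts[name] += 1
        return self._blocks[name]

    def adoption_report(self) -> list[tuple[str, int]]:
        return self._checkouts.most_common()


catalog = BlockCatalog()
catalog.register(CodeBlock(
    name="customer-masking",
    version="1.2.0",
    owner_team="data-platform",
    description="Masks PII fields before logging",
    tags=["privacy", "logging"],
))
block = catalog.check_out("customer-masking")
print(catalog.adoption_report())  # [('customer-masking', 1)]
```

The adoption report is the point of the sketch: it gives IT the visibility into reuse that becomes crucial once code blocks, bots, and APIs proliferate across the business.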
This IT-as-a-service approach puts the product at the center of the operating model, requiring a commitment to organizing IT around product management. Some companies have been moving in this direction. But reaching the scale needed to support fast-paced and more diffuse innovation will require a deeper commitment to product owners who work with leaders on the business side of the house and run teams with real P&L responsibility. Many organizations, from traditional enterprises to digital natives, have put in place product leaders who set overall product and portfolio strategy, drive execution, and empower product owners to pursue innovation aligned with business outcomes and P&L metrics. They have found that this approach increases the return on the funding that flows to technology delivery and quickens the pace of innovation.
By 2022, it was estimated, almost 100 percent of biometrics-capable devices (such as smartphones) would be using biometrics for transactions (“Usage of biometric technology in transactions with mobile devices worldwide 2016–2022,” Statista Research Department, June 13, 2022). The effectiveness of these technologies has advanced dramatically: the best facial-identification algorithms have improved 50-fold since 2014 (William Crumpler, “How accurate are facial recognition systems—and why does it matter?,” Center for Strategic and International Studies, April 14, 2020). These developments are contributing to profound unease in the relationship between technology and the people who use it. Research from The Pearson Institute and the Associated Press-NORC Center for Public Affairs Research shows that “about two-thirds of Americans are very or extremely concerned about hacking that involves their personal information, financial institutions, government agencies, or certain utilities” (Chuck Brooks, “More alarming cybersecurity stats for 2021!,” Forbes, October 24, 2021).
Shift: Trust expands to cover a broader array of stakeholder concerns and becomes an enterprise-wide responsibility.
These enormous shifts in technology power and capacity will create many more touchpoints with customers and an exponential wave of new data about customers. Even as IT’s role within the organization becomes more that of an enabler, the expanding digital landscape means that IT must broaden its trust capabilities around security, privacy, and cyber. To date, consumers have largely embraced the convenience that technology provides, from ordering a product online to adjusting the temperature in their homes remotely to monitoring their health through personal devices. In exchange for these conveniences, consumers have traditionally been willing to provide some personal information. But a steady undercurrent of privacy and trust concerns around these ever-more-sophisticated conveniences is raising the stakes on the broad topic of trust. Consumers are becoming more aware of their identity rights, making decisions based on values, and demanding the ethical use of data and responsible AI.
The most obvious concern is cybersecurity, an ongoing issue that is already on board-level agendas. But tech-driven trust issues are much broader and are driven by three characteristics. The first is the sheer quantity of personal data, such as biometrics, that companies and governments collect, which creates concerns about privacy and data misuse. The second is that personal security issues are becoming more pervasive in the physical world: wired homes, connected cars, and the Internet of Medical Things, for example, are all vectors for attack that can affect people’s well-being. The third is that advanced analytics can seem too complex to be understood and controlled, creating deep unease about people’s relationship with technology. This concern is driving the development of “explainable AI” and the movement to debias AI.
Adding to the complexity is the frequent need to manage and secure trust across an entire ecosystem of technologies. Take the wired home, for example. The proliferation of devices—think virtual assistants, security, communications, power management, and entertainment systems—means that a large group of providers will need to agree on standards for managing, in effect, an interconnected security net in the home.
These developments require a complex extension of the boundaries of trust. The significant advantages that many incumbents enjoy, such as existing relationships with customers and proprietary data, are at risk unless businesses rethink how they manage and nurture that trust. Companies need to consider putting identity and trust management at the core of their customer experience and business processes. That can happen effectively only when companies assign a dedicated leader with real power, board-level prioritization, and enterprise-wide responsibility across the entire trust and security landscape. Given the tech underpinnings of this trust environment, IT will need to play a key role in monitoring and remediation: assessing the impact of new legislation on AI algorithms, tracking incidents, identifying the number and nature of high-risk data-processing activities and automated decisions, and, perhaps most important, monitoring consumer trust levels and the issues that affect them.
The pace of technological change will inevitably continue to accelerate. The successful technology leader of the future will need not simply to adopt new technologies but to build the capabilities to absorb continuous change and to make it a source of competitive advantage.