Source | sloanreview.mit.edu | Tucker J. Marion | Sebastian K. Fixson | Greg Brown
Throughout history, new technologies have demanded step shifts in the skills that companies need. Like the First Industrial Revolution’s steam-powered factories, the Second Industrial Revolution’s mass-production tools and techniques, and the Third Industrial Revolution’s internet-based technologies, the Fourth Industrial Revolution — currently being driven by the convergence of new digital, biological, and physical technologies — is changing the nature of work as we know it. Now the challenge is to hire and develop the next generation of workers who will use artificial intelligence, robotics, quantum computing, genetic engineering, 3D printing, virtual reality, and the like in their jobs.
The problem, strangely enough, appears to be two-sided. People at all levels complain bitterly about being either underqualified or overqualified for the jobs that companies advertise. In addition, local and regional mismatches between the skills companies want and those available in labor pools are leaving vacancies unfilled and slowing the adoption of new technologies.
Before organizations can rethink how to design jobs, organize work, and compete for talent in a digital age, they must systematically identify the capabilities they need now, and over the next decade, to innovate and survive. For more than 10 years, we’ve been studying the impact of digital design and product development tools on organizations, their people, and their projects.1 We’ve found that the competencies companies need most are business-oriented rather than technical. That’s true even for brick-and-mortar companies that are trying to become more digital.