Life in 2030: These are the 4 Things Experts Can't Predict
In his 1970 bestseller Future Shock, Alvin Toffler predicted a future that looks much like today’s reality. He anticipated the rise of the internet, the sharing economy, companies built on “adhocracy” rather than centralized bureaucracy, and the broader social confusions and concerns about technology. He foresaw that the evolving relationship between people and technology would shape how societies and economies develop.
This is also the focus of much of the World Economic Forum's work, which explores how technology will advance and how the challenges it raises will be addressed by the year 2030. Here are some of the uncertainties that policy-makers, corporate executives, and civil society actors face as they move into this new world.
Can we master greater connectivity?
Many are convinced that the internet will be everywhere - or nearly everywhere - in the next generation. It will be "on" most things and built into many objects and environments. Experts claim that the internet will fade into the background, becoming like electricity - less visible but deeply embedded in human endeavors. Even those without high levels of literacy will interact with digital material and apps using their voice, igniting an unprecedented expansion of knowledge and learning.
This explosion of connectivity brings new possibilities, but also economic and social vulnerabilities. The level of coordination and coding required to stitch the Internet of Things together is orders of magnitude more complicated than any previous endeavor. It is likely that things will break and no one will know how to fix them. Bad actors will be able to achieve societal disruptions at scale and from afar. Consequently, we are faced with some hard, costly choices. How much redundancy should these complex systems have? How will they be defended, and by whom? How is liability redefined as objects are networked across a global grid and attacks can metastasize quickly?
Will we create more meaningful work?
There is no consensus about whether the forces unleashed by technology destroy more jobs than they create or whether the historic pattern of human upskilling prevails as new, more valuable jobs replace those supplanted by technology. The next advancements in machines are clear, but the human response is not.
How the ecosystem of education and skills-training will adapt is extremely relevant. Colleges, community colleges and trade schools are in the early stages of adjusting to a disruption in their business model that could rival the challenges already faced by the media and music industries. Many institutions now embrace teaching through online video or hybrid courses that combine online and classroom experiences. These will all be monitored by artificial intelligence systems that assess both student performance and course effectiveness. Employees are also self-training with online material.
Will this adaptation be sufficient to the task? It depends on the talents rewarded by the next economy. When Pew Research Center queried experts, a considerable number argued that the best education programmes would be those that teach people how to be lifelong learners. Some say alternative credential systems will arise to assess the new skills people acquire.
The experts also discussed specific human talents which they doubt machines and automation will be able to duplicate for some time. These include social and emotional intelligence, creativity, collaborative activity, abstract and systems thinking, complex communication skills, and the ability to thrive in diverse environments. It is unclear whether American schools and universities can re-orient themselves to emphasize these non-technical skills.
Can trust and truth be revived?
Trust is a social, economic and political binding agent. A vast research literature on trust and social capital documents the connections between trust and well-being, collective problem solving, economic development and social cohesion. Trust is the lifeblood of friendship and care-giving. When trust is absent, all kinds of societal woes unfold, including violence, chaos and paralysing risk-aversion.
There is considerable concern that the way people use the internet is degrading trust. The fate of trust and truth is up for grabs. On one hand, many worry that the fake news ecosystem preys on deep human instincts: people's preferences for convenience, comfort, and information that reinforces their views make them vulnerable to the ways new tech tools can identify, target and manipulate them. On the other hand, humans have a decent track record of confronting problems caused by communications revolutions. There are new ways to fight back, at internet speed.
How much can social and organizational innovation alleviate new problems?
With so much upheaval, people, groups and organizations will be forced to adjust. Some primary aspects of collective action and power are already changing as social networks become a societal force. These networks are used for both knowledge-sharing and mobilizing others to action. There are new ways for people to collaborate to solve problems. Moreover, there are a growing number of group structures to address problems, from micro-niche issues to macro-global affairs such as climate change and pandemics.
New laws and court battles are inevitable and are likely to address questions such as: Who owns what information? Who can use and profit from information? When something goes wrong with an information-processing system (say, a self-driving car propels itself off a bridge), who is responsible? Where is the right place to draw the line between data capture - or surveillance - and privacy? What kinds of personal information can be legitimately considered when assessing someone’s employment, creditworthiness or insurance status? Who oversees the algorithms that decide what happens in society?
There is a long road ahead to 2030. There is a lot of opportunity to make the uncertain more certain.
Lee Rainie, Director, internet and technology, World Economic Forum. This post first appeared on the Agenda blog.