It's Not Complicated; It's Just Hard
Mar 5
The be radical team spent the latter half of February delivering our new immersive FutureFWD [“Future Forward”] program to business leaders in Dubai and Johannesburg. Since returning, we’ve been navigating the fog of jetlag and reflecting on new insights sparked by the rich conversations we enjoyed on the road.
In framing FutureFWD, a two-day learning journey into the disruptive implications of emerging technologies and convergent economic and social trends, we took a moment to point out that the change required (of the organization and the organizational leader) to leverage these trends and realize new opportunities in this evolving context isn't complicated; it's just hard.
And we believe that the change is hard because it’s complex.
This can sound like a game of semantics, but hear us out. Understanding the difference between the complicated problem and the complex challenge (a distinction fruitfully elaborated more than a decade ago in the work of David Snowden) is critical to making sense of the unfolding future and the organizational leader's limited but real capacity to actively shape a preferred future, rather than simply allowing the future to be a thing that happens.
Complicated problems admit of technical solutions. They are the domain of discernible cause-and-effect relationships, reliable models, accurate predictions, replicable solutions, and good (if not best) practice based on expert knowledge and relevant past experience.
Complex challenges are defined by their unpredictability. Cause-and-effect relationships tend to be clear only in retrospect, and outcomes are highly dependent on ecological interrelationships in a dynamic, adaptive system. Solutions are hard to come by and difficult to maintain, and the best bet tends to be on emergent practice.
Building a video hosting and sharing platform with a powerful recommendation algorithm is a complicated problem. Managing the spread of disinformation and conspiracy theories within a huge user community on such a platform is a complex challenge. Or to bring this into focus with a particularly of-the-moment example: Creating a vaccine might be a (very) complicated problem, but managing the high-level response to a disease outbreak is a frighteningly complex challenge.
By definition, human systems are complex systems, and effecting change in a human system is a correspondingly complex challenge. For better or worse, we can’t simply update the human OS (or the company culture) and revel in a world of instantaneous change and precise future fitness. Change and future fitness are both attainable, but we’re generally going to have to learn our way to both and then continually work to manage and maintain those gains.
Importing the trappings of startup culture (ping pong, anyone?) or building an innovation lab won’t breed a culture of innovation or disruptive design of products and services. If culture is an emergent property of the organization’s human systems, we have to make intentional adjustments at the system level — experiments in the who and how of interrelationships — to learn our way to new ways of doing, thinking & being. A rich example we’ve been researching (and one we’ll be discussing in more detail in future Briefings) is the challenge of designing the org to enable disruptive work at the Edge while maintaining the Core of the business.
It’s not complicated; it’s just hard. And it’s hard because it’s complex.
Navigating (and even embracing) complexity and a complex future is made easier with the right practices and principles, all of which should support smart experimentation and rapid learning. In forthcoming radical Briefings, we'll be exploring some of our own practices and principles in both detail and application.
We’ll start by building on something we discussed in the last Briefing (#0017: The Signals Are Talking): spotting and studying weak signals. This is a practice we find profoundly useful for exploring possible futures and the tendencies of a complex and rapidly evolving operating environment. We’re excited to develop this cognitive muscle, and we invite you to join us in flexing it.
Beginning with this Briefing (see below), we’ll be tapping experts in our network to share some of the weak signals they’re watching in 2020 with the aim of facilitating both imagination and sense-making in the be radical community.
Jane, Mafe, Amber, Jeffrey and Pascal
P.S. Interested in exploring how this applies to your organization and your products & services? Find out how be radical can help you. Simply hit reply to this email, tell us a bit about yourself and the opportunity/challenge you face, and we will be in touch.
The Network Effect:
Each Briefing, we’re tapping some of the experts in the be radical network to join the conversation. This round, we’re spotting weak signals with Samantha Snabes and Dr. David Bray.
Samantha Snabes, Co-founder and Catalyst at re:3D Inc
Samantha is the CEO of re:3D, where she works (dirty fingernails and all) to facilitate global connections among others printing at human scale and/or using recycled materials. Here are some of the weak signals Samantha is paying attention to in 2020:
“3D printing from recyclables, namely post-manufacturing trim & scrap, is top of mind for us these days! The more we work with different materials and organizations, the more we are appalled at how much big brands pay to sustainably dispose of waste, and at the sheer volume of waste created in manufacturing operations that could be used for internal needs (thus a double value-add) and/or to support local circular economies and new job creation. This is especially evident on islands such as Puerto Rico, which have many factories, high unemployment, landfills at capacity, and import 85% of goods!”
Dr. David A. Bray, Senior Fellow, Institute for Human-Machine Cognition
Director of a new Center with the Atlantic Council (launching March 11)
David is incubating a new global Center with the Atlantic Council to champion positive paths forward that ensure new technologies and data empower people, increase prosperity, and secure peace. He also provides strategy advice to both boards and start-ups, espousing human-centric principles for technology-enabled decision-making in complex environments. Here are some of the weak signals David is paying attention to in 2020:
Over-saturation of discussions of AI at the very moment that actual data scientists are increasingly concerned about the fragility and brittleness of current machine learning techniques; might this create yet another “AI winter” of disillusionment in a year or so?
On an optimistic note, the increasing length of time for which larger numbers of qubits can be kept stable. Curious to see where this quantum development will take us in 3-5 years.
Increasing pessimism among technology professionals about the ability of tech to strengthen democracy, as shared by this Pew study.
We get it. There’s a lot out there. With radical Recommends, we’re not going to overwhelm you. We’ll only highlight a couple of reads/watches/listens each Briefing that are shaping our thinking, challenging our assumptions, or changing our minds.
⇒ Automatic for the People? Two musician-programmers generated every possible melody and released the collection to the public domain. An awesome example of the use of AI combined with decentralized innovation.
⇒ The new political reality of digital campaigns and disinformation wars at staggering scale is here — just in time for the US presidential election.