“I don’t know what we mean when we say we ‘pursue AI.’ Do you?”
“We’re not changing to adapt to new technologies, anyway… We’re just integrating them into our current paradigm.”
“I don’t even understand what we’re supposed to do right now!”
Twenty officers sit around a table, mired in the discomfort of an “adaptive leadership” workshop. This framework, developed by Ronald Heifetz and his colleagues at the Harvard Kennedy School, is designed to help organizations make progress on complex, collective challenges, called “adaptive” challenges. Unlike “technical” problems, which can be solved with existing know-how, adaptive challenges require learning and change – adaptation – from the stakeholders themselves.
Digital transformation presents an adaptive challenge for the Department of Defense. As long as the department relies on painless “technical” solutions – what Steve Blank calls “innovation theater” – America will become increasingly vulnerable to exploitation by foreign adversaries, costing both dollars and lives. To make progress on the challenge of digital transformation – and to maintain technological superiority – the Department of Defense should re-examine and reshape its deeply held values, habits, beliefs, and norms.
The workshop officers are a prime example of a group struggling with adaptation. As in many groups, they begin by looking outward. One says: “It’s the ‘frozen environment’ that prevents us from doing anything digital,” while another adds: “Anyway, our superiors can’t agree on what they want. … What are we supposed to do?” The instructor nudges them: “It seems like the group is shifting the responsibility away from here.”
Then the officers drift away from the challenge. They swap stories of past successes, question the instructor’s credentials, and joke about the workshop itself. Once again, the instructor intervenes: “I notice that we’re avoiding uncertainty. Can we stay in the nebulous space of ‘digital transformation’ any longer? Or will we escape when we don’t know how to proceed?”
Reluctantly, they return to digital transformation, but after a few minutes they ask the instructor for help: “Are you going to step in here, or…?” The instructor replies, “You depend on an authority – someone in charge – to solve a problem that can only be solved collectively – by all of you.”
At this point, the room simmers with frustration. But the officers cannot be blamed. Their moves to avoid the work of adaptation – diverting attention from the problem and shifting responsibility elsewhere – are typical of groups facing a difficult reality.
Specifically, in what Heifetz calls “classic failure,” groups attempt to solve adaptive challenges via “technical fixes”: painless moves that apply existing know-how rather than working with stakeholders to change the way they operate.
Hire someone, fire someone, increase the budget, lengthen the timeline, create a committee, restructure the organization, build a new tool, push a new policy: these are all technical fixes that, although not inherently harmful, are easier than – and can distract from – the internal work of reassessing values, habits, beliefs, and norms.
Even today, the Department of Defense tries to approach digital transformation through technical means. It created the Joint AI Center, partnered with the Massachusetts Institute of Technology (MIT), and established the position of Chief Digital and AI Officer. These steps are not without merit: the Joint AI Center has developed AI ethical principles and a new acquisition process; MIT has produced valuable research and educational content; and the Chief Digital and AI Officer offers an opportunity to integrate various technology functions. But these actions are not enough. In fact, they are not even the hardest steps.
The real barriers to digital transformation are the deeply ingrained norms and conflicting perspectives that exist across the organization. “What is the value of technologists, really? Should they be treated differently from everyone else?”; “What about computers: can we trust them to do our jobs as well as we do? If so, what will the role of humans be?”; and perhaps most importantly, “How can we go beyond simply articulating new norms to actually living them?” These are difficult questions that touch the goals, strategies, and tasks of the Department of Defense at every level – but the answers will come only through discussion and experimentation within the defense ecosystem itself.
Back at the workshop, at least, the officers make a breakthrough. Toward the end of the session, the instructor says, “I sense a sadness in the room. Does anyone else feel it?” Predictably, everyone shakes their heads – admitting sadness feels like admitting failure – but then a major speaks up: “I’ll bite. Yeah, I feel sad. It just seems overwhelming. If we can’t rely on our commanders to do this…” He pauses. “I have no idea how we’re going to do it. Especially when we’re told to keep our heads down all the time. It’s hopeless.”
The major’s comment is the most honest moment the group has seen, and the change in the room is palpable: an hour earlier, the officers were barely aware of their own responsibility to generate adaptive work, and if they were, they did not appreciate its magnitude. Now they accept that responsibility, and they do so publicly – vulnerably – where the whole group can learn from one person’s experience. This shift is the stuff of real change.
The truth is, no one knows how a digitally transformed Department of Defense will work. But no one will find out without a collective process of trial, failure, and learning. The Department of Defense should therefore become comfortable learning from experience – gathering data through discussion and experimentation – and disseminating that learning throughout the organization. And while the Department of Defense has good reason to maintain a culture of risk aversion, avoiding learning creates its own set of risks. The world is changing, and America’s adversaries are improving their capabilities. We cannot afford to wait for our enemies to make clear that they have surpassed us.
Officers can take three steps to advance digital transformation now.
First, officers must generate and execute low-risk experiments: actions that will produce learning for the future, not actions that will produce success by today’s metrics – who knows whether those metrics will even be relevant after the transformation? For example, at the Department of the Air Force–Massachusetts Institute of Technology Artificial Intelligence Accelerator, we have experimented with multiple forms of service-member education, from live lectures and online courses to interactive exercises and project-based workshops. When an experiment produces failure, so be it: failure is a key ingredient of learning.
Second, officers must surface as many perspectives on digital transformation as possible. Who balks at digitization? Who supports it? Why? And what wisdom does each perspective hold? If everyone is part of the problem, everyone should also be part of the solution – even if that means engaging people across boundaries as the Department of Defense never has before.
Finally, officers must prepare those around them for a long period of ambiguity, where operational reality dictates that those in charge will be unable to answer critical questions. This serves two purposes. First, it helps manage expectations, so people in positions of authority can resist pressure to provide answers where there are none. Second, it allows those without authority to conduct their own experiments – to try something new and fail – and to report what they have learned.
Ultimately, transforming a system requires transforming the people within it. If the Department of Defense is seriously committed to digital transformation, everyone should be engaged in the uncomfortable and personal process of change. As the work continues, the organization and the people within it will find themselves better equipped to deal with new and challenging realities.
The workshop, meanwhile, ends on a note that applies to the entire Department of Defense: “This moment requires courage. Try better. Fail better. Learn better. One day you will look back and see that you have transformed yourselves.”
Brandon Leschchinsky is an AI Innovation Fellow at the Department of the Air Force–Massachusetts Institute of Technology Artificial Intelligence Accelerator, where he has taught AI to more than 600 service members, including more than sixty generals, admirals, and senior executives. He also works with Ronald Heifetz and others at the Harvard Kennedy School, where he has coached over 50 students, ranging from young professionals to senior executives, on complex, collective challenges.
Andrew Bowne is an Air Force judge advocate and the chief legal counsel for the Department of the Air Force–Massachusetts Institute of Technology Artificial Intelligence Accelerator. He is also a doctoral candidate at the University of Adelaide, examining the nexus between national security and AI with a focus on the role of industry. He has published numerous articles and book chapters on national security, security cooperation, contract law, rule of law, machine learning, and intellectual property.
The opinions expressed are those of the authors and do not reflect the official guidance or position of the US Government, Department of Defense or the US Air Force. Further, the appearance of external hyperlinks does not constitute an endorsement by the Department of Defense of the linked web sites, or of the information, products or services contained therein. The Department of Defense exercises no editorial, security or other control over the information you may find at these locations.
Image: U.S. Army