From c7bf8b10fe3623314d29f9ca0f3667a24c9c1213 Mon Sep 17 00:00:00 2001
From: Dagan Henderson
Date: Fri, 26 Jul 2024 12:15:10 +0000
Subject: [PATCH] Publishes C2 in the Age of AI

---
 _posts/2024-07-26-c2-in-the-age-of-ai.md | 36 ++++++++++++++++++++++++
 1 file changed, 36 insertions(+)
 create mode 100644 _posts/2024-07-26-c2-in-the-age-of-ai.md

diff --git a/_posts/2024-07-26-c2-in-the-age-of-ai.md b/_posts/2024-07-26-c2-in-the-age-of-ai.md
new file mode 100644
index 0000000..2fa82e9
--- /dev/null
+++ b/_posts/2024-07-26-c2-in-the-age-of-ai.md
@@ -0,0 +1,36 @@
---
layout: post
title: Command and Control in the Age of AI
date: 2024-07-26
author: Trey Coleman
---

Artificial Intelligence (AI) has become ubiquitous, entering our lives so swiftly and smoothly that many of us can’t imagine being without it, even if we don’t always know we’re using it. But AI didn’t spring up overnight, and it isn’t done growing. Soon, AI will be so integrated into our daily lives that it will touch everything we do, including military operations. In fact, any military that doesn’t leverage AI will quickly be outpaced by those that do.

U.S. doctrine identifies seven joint functions in military operations: intelligence, movement and maneuver, fires, information, protection, sustainment, and command and control (C2). AI can enhance all of these functions, but the one most ripe for AI is C2. The reason is simple: AI and C2 are both about decision making.

What we call AI today is the result of decades of research across several arenas, including academia, healthcare, the military, and government. The term “Artificial Intelligence,” referring to machines performing tasks that typically require human intelligence, was coined in the mid-1950s for the Dartmouth Summer Research Project on Artificial Intelligence.
The next 20 years saw landmark projects advance the field, such as the Perceptron, a binary classifier that “learned” through iteration; [ELIZA](https://web.stanford.edu/class/cs124/p36-weizenabaum.pdf), a quasi-conversational program; [MYCIN](https://www.britannica.com/technology/MYCIN), an expert system for diagnosing and treating blood infections; and the [Neocognitron](http://vision.stanford.edu/teaching/cs131_fall1415/lectures/Fukushima1988.pdf), an early multilayer neural network that foreshadowed today’s convolutional architectures.

As the 2000s approached, AI hit a wall. Projects failed to meet expectations, often because of unwieldy on-premises computing infrastructure or insufficient processing power and memory. Hype for AI dwindled, taking funding with it. The 2000s then brought cloud computing, which eliminated a major upfront expense for businesses by abstracting hardware into geographically distributed third-party data centers. [Amazon Web Services](https://aws.amazon.com/) (AWS) started the trend, and before long nearly every major tech company had a public cloud offering. These services, along with ever-increasing computational capability, ushered in the era of Big Data.

With accessible cloud infrastructure, big data, and data-science tooling available as managed cloud products, the stage was set for modern AI. In 2012, a pivotal event launched artificial intelligence back into the spotlight: Ukrainian-born Canadian Alex Krizhevsky entered an image-recognition competition called [ImageNet](https://en.wikipedia.org/wiki/ImageNet). His entry, dubbed [AlexNet](https://en.wikipedia.org/wiki/AlexNet), paired Graphics Processing Units (GPUs) with a deep, layered neural network and beat the runner-up’s error rate by more than ten percentage points.
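The “learning via iteration” at the heart of the Perceptron, and, at vastly larger scale, AlexNet, is simple enough to sketch in a few lines of Python. This is an illustrative toy, not code from either project; the data and function names are invented for the example.

```python
# Toy perceptron: a linear binary classifier that "learns" by nudging
# its weights every time it misclassifies a training example.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """samples: feature tuples; labels: +1 or -1."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            prediction = 1 if activation >= 0 else -1
            if prediction != y:  # wrong answer: shift the boundary toward y
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1

# Linearly separable toy data: +1 above the line y = x, -1 below it.
X = [(0, 1), (1, 2), (2, 3), (1, 0), (2, 1), (3, 2)]
y = [1, 1, 1, -1, -1, -1]
w, b = train_perceptron(X, y)
```

For linearly separable data like this, the update rule is guaranteed to converge on a separating boundary; what AlexNet added, decades later, was depth, nonlinearity, and GPUs.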
The next leap came in 2017, when Google researchers released a paper, [Attention Is All You Need](https://research.google/pubs/attention-is-all-you-need/), proposing the Transformer, a novel neural-network architecture that paved the way for the Large Language Models (LLMs) we know today, such as ChatGPT, Claude, Gemini, and Llama.

Modern AI is often confused with automation. Like AI, automation has come a long way over the past several years. But automation isn’t intelligent; rooted in Boolean logic, it is simply a series of if-then statements. A key differentiator between automation and AI is decision making: an intelligent system makes decisions, while automation executes decisions that have already been made.

C2 has been a function of war since the first conflict, but like AI, C2 has transformed drastically in recent history. U.S. doctrine defines command and control as a function with two parts: a commander (the decision maker) and the means by which he or she communicates decisions to control the force. One way C2 has transformed over the past several decades is organizationally. Instead of the U.S. President managing a Department of War and a Department of the Navy, the National Security Act of 1947 and the [Goldwater-Nichols Act of 1986](https://history.defense.gov/Portals/70/Documents/dod_reforms/Goldwater-NicholsDoDReordAct1986.pdf) helped restructure the system into 11 combatant commands. Simultaneously, private and public funding has created a sea of data that no single person could fully grasp.

Recently, Air Force Secretary Frank Kendall said, “If humans are in the loop, we lose.” He’s right. In the age of data, humans aren’t fast enough, are too influenced by cognitive bias, and aren’t good at multitasking.
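The automation-versus-AI distinction above can be made concrete with a toy Python sketch. This is illustrative only; the alert scenario, function names, and data are invented for the example. Automation executes a threshold a human chose in advance, while even the simplest learning system derives its threshold from labeled data.

```python
# Automation: executes a decision a human already made (the 1,000 ft
# threshold is hard-coded by the designer).
def automated_alert(altitude_ft):
    if altitude_ft < 1000:
        return "ALERT"
    return "OK"

# A minimal stand-in for "learning": derive the alert threshold from
# past readings that an operator has already judged.
def learn_threshold(readings, labels):
    best, best_err = None, len(readings) + 1
    for t in readings:  # try each observed reading as a candidate cutoff
        err = sum((r < t) != (lab == "ALERT") for r, lab in zip(readings, labels))
        if err < best_err:
            best, best_err = t, err
    return best

past_readings = [500, 800, 1200, 1500]        # altitude readings (ft)
past_labels = ["ALERT", "ALERT", "OK", "OK"]  # operator judgments
learned_cutoff = learn_threshold(past_readings, past_labels)  # 1200 here
```

The first function applies its rule forever, no matter what the data says; the second revises its rule whenever the data changes, a crude version of the decision making that separates AI from automation.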
C2 is essentially resource management. People with the proper authority make decisions about resources, those decisions are communicated to the fighting force, and combatants act on them. Those decisions have always been informed by data. But the amount of data available to warfighters today is unprecedented, and no human, or even a team of humans, can keep up with it. AI can process that data orders of magnitude faster than the human mind.

Human decision making is influenced by various biases, such as confirmation bias (favoring information that supports our beliefs), anchoring bias (over-relying on initial information), and recency bias (favoring recent events over older ones). Military operations seek to exploit cognitive bias all the time. Take, for example, the (weak) attempts by the Russians, prior to the invasion of Ukraine, to make their force buildup look like an exercise in order to maintain an element of surprise. Desensitization activities like these specifically target human cognitive bias.

Beyond our biases, humans are not built for multitasking. Sure, we’re capable of it. But [studies](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7075496/) have shown that individuals almost always take longer to complete tasks, and make more errors, when switching between them. Yet multitasking is paramount to military operations. A commander is rarely only commanding. And that commander may be further impaired by fatigue and stress, amplifying the deleterious effects on his or her faculties. So what’s a human to do to shore up these shortcomings? Look to the data, of course.

> “If humans are in the loop, we lose.” —Air Force Secretary Frank Kendall

The human brain offers advantages that AI can never replace: intuition, creativity, and experience chief among them.
But AI can help bolster human capability, turning humans into superhumans, by making the decisions for humans that don’t require intuition or creativity, taking the human out of the loop and putting him or her on the loop instead. Humans and AI are at their best when they come together in a beautiful gestalt, becoming more than the sum of their parts.

Raft’s data products are helping transform the military into a [data-centric force](https://teamraft.com/2024/05/23/data-platforms.html). Our products aggregate movements, positions, and real-time communications so commanders can better coordinate responses and adapt to changing conditions. They use advanced algorithms and real-time updates to minimize risk and maximize tactical advantage for ground and air units. And they integrate inventory levels, consumption rates, and resupply timelines to help maintain operational readiness and efficiency. Raft provides the tools, powered by AI, that help humans make better decisions, faster.