Should platoons have a designated “hacker” assigned?

girl sitting at computer terminal cyberpunk hacker

Still catching up, so here we are.

Episode 53 of the Irregular Warfare Podcast was right on target.

In Episode 53 of the Irregular Warfare Podcast, we consider how cyber tools and weapons are used at the tactical level within irregular warfare.

DIGITAL IRREGULAR WARFARE: CYBER AT THE TACTICAL LEVEL

A smart and nuanced conversation that touches just about everything in this orbit – cyber, information warfare, psychological warfare, authorities, and more.

Reminds me of this episode: Should platoons have a designated “hacker” assigned?

Some choice excerpts below.

Being ‘afraid’ of information warfare.

In Army doctrine, we are afraid to introduce the phrase ‘information warfare.’ So, what can cyber contribute to irregular warfare? We’re going to limit ourselves if we only are allowed to talk about that in the context of creating technical effects, or using technology to create kinetic effects. I think there is a lot more possibility in the information warfare space, but we don’t have an organizational structure or an authorities structure, or a set of policies, or even a national strategy, or even a service strategy – we’re just missing all of the other stuff that allows us to execute that.

Sally White, ~14:00

I agree completely with the first part – fear of the phrase information warfare and limiting ourselves by thinking about cyber only in the context of tech. But I disagree with the second part, on being limited in our ability to operate because we’re “missing” something.

This is something that is discussed all the time – including right here. “If only” we had some mega-command or a special policy that allowed us to “do” the things we want to do. We also fail when we focus on the whiz-bang aspects of information warfare, instead of the hard work of navigating real bureaucracy.

At the end of the podcast, Sally makes some important points that get to the core of where our issues seem to lie.

There is a need for adjustment when it comes to the intersection of cyberspace as a physical domain and the cognitive informational realm that frankly is also the primary purpose of cyberspace when it comes to how we’re operating with the human element and populations. When it comes to things like cyber-enabled information operations, or the information warfare question… I think we should probably devote a bit more time and intellectual energy to thinking through what is the actual problem that we need to solve, and are we limiting ourselves by keeping things separate in their distinct bins of cyber, of psychological operations, of information operations, et cetera. Are they [these distinctions] inhibiting our ability to be effective in the broader information environment of which cyberspace is a part?

Remember lumping vs splitting?

Cyber is not IO. Cyber is not PSYOP. There are terms (and everything that comes with them) that should be lumped, and there are some that should be split.

But, I tend to agree with Sally that anyone who is in this realm does themselves a disservice by playing too close to their own specialty. This stuff has to be a team effort.

A lot of this could be solved if we stopped thinking of information warfare as the “bits and bytes” or the “nouns and verbs” and instead focused on the actions we take. Everything else comes after that.

Lastly, I love this question posed as an area of needed research.

How can we come up with an integrated theory of information that encompasses both the physical and cognitive realms?

There’s a lot more in this episode, including some really good reasons for why we don’t push some of these capabilities down to the platoon level. Worth the listen.

Enjoy these posts? Sign up for the monthly newsletter.

Ok, but what should the Army do to combat this?

quake 3 arena logo

Good episode of The Convergence Podcast last month. Guests were Joe Littell and Maggie Smith, who recently co-authored a good article on information warfare for the Modern War Institute.

In the podcast, they discuss the article and its implications for the military.

What I like about both the article and the podcast is that we are hearing directly from practitioners – in this case in the fields of psychological operations and cyber.

Often – and especially as of late – we are hearing everyone’s opinion on these fields, whether they hold expertise or not.

One thing that I think gets to the crux of many of the military’s issues in dealing with information warfare came in the form of a question. After a long back and forth on some of the background concerning information warfare on a grand scale – political polarization, distrust in media, misinformation/disinformation, etc. – the host poses the following question:

“How does the Army combat this?”

It’s not a bad question – and it is literally referencing the problem addressed in the guests’ article. The issue here is that the solution goes way beyond the scope of what the Army can do – even those tiny parts of the Army that deal exclusively with these issues.

What is the role of the Army? To win our nation’s wars.

We do ourselves a disservice if we ask it to do more than that.

There are limits to what the military can achieve in a traditional sense. Look at Afghanistan.

But there are also limits to what the military can achieve in an irregular sense. It doesn’t matter what combination of tactics, techniques, or tools you can pull together. There are extreme limits to what can be accomplished when dealing with the complexities of the human condition.

It is dangerous to think that we can fix everything – that we just haven’t discovered the right tool or educated the right people in the right way.

This isn’t a cause for cynicism. Rather, it’s a cause for critical thinking and clearly understanding the role of the military and executing accordingly.

And pushing back when asked to do the impossible.

Lastly, there was a good conversation towards the end on the need to move away from the terms misinformation and disinformation. I agree. They are used everywhere now, mostly interchangeably or without a clear meaning.

Unfortunately, I don’t think they’re going anywhere. For what it’s worth, this is how I think of them.

For those who hang in there until the end, you’ll learn a couple of interesting facts about Joe and Maggie.

“Hangin’ with railbait like you is gonna lower my rep.”

Term Warfare

a list of terms for low intensity conflict

A dual release episode from the Cognitive Crucible and the Phoenix Cast.

In this crossover episode of the Phoenix Cast and Cognitive Crucible, John Bicknell is joined by John Schreiner, Kyle Moschetto and Rich Vaccariello. The podcast hosts discuss why they started their respective casts, how they view competition, the key take-aways of their casts, the top must listen episodes, and the other podcasts they listen to.

#78 PHOENIX CAST DUAL RELEASE

I think I’ve listened to a Phoenix Cast episode before, but I wasn’t a subscriber. I am now.

Two things that I took away from this episode. The first is the idea that podcasts like these are a form of “PME” – professional military education.

That seems like a no-brainer – of course they are. But there are still a lot of folks out there that don’t listen to podcasts – which is fine. It’s a form of media – but not everyone is into it.

The second thing is the concept of “term warfare.” This is something we see all the time these days when we’re trying to describe some niche element of warfare.

Credit to David Maxwell.

We should be careful when trying to introduce a new term into the already crowded military lexicon. There’s probably already a term out there that describes whatever you’re thinking about.

On the other hand, sometimes we do need a specific term. Sometimes that term matters.

Sometimes we should split. And sometimes we should lump.

I’ve got a few of the Phoenix Cast’s episodes in my queue. The focus of their podcast is more cyber/IT – which is good, because I don’t get enough of that.

And speaking of “term warfare” and cyber – this is a reminder, cyber isn’t PSYOP. Cyber isn’t “IO.”

It is its own thing. And you have to understand it.

An information “something” article that gets it right

trivial information is accumulating every second, preserved in all its triteness

A great, tightly written article over at MWI that looks at information through the “man, train, equip” construct of preparing the Army for war.

While emphasis on operations in the information environment and the cyber domain is certainly increasing, the balance of the military’s attention remains focused on force-on-force engagements during declared conflicts. Much of the time, information and cyber are given supporting roles for kinetic operations, but recently, US Army Cyber Command announced a shift in focus from information warfare to “information advantage” for “decision dominance,” and is working to bring the concepts to the forefront of how the Army fights.

RETHINKING “MAN, TRAIN, AND EQUIP” FOR INFORMATION ADVANTAGE, Modern War Institute

Co-written by a PSYOP and Cyber officer, no less – folks in the game.

What I love about the article is that it’s not about the shiny stuff or promising some panacea through the right combination of “words and images.” The Army’s mission is to win land wars. Everything supports that. Instead of focusing on how this or that “information” tool can be used to support that, they focus on demonstrating how information already plays a key role in recruiting, training, and equipping the Army for war.

They talk about disinformation campaigns that target the military.

They talk about how lies spread faster than truth, and about the so-called ‘illusory truth’ effect – the tendency to believe a claim simply because we have heard it repeated.

How should the Army deal with this?

They write:

Specifically, to become proactive in the information environment, the Army needs to understand and predict how and what our competitors and adversaries are going to say, and be ready to deploy solutions ahead of, and in response to, competing and malicious narratives. One solution is teaching critical-thinking skills and inoculating the force by teaching soldiers to become more thoughtful consumers of media and information, especially regarding social media.

I love this.

Critical thinking is key. This isn’t going to be solved by artificial intelligence – at least not anytime soon. We need humans in the room who are astute across multiple domains and who understand the potential impacts of publishing that “edgy” Tweet or highlighting that training or social event.

This has application at both the individual and organizational levels.

Yes, we’re talking about “optics.” Optics are easy to dismiss, but they are actually important. What isn’t optics after all?

Doing the right thing is also important. We need critical thinkers who understand which way to lean at a given time. Is the juice worth the squeeze? What are the potential second and third-order effects?

That’s hard. That takes time.

On training, the authors write about how just about everything we do is now exploitable. Training is not just training anymore. It’s operations.

Specifically, they write about the Jade Helm exercise in 2015, which was the canary in the coal mine.

The information warfare tactics used against Jade Helm could be applied throughout the world, whenever and wherever the US military trains with partners and allies. In fact, we should assume those tactics will be used in the very locations that US servicemembers may be fighting the next war.

The idea of perfect secrecy is diminishing. If we want to compete, we need to recognize that now and start playing the actual game instead of the one we want to play.

Again, they offer a solution:

To gain and hold information advantage, the Army must assess the information environment before, during, and after domestic exercises—just as it does internationally—to understand the narratives surrounding the training and troop movements and to predict, preempt, and ultimately prevent false narratives from taking hold.

They close with the following:

Ultimately, the Army has taken the first steps toward recognizing the vulnerabilities inherent to the ubiquity of the information environment by pivoting away from information warfare—a term that preserves the peace-war dichotomy that is irrelevant in competition—toward achieving information advantage—a term that appreciates the information environment’s moral and cognitive aspects and its relevance to military readiness.

I’m growing to like the term “information advantage” as I get to understand it better. And couching it as they did – a term that “appreciates the information environment’s moral and cognitive aspects” – helps in understanding.

However, information advantage is such a big tent that it starts to lose some of its meaning. There are terms that we should lump and terms that we should split.

Information warfare is something that can be “done” – it’s an activity.

Information advantage – as I understand it – is a state, a confluence of things that puts a decision-maker in an advantageous position.

Information Advantage: A condition where a force holds the initiative in terms of relevant actor behavior, situational understanding, and decision-making through the use of all military capabilities.

What I’m saying is that I don’t think information advantage replaces information warfare (or psychological warfare). It’s something different, something bigger.

Kudos to the authors for a terrific, thought-provoking article.

Lumpers and Splitters

a robot spider logging

Good episode from the Cognitive Crucible featuring Mike Vickers.

During this episode, the Honorable Dr. Mike Vickers provides his thoughts on a wide range of strategic issues–all of which have connections with the information environment. Mike makes the case that America is like the cyclops in Homer’s epic poem, The Odyssey. Like the cyclops, the United States is being blinded and deceived by clever adversaries. Mike also discusses China, India, Estonian technology implementation, the authoritarian-democracy trade off, and international relations theory. He also gives a nuanced examination regarding “whole-of-nation” sloganeering. On one hand, Mike discourages simple phrases that might promote inadequate solutions; on the other, he does agree that we are at a point where we need to cohere around a national strategy and direct our instruments of power productively–including our citizenry.

#63 VICKERS ON IO AND THE CYCLOPS

As I wrote about in my most recent newsletter, there are a lot of hucksters out there when it comes to the information space. Just because you use the internet (too) doesn’t mean you understand how all of this stuff works. It’s great to hear an episode (like this one) where it is clear the guest completely gets it.

I especially enjoyed Mr. Vickers punctuating the fact that there is a difference between “cyber” and “information operations.” He correctly points out that many people – commanders especially (my thoughts, not his) – tend to lump these two things together.

And they are not the same.

Cyber is more tech-based.

Information operations are more people-based.

Sometimes it is good to “lump” things together, as we seem to be doing right now with the whole “information advantage” concept.

Sometimes it is better to “split” things apart.

On this topic (cyber/IO), we should be splitting, because the expertise required to do either is vastly different.

POWs in the Digital Era

line of american prisoners of war
Source: National Museum of the United States Air Force

This is the second Cognitive Crucible episode I’ve heard that features Professor Jan Kallberg and COL Stephen Hamilton from the Army Cyber Institute. The first discussed the idea that service members are all very likely targets of foreign influence operations – regardless of whether or not there is active armed conflict.

In this recent episode, they go a step further and discuss the need to prepare for a future where our POWs (prisoners of war) will be further exploited through the use of enhanced deep-fake technology, deception, and instantaneous communication.

More importantly, they discuss how our own institutional structures can be exploited at home by the same.

During this episode, Prof. Jan Kallberg and COL Stephen Hamilton of the Army Cyber Institute return to the Cognitive Crucible and discuss prisoner of war (POW) considerations in the digital world. After Jan recaps his recent article, In Great Power Wars, Americans Could Again Become POWs, the conversation covers the will to fight, cognitive preparation of the battlefield, and ways the enemy might harvest information about service members in advance to identify exploitable information. Both Jan and Stephen give some policy suggestions, as well.

Cognitive Crucible, #58 Kallberg and Hamilton on POWs in a Digital World

This is the type of warning that should scare you. It’s nightmare fuel.

Some things I found particularly interesting:

  • Our personal information is already out there

When social media started to emerge over a decade ago, general security guidance was to avoid putting personal information out there, be mindful of what you’re doing online, and increase your privacy settings.

Good advice, to be sure.

Further, some advised not having social media at all, while others warned that not having social media in an increasingly connected world seemed suspicious.

Well, now we’re at a place where whether you want your “stuff” to be out there or not, it’s out there. If an adversary (or a troll, or harasser) wants to scrape the internet for your stuff, it’s not hard to do.

And for the generation growing up in the shadow of all this, there will be even more “stuff” out there for the foreseeable future.

The genie is out of the bottle. It’s not going back in.

My take – this is over. We’re moving toward a society where the ability to maintain pure privacy is ending. There is little we can do at the individual level to protect ourselves completely. When you combine the growing digital ecosystem with nefarious cyber activities of state and non-state actors, our default position should be that “our information is going to get out there.”

Accept it, plan for it, and move on.

We’re really starting to put this thing together. Researchers and practitioners are weaving a quilt of what information warfare is likely to look like in the near future. It’s already happening, but we haven’t quite got it all figured out yet.

Personally, I think it’s important that we start talking about – and implementing – policies that will defend us from this. We can’t just warn that it’s going to happen. We will be caught off guard if we are not prepared.

  • POWs have congressional representatives

This was very spooky. The guests discuss the fact that in a future war, there may no longer be a need to have a POW make a public statement disparaging the United States or the war effort. A hyper-realistic fake could easily be created and beamed out to the world.

That captured service member has a congressional representative somewhere back home. What happens when these POWs are exploited with the intent of influencing domestic politics? What happens when a reporter asks Congressman X what she is doing about the captured soldier who comes from her district?

What is her statement when a dramatic video is released of that servicemember begging his congressional representative – by name – to end the war?

What happens when public pressure is placed on that same congressional representative – from her constituents – to “do something” about this?

  • “Television is an instrument that can paralyze this country.” – General William Westmoreland

There was a quick discussion on how what we are seeing now in the information age is just an extension of what we started to experience during the Vietnam War. When there are pictures and images, we pay attention. As much as we like to think we are rational creatures, our decision-making process – even at the strategic level – is often guided by emotions, “optics,” and a burning desire to “control the narrative.” These are often not rational decisions, but decisions that seek to please some interest.

How would things be different if there were no dramatic images? No compelling video? If you had to read the results of overseas operations the next day in your local newspaper, splayed out dispassionately?

I think we would address things more rationally. But I’m not certain that our decisions would always be “better.”

Again, the genie is out of the bottle. There is a role for education. There is a very important role for leaders (at all levels) to be patient and take the longer view. But there is also the realization that words, images, and video matter.

“Television brought the brutality of war into the comfort of the living room. Vietnam was lost in the living rooms of America – not on the battlefields of Vietnam.”

Marshall McLuhan
  • A picture is worth a thousand words

1st Lt. Anthony Aguilar wears the ballistic protective eyewear that prevented a bomb fragment from possibly damaging his eyes when an IED detonated near his Stryker vehicle while on patrol in Mosul. (Photo by Company C, Task Force 2-1, Feb. 2006.)

COL Hamilton shared an anecdote from a deployment: after one of the generals was shown a photo of a piece of shrapnel lodged in a soldier’s eye protection – shrapnel that would almost certainly have caused tremendous damage to the soldier’s vision – the unit rapidly purchased that particular type of eye protection. All of the statistics and lab reports in the world might not move someone to action. But a single image that demonstrates the effect might do the trick.

I don’t like it either – I wish we could be more Spock-like and make decisions based on the evidence.

But there it is.

This was a good episode – one that should have us thinking and, more importantly, moving toward crafting policies and procedures to prepare us for the kinds of deception and smear tactics we’re likely to see both in the day-to-day operations of Great Power Competition and in the next shooting war.

Storyweapons and Storywarriors

“But in the current, digitized world, trivial information is accumulating every second, preserved in all its triteness.”

Finally sat down to read this quick article by Renny Gleeson in Cyber Defense Review. Renny works in advertising, which is important, as we don’t often get perspectives on information warfare from outside the military or national security bubble. As such, this take is a bit demilitarized (hence the non-doctrinal term “storyweapon”).

Gleeson makes the argument – as many others have – that we have entered a new realm, where the confluence of marketing, digital media, computers, and psychology has made all of us more vulnerable to manipulation by adversaries. It’s a good recap and overview of things IO professionals should have a solid grasp of – the race to the bottom for clicks, the primacy of emotion over rationality as demonstrated through the important work of Daniel Kahneman in Thinking, Fast and Slow, and the fact that none of this is going away.

Nothing new here so far.

What Gleeson argues, though, is that all of this leads to the primacy of what he calls “storyweapons.” He defines the term in his opening line as “adversarial narratives that use algorithms, automation, codespaces, and data to hijack decision-making, and the stories of who we are, what we believe and why it matters.”

Storyweapons, as I read them, are not that much different from what we mean when we talk about “narrative warfare,” another non-doctrinal term that gets a lot of attention these days.

And all of these are variations or spin-offs of something doctrinal that we do know: political warfare.

Anyway, to defeat adversarial storyweapons, Gleeson argues that we need to employ our own storyweapons, writing “we need storywarriors on the field, fighting for the best version of America.”

I don’t disagree with that. The problem, as he sees it (and I do too), is that we have a hard time doing that. Here he quotes General (Ret) Jim Mattis who says “a proper understanding of our national story is absent.”

It’s not that we necessarily have to package up ideas, narratives, messages, or whatever, and get it “out there” in the “information environment.” It’s that we have to actually believe in this project – the American story – and project that through action. Certainly there is a role for information operations as it is more commonly understood (crafting themes and messages, media, PSYOP, etc.), but there is no calibration or tweaking that fixes all of this.

The concept of “storyweapons” has me skeptical, because it sounds like a simple, short-term solution to a complex, long-term problem. Making a “storybomb” and dropping it on a target audience, expecting it to produce a lasting, tangible change, is unlikely to work. Unfortunately, our incentives are aligned for the short term (politically, operationally, and personally).

Some choice excerpts from the article:

Our stories are more vulnerable than we know: our cognitive systems are hackable by everyone, from kids’ birthday party magicians to infowar adversaries. We do not see the flaws in those systems because they are features of the systems. Storyweapons leverage the infrastructure of perception to misguide, misdirect, and manipulate.

Yes, this is everywhere. And it’s not new. It’s just that many of us are only now realizing it. The advent of the smartphone and the constant notifications represent perhaps the most tactile example of this.

By biological design, outrage, fear, and the unfair light up these lower regions, grab the spotlight of our attention and short-circuit rational thought. 

This is why the ads below articles on websites you might otherwise enjoy are littered with pictures and copy designed to excite or scare you. It’s hard not to click an ad that promises photos of an actor or actress you barely remember but are curious to see – how they look now, or how they looked thirty years ago.

The ruthless economic imperative behind the zero-sum wars for attention has fueled the rise of outrage as a business model in the places we connect with who and what we love.

Yes. This also is not new, though. I’d recommend the book The Attention Merchants (2016), which covers the history of advertising. It’s always been a race to the bottom. It’s how you get snake oil salesmen and yellow journalism – concepts and tactics that are over a century old. And it’s how we get those stupid ads I just mentioned.

“life-as-software-mediated-experience”

Gross.

We will be alone together: two people looking at the same thing at the same time will sense different things. 

Here he is talking about literally looking at the same space and seeing different things, as with advertisements. Imagine walking through a mall or airport and looking at an advertisement on the wall, but seeing one thing based on your data while another person sees something else. This is already happening. More importantly, I think we are already living in a world where we can look at the exact same *physical* thing and come to completely different conclusions based on our information diets and the bubbles we live in. To me, this is more frightening. We can look at the nude emperor, admire his clothing, and everyone is okay with it and will say the clothes are beautiful.

Everyone and everything that touches software is effectively on the new Storyweapon battlefield; there is no “behind the lines.”

Yes, this is true. I feel confident that the greater public doesn’t know this yet, because a majority of the military isn’t aware of it. There are options for changing this, but it will take time and effort.

You can’t beat a “true enough” storyweapon with facts.

So true.

Enjoy these posts? Follow me on Twitter and sign up for the monthly newsletter.