Should platoons have a designated “hacker” assigned?


Still catching up, so here we are.

Episode 53 of the Irregular Warfare Podcast was right on target. It considers how cyber tools and weapons are used at the tactical level within irregular warfare.

DIGITAL IRREGULAR WARFARE: CYBER AT THE TACTICAL LEVEL

A smart and nuanced conversation that touches just about everything in this orbit – cyber, information warfare, psychological warfare, authorities, and more.

Reminds me of this episode: Should platoons have a designated “hacker” assigned?

Some choice excerpts below.

Being ‘afraid’ of information warfare.

In Army doctrine, we are afraid to introduce the phrase ‘information warfare.’ So, what can cyber contribute to irregular warfare? We’re going to limit ourselves if we only are allowed to talk about that in the context of creating technical effects, or using technology to create kinetic effects. I think there is a lot more possibility in the information warfare space, but we don’t have an organizational structure or an authorities structure, or a set of policies, or even a national strategy, or even a service strategy – we’re just missing all of the other stuff that allows us to execute that.

Sally White, ~14:00

I agree completely with the first part – fear of the phrase information warfare and limiting ourselves by thinking about cyber only in the context of tech. But I disagree with the second part, on being limited in our ability to operate because we’re “missing” something.

This is something that is discussed all the time – including right here. “If only” we had some mega-command or a special policy that allowed us to “do” the things we want to do. We also fail when we focus on the whiz-bang aspects of information warfare, instead of the hard work of navigating real bureaucracy.

At the end of the podcast, Sally makes some important points that get to the core of where our issues seem to lie.

There is a need for adjustment when it comes to the intersection of cyberspace as a physical domain and the cognitive informational realm that frankly is also the primary purpose of cyberspace when it comes to how we’re operating with the human element and populations. When it comes to things like cyber-enabled information operations, or the information warfare question… I think we should probably devote a bit more time and intellectual energy to thinking through what is the actual problem that we need to solve, and are we limiting ourselves by keeping things separate in their distinct bins of cyber, of psychological operations, of information operations, et cetera. Are they [these distinctions] inhibiting our ability to be effective in the broader information environment of which cyberspace is a part?

Remember lumping vs splitting?

Cyber is not IO. Cyber is not PSYOP. There are terms (and everything that comes with them) that should be lumped, and there are some that should be split.

But, I tend to agree with Sally that anyone who is in this realm does themselves a disservice by playing too close to their own specialty. This stuff has to be a team effort.

A lot of this could be solved if we stopped thinking of information warfare as the “bits and bytes” or the “nouns and verbs” and instead focused on the actions we take. Everything else comes after that.

Lastly, I love this question posed as an area of needed research.

How can we come up with an integrated theory of information that encompasses both the physical and cognitive realms?

There’s a lot more in this episode, including some really good reasons for why we don’t push some of these capabilities down to the platoon level. Worth the listen.

Enjoy these posts? Sign up for the monthly newsletter.

That’s just Joe


Episode 93 of the Cognitive Crucible podcast. This one on information operations and the law.

If interested, I’d pair this episode with this article on the same subject from earlier in the year. Both the podcast and the article discuss similar things (free speech and the ickiness of influence operations).

Tell me the below isn’t true.

Before, if you had somebody with an extremist view, they were on the soapbox in the town square, and everybody knew – ‘that’s just Joe, that’s who he is.’ But now, the Joe in each village can link up with all the other Joes in every other village and reinforce each others’ extremist ideas and thinking.

Todd Huntley, Ep 93, The Cognitive Crucible

It is one thing to have the weird guy in your family obsessed with conspiracy theories. It’s another to have that same guy link up with others across the country and across the world.

And even that seemed to be ok for a while, so long as it seemed mostly like a nerdy hobby.

But when it mutates into action, that’s when it becomes a problem.


Behaviors shape Attitudes


A fascinating write-up in The Atlantic by Graeme Wood on Saudi Arabia. The focus is on MBS, but there is a detour that describes the Kingdom’s efforts at deradicalizing jihadists.

Instead of trying to “deprogram” or otherwise convince jihadists that their attitudes and beliefs are wrong, they have them do mundane office work.

Nothing is stranger than normalcy where one least expects it. These jihadists—people who recently would have sacrificed their life to take mine—had apparently been converted into office drones. Fifteen years ago, Saudi Arabia tried to deprogram them by sending them to debate clerics loyal to the government, who told the prisoners that they had misinterpreted Islam and needed to repent. But if this scene was to be believed, it turned out that terrorists didn’t need a learned debate about the will of God. They needed their spirits broken by corporate drudgery. They needed Dunder Mifflin.

Absolute Power, by Graeme Wood (The Atlantic)

Logical thinking tells us that in order for someone to change their behavior, they need to change their attitudes first. This is why we see influence efforts focus on convincing someone of something first, in an effort to ultimately change the behavior.

It makes logical sense, but when you start to dig into the psychological research, it doesn’t quite work that way.

It turns out that if we engage in a behavior, and particularly one that we had not expected that we would have, our thoughts and feelings toward that behavior are likely to change. This might not seem intuitive, but it represents another example of how the principles of social psychology—in this case, the principle of attitude consistency—lead us to make predictions that wouldn’t otherwise be that obvious.

Changing Attitudes by Changing Behavior

This partially explains why veterans of the wars in Iraq and Afghanistan are more likely to support those wars than the general public.

  • 53 percent say the war in Afghanistan was worth fighting vs. 30 percent of Americans overall.
  • 44 percent think Iraq was worth fighting vs. 38 percent of the general public.

Source: Washington Post, April 2014

Why is this the case? Cognitive dissonance.

Once placed into a situation (like the wars in Iraq or Afghanistan), to admit that it wasn’t worth it might impact self-esteem or self-worth. Instead of adjusting your attitude, you shift in the other direction and rationalize the behavior to alleviate that dissonance.

For the jihadists, sitting them in a room and trying to convince them that their views are wrong was fruitless. But putting them into a situation where they have to spend their time grinding through mundane work seems to have the desired effect.

Their behaviors, over time, influence their attitudes.

They have time to reflect on what they’re doing. It just kind of happens.

Powerful efforts to convince or bludgeon people with information rarely work when it comes to changing behavior. Instead, the effort should be on changing the behavior, which can then change the attitude.

Admittedly, this is much harder.

It’s easy to build a flyer with some factual information or a campaign to convince jihadists to “turn away.”

It’s not new information they need. It’s a different behavior.

Think of anyone you’ve tried to convince of something who was resistant because they had a personal experience that informs their thought.

It’s a fool’s errand.

But if you can get the same person to actually try the thing?

The behavior changes the attitude.

Creating experiences and situations where people are compelled to behave in certain ways is more likely to have the effect you’re looking for.

Anything else is shot-in-the-dark advertising.

Image Source: The Atlantic (Lynsey Addario)


Info ops and legality


Published just as the year began – I must have missed it in the deluge of activity that marks the new year.

Terrific and tightly written article on the challenge of military information operations.

This is one of the best (short) articles I’ve read that captures why we seem to be “getting our asses kicked” in the information environment. It’s not about talent, techniques, or will – it’s about authorities and norms.

As well as vision, or “commander’s intent.”

First, the prospect of military engagement to counter adversary information operations during competition raises very significant legal concerns that must be addressed—concerns foundational to our constitutional system. On the other hand, these legal concerns play a significant role in hindering the development of a coherent information strategy in competition. This article will attempt to bring these issues to light, so that the underlying and implicit concerns can be stated, which is a necessary first step to crafting an effective, comprehensive, whole-of-government strategy to respond to our adversaries’ malign influence campaigns. This article will discuss the underlying legal concerns and conclude with thoughts on the development of an integrated strategy.

Static Inertia: The Legal Challenges to Making Progress on an Effective Military Information Strategy – Modern War Institute

I especially enjoyed this upfront rationale:

Behind all the discussions is a nagging sense that the entire enterprise is just wrong—after all, the United States is a liberal democracy, we do not engage in state-sponsored propaganda, and there should be no Ministry of Truth in America. The whole prospect sounds utterly distasteful.

Yup. For lots of reasons, we tend to treat anything “psychological” as a dirty word.

Additionally, this:

The job of the military has been to fight and win our nation’s wars, not engage in propaganda campaigns, even in foreign contexts. 

Correct. But…

With its extensive cyber capabilities and resources, the US military is currently in the best position to counter the adversary in the information arena.

Agree, but this goes far beyond cyber.


Polite propaganda


My podcast diet continues to grow.

I recently finished the first three episodes of the RUSI Journal Radio – each focusing on different aspects of information warfare.

The Royal United Services Institute is a UK-based think tank. It turns out they have a bunch of different podcasts.

Here are the first three:

Episode 1: The Realities of Information Warfare

Episode 2: Emotion as a Policy Tool

Episode 3: 21st Century Propaganda

I especially enjoyed the discussion in episode 2 regarding measures of effectiveness (and the fact that they are often meaningless).

While discussing atmospherics, the host asks “how do you measure it?”

It’s hard. It’s not something easy, especially in a discipline or in an environment such as policy-making where we like things to be quantified. We want metrics to be able to show that something has impact.

But having worked in politics and policy for a few years, I’ve come across people, often politicians, strategic communicators, very good strategists, who have this innate and intuitive sense of ‘this is the mood right now, this is the moment, something has changed.’

Claire Yorke, Emotion as a Policy Tool, ~5:00

The conversation moves on to the qualitative aspects of analysis – something that doesn’t lend itself well to putting numbers on a chart. We trust this analysis because it comes from someone who has put in the work and has studied the subject matter over time.

We shouldn’t need to be wowed by the methodology.

We can measure things this way, and yes, it is subjective. But that’s ok.

So to measure it is subjective and we have to be comfortable with the ambiguity and the subjectivity of it.

This podcast also has the calmest, most unimposing intro music of any I’ve heard. A welcome break from the hum of impending doom that begins most American security-themed podcasts.


The Third Person Effect


People tend to overestimate their confidence and ability in things and discount the same in others.

We see this most clearly in driving confidence and ability.

73% of Americans believe that they are a “better-than-average” driver.

Instantly, we know something must be wrong.

There is a similar phenomenon in psychology called the third-person effect.

“…people will tend to overestimate the influence that mass communications have on the attitudes and behavior of others. More specifically, individuals who are members of an audience that is exposed to a persuasive communication (whether or not this communication is intended to be persuasive) will expect the communication to have a greater effect on others than on themselves. And whether or not these individuals are among the ostensible audience for the message, the impact that they expect this communication to have on others may lead them to take some action. Any effect that the communication achieves may thus be due not to the reaction of the ostensible audience but rather to the behavior of those who anticipate, or think they perceive, some reaction on the part of others.”

The argument here isn’t that propaganda works. The argument is that there are many people who believe propaganda doesn’t work on them, but they have concerns that it works on others.

That concern may lead the same enlightened people to take action which ultimately makes the propaganda effective.

In Davison’s paper, he cites a couple of examples from military history that take advantage of this. One is very similar to the technique Saddam Hussein purportedly used during the Iran-Iraq War to ground the Iranian F-14 fleet.

The History of the Psychological Warfare Division, Supreme Headquarters, Allied Expeditionary Force (Bad Homburg, Germany, 1945) tells us about Operation Huguenot – a project for undermining the efficiency of the German Air Force by suggesting that German flying personnel were deserting in their machines to the Allied side.

The Psychological Warfare Division history tells us:

“The dividends from this operation were expected not so much in the actual number of desertions as in the effect of the countermeasures which the German authorities would be induced to take against flying personnel… sharpening up of anti-desertion measures and instructions to field police to keep a suspicious eye on everyone – a course which would have serious effects on morale. Also, the promotion of officers on account of reliability rather than efficiency (p. 53).”

The Third-Person Effect in Communication

It wasn’t about actually getting Germans to defect. It was about getting the German military to take action – unnecessary, painful action – to prevent defections from taking place.

The lesson here, as is often the case when it comes to propaganda, is to exercise patience, discretion, humility, and trust.

Patience to not react just because something happens in the information environment.

Discretion to be selective about what levers we choose to pull if and when we do react.

Humility to acknowledge that we are all vulnerable.

Trust that others can do the above as well.

No matter how smart we think we are, or how immune we may be to the effects of slick marketing, social media algorithms, or plain old-fashioned propaganda, we are all made up of the same stuff as the person next to us.

We’re all vulnerable. Understanding that is the beginning of beating it.


Administrative Warfare: Deception + third person effect

iran f-14 winnie the pooh posing

The use of deception and the third-person effect to exploit an administrative process for military advantage.

He knew that they were paranoid.

He knew that the Iranians guarded their oil facilities with their F-14s, and his Air Force [the Iraqis’] was terrified of dog-fighting the F-14s because at the time the F-14 was pretty much unmatched as a fighter aircraft.

So he figured the best way to get our aircraft to strike the oil refinery is to get the F-14s out of the air and the only way to get them out of the air is to ground them.

We don’t have the means to strike their airfield, so he called one of the Gulf leaders, I’m not sure if it was the Saudi king or somebody else, and he essentially told them, “Hey, we have received intelligence that an Iranian F-14 wants to defect in a couple of nights and they are going to come to your country, so just keep an eye out – there’s an F-14 coming.”

[Saddam] knowing full-well that that Gulf leader was going to leak that information to the Iranians – they did.

The Iranians heard ‘one of your F-14s is going to defect.’

They panicked and put all of the F-14 pilots in jail, and while all the F-14 pilots were in jail being investigated for a possible treason plot, Saddam struck the oil refinery.

Aram Shabanian, How the Iran-Iraq War Shaped the Modern World, Angry Planet

Photo source.


Ok, but what should the Army do to combat this?


Good episode of The Convergence Podcast last month. Guests were Joe Littell and Maggie Smith, who recently co-authored a good article on information warfare for the Modern War Institute.

In the podcast, they discuss the article and its implications for the military.

What I like about both the article and the podcast is that we are hearing directly from practitioners – in this case in the fields of psychological operations and cyber.

Often – and especially as of late – we are hearing everyone’s opinion on these fields, whether they hold expertise or not.

One thing that I think gets to the crux of many of the military’s issues in dealing with information warfare came in the form of a question. After a long back and forth on some of the background concerning information warfare on a grand scale – political polarization, distrust in media, misinformation/disinformation, etc. – the host poses the following question:

“How does the Army combat this?”

It’s not a bad question – and it literally references the problem addressed in the guests’ article. The issue here is that the solution to the problem goes way beyond the scope of what the Army can do. Even those tiny parts of the Army that deal exclusively with these issues.

What is the role of the Army? To win our nation’s wars.

We do ourselves a disservice if we ask it to do more than that.

There are limits to what the military can achieve in a traditional sense. Look at Afghanistan.

But there are also limits to what the military can achieve in an irregular sense. It doesn’t matter what combination of tactics, techniques, or tools you can pull together. There are extreme limits to what can be accomplished when dealing with the complexities of the human condition.

Thinking that it’s possible to fix everything, that we just haven’t discovered the right tool or educated the right people in the right way is dangerous.

This isn’t a cause for cynicism. Rather, it’s a cause for critical thinking and clearly understanding the role of the military and executing accordingly.

And pushing back when asked to do the impossible.

Lastly, there was a good conversation towards the end on the need to move away from the terms misinformation and disinformation. I agree. They are used everywhere now, mostly interchangeably or without a clear meaning.

Unfortunately, I don’t think they’re going anywhere. For what it’s worth, this is how I think of them.

For those who hang in there until the end, you’ll learn a couple of interesting facts about Joe and Maggie.

“Hangin’ with railbait like you is gonna lower my rep.”


Administrative Warfare: Fake Bomb Threats


Read this yesterday afternoon about the ongoing “hybrid” war taking place in Ukraine.

Another new tactic, according to Ukrainian authorities, is bomb threats.

Ukrainian police said there were nearly 1,000 anonymous messages in January, mostly by email, falsely claiming bomb threats against nearly 10,000 locations, from schools to critical infrastructure.

Kateryna Morozova’s 7-year-old daughter called her last month asking to be collected from school as teachers had told her to leave quickly. A teacher soon said on a messenger group that there had been a bomb threat against the school. Children who had been swimming had to grab what clothes they could and rush outside into the cold and snow, she said.

Russians Have Already Started Hybrid War With Bomb Threats, Cyberattacks, Ukraine Says, Wall Street Journal

Many places have automatic procedures that take place when a bomb threat is received. This is easily exploitable by someone willing to take advantage of it.

This is a form of administrative warfare. That is, tactics that take advantage of administrative policies and procedures that can wreak havoc at minimal cost.

There are lots of possibilities for this kind of warfare.

The only limitations are willingness and imagination.


The Truth Sandwich


Last week I wrote about the illusory truth effect – the psychological phenomenon wherein a lie that is repeated – even in refutation – is more likely to be remembered than the truth.

It turns out that there is a counter to this – the “truth sandwich.”

How to use it?

  1. Start with the truth. This is the frame.
  2. Introduce the lie – clearly stating that it is a lie.
  3. End with the truth.

It doesn’t always work. Especially if the recipient is no longer engaging in critical thought.

But for those who might be swayed, those who are still among the few willing to be wrong from time to time, it may nudge them towards the truth.

In the race to correct false information, the lie often gets too much air. You have to frame it in the right way.

And even then, most of the time the lie is not even worth refuting. Patience and trust will win the day.

Leaders – especially military leaders – need to suppress the urge to “do something” all the time.

“How are we countering this!?” screams the agitated military leader.

“We’re not, sir. It’s nonsense. And it will pass.”
