The February 29, 2012 front page of USA Today screamed at the apparent wastefulness and ineffectiveness of US military 'info ops' efforts in Afghanistan and Iraq. (Source: http://www.usatoday.com/news/military/story/2012-02-29/afghanistan-iraq-military-information-operations-usa-today-investigation/53295472/1). The story also filled nearly all of the second page, except for a piece entitled "Modern US wars influence psychiatric thought" dealing with the mental state of returning veterans. I'm sure that juxtaposition was unintentional (he said with tongue in cheek).
The article assembles a cornucopia of facts and statements to paint a picture of waste and ineptitude. Its wrath appears to be directed against the futility of DOD-backed 'giant marketing campaigns' and the contractors that run them.
There are some areas where the article clearly got it wrong.
· The authors push readers toward concluding that, because one of the defense contractors on the program (Los Angeles-based Leonie Industries) does not appear to have suitable experience and its principals have financial problems, the entire effort is tainted. The facts about Leonie may be right on, but the conclusion does not follow.
· Another area where I think the article got it wrong is the choice of military experts. For example, one of its sources is COL (R) Paul Yingling, who was apparently an artillery officer with scant IO involvement (see his bio at http://www.marshallcenter.org/mcpublicweb/en/component/content/article/747-art-bio-yingling-paul.html?directory=30).
· The article featured neither knowledgeable sources from the Military Information Support Operations (MISO) chain of command nor anyone retired from the MISO or PSYOP organizations.
· The article intimates that the information engagement efforts are ineffectual because they failed to stem the recent outrage caused by the burning of Qurans in Afghanistan. To assume that any short-term information effort could accomplish this goal is naive and ludicrous.
· The article also treats the fact that the US did not broadcast the source of many of the products used in the campaign as, in and of itself, something devious. This claim in particular seems to me to indicate a bias on the authors' part against the military and a lack of understanding of the principles of information engagement.
However, the article raises some key points that should be addressed. For example, it correctly points out that there is a real dearth of reliable measures of effectiveness (MOE). MOE can be especially difficult to gather among fragmented and often illiterate populations.
The article should have highlighted that one reason the efforts may be wasteful and disjointed is that the US lacks a cohesive national information engagement strategy: one that not only synergizes the instrumentalities of government and national power but can also be adapted by Ambassadors and Combatant Commanders to their own areas of operations.
The article also fails to point out the need for consistency across multi-year information engagement efforts, which are often less than optimally executed because of constant turnover among the personnel responsible for them.
Having said all of this, one positive side effect of the article may be to rouse Congress to investigate the nature of what the paper calls 'info ops' and to discover the challenges currently facing the PSYOP/MISO community on the military side.
Some Congressional attention may be all that is needed to push the long-simmering proponent debate to a swift and decisive conclusion, or it may turn the debate into yet another election-year issue without a suitable resolution.
Interested readers are urged to let USA Today know their thoughts about the article.
3 comments:
Is there room for improvement? Of course, though the brother-sister duo is more an indictment of the DOD contracting system than anything else. Unless it's a sole-source contract, who the contract goes to is often out of the hands of the practitioners.
Furthermore, stories like the USA Today article make the amount spent on influence seem shocking by leaving out comparative information that would place the expenditures in context. In the context of total expenditures for OIF and OEF, this amount was nothing. One estimate pegs air conditioning alone at $20 billion a year, and the total authorizations for OIF and OEF from 9/11 to March 2011 were $1.25 trillion.
This is what happens when behavior manipulation is replaced with information engagement. Information does not yield an attitudinal or behavioral shift.
That being said, when a Quran is altered, as the burned ones were, it is no longer considered the unaltered word of God; therefore it is not technically a Quran anymore.
Despite this, any PSYOP soldier worth his or her salt, operating in an Islamic country, should have known that burial is the proper method of disposal.
Our job is not to simply put out fires, but to prevent boneheaded moves like this in the first place.
But, given the lack of training and the lack of serious effort behind building a successful and effective regiment, it doesn't surprise me that this happened.
I agree with everything you bring up in your article, and with what the other commentators had to say.
Not only is there no systematic attempt at monitoring and evaluating progress, but such a program runs counter to the interests of contractors. Given continuous unit and personnel turnover, and the way OERs incentivize easily measured, short-term, self-chosen, and compartmentalized accomplishments, there's really no incentive to get at any reality that could hurt the careers of those involved. It's a sad reality, but anyone who has operated at higher command levels has seen how political, careerist motives usually outweigh the welfare of troops or the success of a mission.
Even when unit managers are ethically motivated professionals, which is often the case, they rarely have any training or experience in how to evaluate competing reports. Unless they have advanced degrees in a social science, they aren't going to question the methods contractors use, or even ask how things were done. I was trained to keep all briefs simple, fast, and to the point. They're busy people, under a lot of stress, and are used to making fast and decisive decisions with incomplete information. This reality has created a culture where contractors and GS researchers have learned that BS is rarely questioned, and if anyone does dig into how a study was done, the people who did it or the commanding unit will already be gone and off to other things. There's no incentive to do things differently. Mistakes and fraud are only looked for if something newsworthy goes wrong, and by that time the OERs and the GS and contractor evals are already written. Everyone moves on to a new contract, mission, or job, and these new jobs aren't evaluated with past performance included. So it would require an investigation showing evidence that willful malfeasance or incompetence was involved. That's hard to prove in the best of circumstances, but when you add the fact that proving it would hurt the people who gave contractors positive evals in the past, the ones who awarded the guilty parties their current contracts, and especially the people whose jobs are tied to the program the contractors are working under, it becomes nearly impossible.
Another big problem is the explosion of isolated experts studying these structural problems and offering solutions. Anything they report is subject to the same competing interests vying for power within commands (within a command and between the same staff elements at different levels), between military and civilian elements, and between coalition elements.
There's a line of thought in medicine and the sciences that one shouldn't test or measure something if one doesn't know what to do with the result. With elements operating downrange, the opposite approach is usually taken. The assumption is that measuring or testing something is a good in itself and will somehow have a positive effect after the information is gathered. There's no thought given to how this would actually happen, or to aligning and incentivizing competing elements to take predetermined courses of action depending on what a study concludes. Plus, these studies need independent peer review to make sure researchers aren't just telling their customers what they want to hear, which is something that happens more often than not.