
source for my site, found at icyphox.sh

pages/txt/disinfo.txt

---
date: '2019-09-10'
subtitle: 'Misinformation, but deliberate'
template: text.html
title: Disinformation demystified
url: disinfo
---

As with the disambiguation of any word, let's start with its etymology
and definition. According to
[Wikipedia](https://en.wikipedia.org/wiki/Disinformation),
*disinformation* is borrowed from the Russian word ---
*dezinformatsiya* (дезинформа́ция), derived from the title of a KGB black
propaganda department.

> Disinformation is false information spread deliberately to deceive.

To fully understand disinformation, especially in the modern age, we
need to understand the key factors of any successful disinformation
operation:

-   creating disinformation (what)
-   the motivation behind the op, or its end goal (why)
-   the medium used to disperse the falsified information (how)
-   the actor (who)

At the end, we'll also look at how you can use disinformation techniques
to maintain OPSEC.

To break the monotony, I will also use the term "information
operation", or its shortened forms "info op" & "disinfo".

Creating disinformation
-----------------------

Crafting or creating disinformation is by no means a trivial task.
Often, the quality of a disinformation sample is a strong indicator of
the sophistication of the actor involved: is it a 12-year-old troll or
a nation state?

Well-crafted disinformation always has one primary characteristic:
"plausibility". The disinfo must sound reasonable. It must induce the
notion that it's *likely* true. To achieve this, the target --- be it an
individual, a specific demographic or an entire nation --- must be well
researched. This demands a deep understanding of the target's culture,
history, geography and psychology, along with circumstantial and
situational awareness of the target.

There are many forms of disinformation. A few common ones are staged
videos/photographs, recontextualized videos/photographs, blog posts,
news articles &, most recently, deepfakes.

Here's a tweet from [the grugq](https://twitter.com/thegrugq), showing a
case of recontextualized imagery:

> Disinformation. The content of the photo is not fake. The reality of
> what it captured is fake. The context it's placed in is fake. The
> picture itself is 100% authentic. Everything, except the photo itself,
> is fake. Recontextualisation as threat vector.
> [pic.twitter.com/Pko3f0xkXC](https://t.co/Pko3f0xkXC)
>
> --- thaddeus e. grugq (@thegrugq), [June 23,
> 2019](https://twitter.com/thegrugq/status/1142759819020890113)

Motivations behind an information operation
-------------------------------------------

I like to broadly categorize any info op as either proactive or
reactive. Proactively, disinformation is spread with the desire to
influence the target either before or during the occurrence of an
event. This is especially observed during elections.[^1] In offensive
information operations, the target's psychological state can be
affected by spreading **fear, uncertainty & doubt**, or FUD for short.

Reactive disinformation is when the actor, usually a nation state in
this case, screws up and wants to cover their tracks. A fitting example
is the case of Malaysia Airlines Flight 17 (MH17), which was shot down
while flying over eastern Ukraine. This tragic incident has been
attributed to Russian-backed separatists.[^2] In response, Russian
media is known to have disseminated a number of alternative, even
conspiratorial, theories.[^3] The number grew as the JIT's (Dutch-led
Joint Investigation Team) investigation pointed towards the
separatists. The idea was to **muddle the information** space with
these theories so that, as a result, potentially correct information
takes a credibility hit.

Another motive for an info op is to **control the narrative**. This is
often seen in totalitarian regimes, where the government decides what
the media portrays to the masses. The ongoing Hong Kong protests are a
good example.[^4] According to
[NPR](https://www.npr.org/2019/08/14/751039100/china-state-media-present-distorted-version-of-hong-kong-protests):

> Official state media pin the blame for protests on the "black hand" of
> foreign interference, namely from the United States, and what they
> have called criminal Hong Kong thugs. A popular conspiracy theory
> posits the CIA incited and funded the Hong Kong protesters, who are
> demanding an end to an extradition bill with China and the ability to
> elect their own leader. Fueling this theory, China Daily, a state
> newspaper geared toward a younger, more cosmopolitan audience, this
> week linked to a video purportedly showing Hong Kong protesters using
> American-made grenade launchers to combat police. ...

Media used to disperse disinfo
------------------------------

As seen in the above example of totalitarian governments, national TV
and newspaper agencies play a key role in influence ops en masse. They
guarantee reach, owing to the channel's or paper's popularity.

Twitter is another obvious example. Due to the ease of creating
accounts and the ability to generate activity programmatically via the
API, Twitter bots are the go-to choice today for info ops. Essentially,
an actor attempts to create "discussions" amongst "users" (read: bots)
to push their narrative(s). Twitter also provides analytics for every
tweet, enabling actors to get real-time insights into what sticks and
what doesn't. The use of Twitter was seen during the previously
discussed MH17 case, where Russia employed its troll factory --- the
[Internet Research
Agency](https://en.wikipedia.org/wiki/Internet_Research_Agency) (IRA)
--- to create discussions about alternative theories.

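The mechanics of such manufactured "discussions" are easy to sketch. The
toy snippet below (no real API calls; every handle and phrase is
invented for illustration) shows how an operator might template varied
replies across a pool of sock-puppet accounts, so the thread looks like
several people independently converging on the same narrative:

```python
import itertools
import random

# Hypothetical bot handles and canned narrative fragments -- purely
# illustrative; a real op would vary these far more aggressively.
BOT_POOL = ["@user_3841", "@user_9914", "@user_2207"]
OPENERS = ["Honestly,", "Not sure why nobody mentions that", "Funny how"]
CLAIMS = [
    "the official story keeps changing",
    "the first reports said something completely different",
]
CLOSERS = ["Makes you think.", "Just asking questions.", ""]

def fake_thread(seed, n=3):
    """Generate n (handle, tweet) pairs that mimic an organic thread."""
    rng = random.Random(seed)  # seeded so the demo is reproducible
    # Rotate through the bot pool so no handle posts twice in a row.
    handles = itertools.cycle(rng.sample(BOT_POOL, len(BOT_POOL)))
    thread = []
    for _ in range(n):
        parts = [rng.choice(OPENERS), rng.choice(CLAIMS), rng.choice(CLOSERS)]
        thread.append((next(handles), " ".join(p for p in parts if p)))
    return thread

for handle, tweet in fake_thread(seed=42):
    print(f"{handle}: {tweet}")
```

Crude as it is, this is the core trick: cheap combinatorial variation
over a small set of talking points, amplified by many accounts.
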
In India, disinformation is often spread via YouTube, WhatsApp and
Facebook. Political parties actively invest in creating group chats to
spread political messages and memes. These parties have volunteers
whose sole job is to sit and forward messages. Apart from political
propaganda, WhatsApp also finds itself a medium for fake news. In most
cases, this is disinformation without a motive, or the motive is hard
to determine simply because the source is impossible to trace, lost in
forwards.[^5] This is a difficult problem to combat, especially given
the nature of the target audience.

The actors behind disinfo campaigns
-----------------------------------

I doubt this requires further elaboration, but in short:

-   nation states and their intelligence agencies
-   governments, political parties
-   other non/quasi-governmental groups
-   trolls

This essentially sums up the what, why, how and who of disinformation.

Personal OPSEC
--------------

This is a fun one. Now, it's common knowledge that **STFU is the best
policy**. But sometimes this might not be possible because, after all,
inactivity leads to suspicion, and suspicion leads to scrutiny --- which
might lead to your OPSEC being compromised. So if you really have to,
you can feign activity using disinformation. For example, pick a place,
and throw subtle details pertaining to the weather, local events or
regional politics of that place into your disinfo. Assuming this is
Twitter, you can tweet stuff like:

-   "Ugh, when will this hot streak end?!"
-   "Traffic wonky because of the Mardi Gras parade."
-   "Woah, XYZ place is nice! Especially the fountains by ABC street."

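A minimal sketch of the technique: keep a researched "cover profile" of
the chosen place and sample details from it. Everything below (the
profile contents, the function name) is hypothetical; in practice the
details would come from actually following that place's local news and
weather:

```python
import random

# Hypothetical cover profile for an invented locale -- the point is the
# structure (topic -> plausible local details), not these examples.
COVER_PROFILE = {
    "weather": ["Ugh, when will this hot streak end?!",
                "Third day of rain. Lovely."],
    "events": ["Traffic wonky because of the Mardi Gras parade.",
               "The farmers' market was packed this morning."],
    "politics": ["Another council meeting, another parking-fee debate."],
}

def cover_tweet(profile, seed=None):
    """Pick one plausible, locale-flavored detail to post."""
    rng = random.Random(seed)
    topic = rng.choice(sorted(profile))  # sorted() makes the demo repeatable
    return rng.choice(profile[topic])

print(cover_tweet(COVER_PROFILE, seed=1))
```

The discipline is in the profile, not the code: stale or wrong local
details are exactly the kind of inconsistency that burns a cover.
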
Of course, if you're a nobody on Twitter (like me), this is a non-issue
for you.

And please, don't do this:

![mcafee opsecfail](/static/img/mcafeetweet.png)

Conclusion
----------

The ability to influence someone's decisions or thought process with
just one tweet is scary. There is no simple way to combat
disinformation. Social media is hard to control. Just like anything
else in cyber, this too is an endless battle between social media corps
and motivated actors.

A huge shoutout to Bellingcat for their extensive research in this
field, and for helping folks see the truth in a post-truth world.

[^1]: [This](https://www.vice.com/en_us/article/ev3zmk/an-expert-explains-the-many-ways-our-elections-can-be-hacked)
    episode of CYBER talks about election influence ops (features the
    grugq!).

[^2]: The [Bellingcat
    Podcast](https://www.bellingcat.com/category/resources/podcasts/)'s
    season one covers the MH17 investigation in detail.

[^3]: [Wikipedia section on MH17 conspiracy
    theories](https://en.wikipedia.org/wiki/Malaysia_Airlines_Flight_17#Conspiracy_theories)

[^4]: [Chinese newspaper spreading
    disinfo](https://twitter.com/gdead/status/1171032265629032450)

[^5]: Use an adblocker before clicking
    [this](https://www.news18.com/news/tech/fake-whatsapp-message-of-child-kidnaps-causing-mob-violence-in-madhya-pradesh-2252015.html).