Disinformation post Signed-off-by: Anirudh Oppiliappan <x@icyphox.sh>
jump to
@@ -21,7 +21,6 @@ <html>
<title> About </title> -<script src="//instant.page/1.1.0" type="module" integrity="sha384-EwBObn5QAxP8f09iemwAJljc+sU+eUXeL9vSBw1eNmVarwhKk2F9vBEpaN9rsrtp"></script> <div class="container-text"> <header class="header">
@@ -0,0 +1,234 @@
+<!DOCTYPE html> +<html lang=en> +<link rel="stylesheet" href="/static/style.css" type="text/css"> +<link rel="stylesheet" href="/static/syntax.css" type="text/css"> +<link rel="shortcut icon" type="image/x-icon" href="/static/favicon.ico"> +<meta name="description" content="Misinformation, but deliberate"> +<meta name="viewport" content="initial-scale=1"> +<meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1"> +<meta content="#021012" name="theme-color"> +<meta name="HandheldFriendly" content="true"> +<meta name="twitter:card" content="summary_large_image"> +<meta name="twitter:site" content="@icyphox"> +<meta name="twitter:title" content="Disinformation demystified"> +<meta name="twitter:description" content="Misinformation, but deliberate"> +<meta name="twitter:image" content="/static/icyphox.png"> +<meta property="og:title" content="Disinformation demystified"> +<meta property="og:type" content="website"> +<meta property="og:description" content="Misinformation, but deliberate"> +<meta property="og:url" content="https://icyphox.sh"> +<meta property="og:image" content="/static/icyphox.png"> +<html> + <title> + Disinformation demystified + </title> +<div class="container-text"> + <header class="header"> + + <a href="/">home</a> + <a href="/blog">blog</a> + <a href="/reading">reading</a> + <a href="https://twitter.com/icyphox">twitter</a> + <a href="/about">about</a> + + </header> +<body> + <div class="content"> + <div align="left"> + <code>2019-09-10</code> + <h1>Disinformation demystified</h1> + <h2>Misinformation, but deliberate</h2> + <p>As with the disambiguation of any word, let’s start with its etymology and definition. 
+According to <a href="https://en.wikipedia.org/wiki/Disinformation">Wikipedia</a>, +<em>disinformation</em> has been borrowed from the Russian word — <em>dezinformatsiya</em> (дезинформа́ция), +derived from the title of a KGB black propaganda department.</p> + +<blockquote> + <p>Disinformation is false information spread deliberately to deceive.</p> +</blockquote> + +<p>To fully understand disinformation, especially in the modern age, we need to understand the +key factors of any successful disinformation operation:</p> + +<ul> +<li>creating disinformation (what)</li> +<li>the motivation behind the op, or its end goal (why)</li> +<li>the medium used to disperse the falsified information (how)</li> +<li>the actor (who)</li> +</ul> + +<p>At the end, we’ll also look at how you can use disinformation techniques to maintain OPSEC.</p> + +<p>In order to break monotony, I will also be using the terms “information operation”, or the shortened +forms – “info op” & “disinfo”.</p> + +<h3 id="creating-disinformation">Creating disinformation</h3> + +<p>Crafting or creating disinformation is by no means a trivial task. Often, the quality +of any disinformation sample is a huge indicator of the level of sophistication of the +actor involved, i.e. is it a 12-year-old troll or a nation state?</p> + +<p>Well-crafted disinformation always has one primary characteristic — “plausibility”. +The disinfo must sound reasonable. It must induce the notion it’s <em>likely</em> true. +To achieve this, the target — be it an individual, a specific demographic or an entire +nation — must be well researched. A deep understanding of the target’s culture, history, +geography and psychology is required. It also needs circumstantial and situational awareness +of the target.</p> + +<p>There are many forms of disinformation. 
A few common ones are staged videos / photographs, +recontextualized videos / photographs, blog posts, news articles & most recently — deepfakes.</p> + +<p>Here’s a tweet from <a href="https://twitter.com/thegrugq">the grugq</a>, showing a case of recontextualized +imagery:</p> + +<blockquote class="twitter-tweet" data-dnt="true" data-theme="dark" data-link-color="#00ffff"> +<p lang="en" dir="ltr">Disinformation. +<br><br> +The content of the photo is not fake. The reality of what it captured is fake. The context it’s placed in is fake. The picture itself is 100% authentic. Everything, except the photo itself, is fake. +<br><br>Recontextualisation as threat vector. +<a href="https://t.co/Pko3f0xkXC">pic.twitter.com/Pko3f0xkXC</a> +</p>— thaddeus e. grugq (@thegrugq) +<a href="https://twitter.com/thegrugq/status/1142759819020890113?ref_src=twsrc%5Etfw">June 23, 2019</a> +</blockquote> + +<script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script> + +<h3 id="motivations-behind-an-information-operation">Motivations behind an information operation</h3> + +<p>I like to broadly categorize any info op as either proactive or reactive. +Proactively, disinformation is spread with the desire to influence the target +either before or during the occurrence of an event. This is especially observed +during elections.<sup class="footnote-ref" id="fnref-1"><a href="#fn-1">1</a></sup> +In offensive information operations, the target’s psychological state can be affected by +spreading <strong>fear, uncertainty & doubt</strong>, or FUD for short.</p> + +<p>Reactive disinformation is when the actor, usually a nation state in this case, +screws up and wants to cover their tracks. A fitting example of this is the case +of Malaysia Airlines Flight 17 (MH17), which was shot down while flying over +eastern Ukraine. 
This tragic incident has been attributed to Russian-backed +separatists.<sup class="footnote-ref" id="fnref-2"><a href="#fn-2">2</a></sup> +Russian media is known to have disseminated a number of alternative & some even +conspiratorial theories<sup class="footnote-ref" id="fnref-3"><a href="#fn-3">3</a></sup> in response. The number grew as the JIT’s (Dutch-led Joint +Investigation Team) investigations pointed towards the separatists. +The idea was to <strong>muddle the information</strong> space with these theories, and as a result, +potentially correct information takes a credibility hit.</p> + +<p>Another motive for an info op is to <strong>control the narrative</strong>. This is often seen in use +in totalitarian regimes, where the government decides what the media portrays to the +masses. The ongoing Hong Kong protests are a good example.<sup class="footnote-ref" id="fnref-4"><a href="#fn-4">4</a></sup> According to <a href="https://www.npr.org/2019/08/14/751039100/china-state-media-present-distorted-version-of-hong-kong-protests">NPR</a>:</p> + +<blockquote> + <p>Official state media pin the blame for protests on the “black hand” of foreign interference, + namely from the United States, and what they have called criminal Hong Kong thugs. + A popular conspiracy theory posits the CIA incited and funded the Hong Kong protesters, + who are demanding an end to an extradition bill with China and the ability to elect their own leader. + Fueling this theory, China Daily, a state newspaper geared toward a younger, more cosmopolitan audience, + this week linked to a video purportedly showing Hong Kong protesters using American-made grenade launchers to combat police. + …</p> +</blockquote> + +<h3 id="media-used-to-disperse-disinfo">Media used to disperse disinfo</h3> + +<p>As seen in the above example of totalitarian governments, national TV and newspaper agencies +play a key role in influence ops en masse. 
They guarantee outreach due to the channel/paper’s +popularity.</p> + +<p>Twitter is another obvious example. Due to the ease of creating accounts and the ability to +generate activity programmatically via the API, Twitter bots are the go-to choice today for +info ops. Essentially, an actor attempts to create “discussions” amongst “users” (read: bots), +to push their narrative(s). Twitter also provides analytics for every tweet, enabling actors to +get real-time insights into what sticks and what doesn’t. +The use of Twitter was seen during the previously discussed MH17 case, where Russia employed its troll +factory — the <a href="https://en.wikipedia.org/wiki/Internet_Research_Agency">Internet Research Agency</a> (IRA) +to create discussions about alternative theories.</p> + +<p>In India, disinformation is often spread via YouTube, WhatsApp and Facebook. Political parties +actively invest in creating group chats to spread political messages and memes. These parties +have volunteers whose sole job is to sit and forward messages. +Apart from political propaganda, WhatsApp finds itself as a medium of fake news. In most cases, +this is disinformation without a motive, or the motive is hard to determine simply because +the source is impossible to trace, lost in forwards.<sup class="footnote-ref" id="fnref-5"><a href="#fn-5">5</a></sup> +This is a difficult problem to combat, especially given the nature of the target audience.</p> + +<h3 id="the-actors-behind-disinfo-campaigns">The actors behind disinfo campaigns</h3> + +<p>I doubt this requires further elaboration, but in short:</p> + +<ul> +<li>nation states and their intelligence agencies</li> +<li>governments, political parties</li> +<li>other non/quasi-governmental groups</li> +<li>trolls</li> +</ul> + +<p>This essentially sums up the what, why, how and who of disinformation. </p> + +<h3 id="personal-opsec">Personal OPSEC</h3> + +<p>This is a fun one. 
Now, it’s common knowledge that +<strong>STFU is the best policy</strong>. But sometimes, this might not be possible, because, +after all, inactivity leads to suspicion, and suspicion leads to scrutiny, which might +lead to your OPSEC being compromised. +So if you really have to, you can feign activity using disinformation. For example, +pick a place, and throw in subtle details pertaining to the weather, local events +or regional politics of that place into your disinfo. Assuming this is Twitter, you can +tweet stuff like:</p> + +<ul> +<li>“Ugh, when will this hot streak end?!”</li> +<li>“Traffic wonky because of the Mardi Gras parade.”</li> +<li>“Woah, XYZ place is nice! Especially the fountains by ABC street.”</li> +</ul> + +<p>Of course, if you’re a nobody on Twitter (like me), this is a non-issue for you.</p> + +<p>And please, don’t do this:</p> + +<p><img src="/static/img/mcafeetweet.png" alt="mcafee opsecfail" /></p> + +<h3 id="conclusion">Conclusion</h3> + +<p>The ability to influence someone’s decisions/thought process in just one tweet is +scary. There is no simple way to combat disinformation. Social media is hard to control. +Just like anything else in cyber, this too is an endless battle between social media corps +and motivated actors.</p> + +<p>A huge shoutout to Bellingcat for their extensive research in this field, and for helping +folks see the truth in a post-truth world.</p> + +<div class="footnotes"> +<hr /> +<ol> +<li id="fn-1"> +<p><a href="https://www.vice.com/en_us/article/ev3zmk/an-expert-explains-the-many-ways-our-elections-can-be-hacked">This</a> episode of CYBER talks about election influence ops (features the grugq!). <a href="#fnref-1" class="footnoteBackLink" title="Jump back to footnote 1 in the text.">↩</a></p> +</li> + +<li id="fn-2"> +<p>The <a href="https://www.bellingcat.com/category/resources/podcasts/">Bellingcat Podcast</a>’s season one covers the MH17 investigation in detail. 
<a href="#fnref-2" class="footnoteBackLink" title="Jump back to footnote 2 in the text.">↩</a></p> +</li> + +<li id="fn-3"> +<p><a href="https://en.wikipedia.org/wiki/Malaysia_Airlines_Flight_17#Conspiracy_theories">Wikipedia section on MH17 conspiracy theories</a> <a href="#fnref-3" class="footnoteBackLink" title="Jump back to footnote 3 in the text.">↩</a></p> +</li> + +<li id="fn-4"> +<p><a href="https://twitter.com/gdead/status/1171032265629032450">Chinese newspaper spreading disinfo</a> <a href="#fnref-4" class="footnoteBackLink" title="Jump back to footnote 4 in the text.">↩</a></p> +</li> + +<li id="fn-5"> +<p>Use an adblocker before clicking <a href="https://www.news18.com/news/tech/fake-whatsapp-message-of-child-kidnaps-causing-mob-violence-in-madhya-pradesh-2252015.html">this</a>. <a href="#fnref-5" class="footnoteBackLink" title="Jump back to footnote 5 in the text.">↩</a></p> +</li> +</ol> +</div> + + </div> + <hr /> + <p class="muted">Questions or comments? Open an issue at <a href="https://github.com/icyphox/site">this repo</a>, or send a plain-text email to <a href="mailto:x@icyphox.sh">x@icyphox.sh</a>.</p> + <footer> + <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/"> + <img src="https://licensebuttons.net/l/by-nc-sa/4.0/80x15.png"> + </a> + </footer> + </body> + </div> + </html>
@@ -22,7 +22,6 @@ <html>
<title> Picking the FB50 smart lock (CVE-2019-13143) </title> -<script src="//instant.page/1.1.0" type="module" integrity="sha384-EwBObn5QAxP8f09iemwAJljc+sU+eUXeL9vSBw1eNmVarwhKk2F9vBEpaN9rsrtp"></script> <div class="container-text"> <header class="header">@@ -36,9 +35,9 @@ </header>
<body> <div class="content"> <div align="left"> - <p> 2019-08-05 </p> - <h1> Picking the FB50 smart lock (CVE-2019-13143) </h1> - <h2> … and lessons learnt in IoT security </h2> + <code>2019-08-05</code> + <h1>Picking the FB50 smart lock (CVE-2019-13143)</h1> + <h2>… and lessons learnt in IoT security</h2> <p>(<em>originally posted at <a href="http://blog.securelayer7.net/fb50-smart-lock-vulnerability-disclosure">SecureLayer7’s Blog</a>, with my edits</em>)</p> <h3 id="the-lock">The lock</h3>
@@ -11,7 +11,189 @@ <link>https://icyphox.sh/blog/</link>
</image> <language>en-us</language> <copyright>Creative Commons BY-NC-SA 4.0</copyright> - <item><title>Setting up my personal mailserver</title><description><![CDATA[<p>A mailserver was a long time coming. I’d made an attempt at setting one up + <item><title>Disinformation demystified</title><description><![CDATA[<p>As with the disambiguation of any word, let’s start with its etymology and definition. +According to <a href="https://en.wikipedia.org/wiki/Disinformation">Wikipedia</a>, +<em>disinformation</em> has been borrowed from the Russian word — <em>dezinformatsiya</em> (дезинформа́ция), +derived from the title of a KGB black propaganda department.</p> + +<blockquote> + <p>Disinformation is false information spread deliberately to deceive.</p> +</blockquote> + +<p>To fully understand disinformation, especially in the modern age, we need to understand the +key factors of any successful disinformation operation:</p> + +<ul> +<li>creating disinformation (what)</li> +<li>the motivation behind the op, or its end goal (why)</li> +<li>the medium used to disperse the falsified information (how)</li> +<li>the actor (who)</li> +</ul> + +<p>At the end, we’ll also look at how you can use disinformation techniques to maintain OPSEC.</p> + +<p>In order to break monotony, I will also be using the terms “information operation”, or the shortened +forms – “info op” & “disinfo”.</p> + +<h3 id="creating-disinformation">Creating disinformation</h3> + +<p>Crafting or creating disinformation is by no means a trivial task. Often, the quality +of any disinformation sample is a huge indicator of the level of sophistication of the +actor involved, i.e. is it a 12-year-old troll or a nation state?</p> + +<p>Well-crafted disinformation always has one primary characteristic — “plausibility”. +The disinfo must sound reasonable. It must induce the notion it’s <em>likely</em> true. 
+To achieve this, the target — be it an individual, a specific demographic or an entire +nation — must be well researched. A deep understanding of the target’s culture, history, +geography and psychology is required. It also needs circumstantial and situational awareness +of the target.</p> + +<p>There are many forms of disinformation. A few common ones are staged videos / photographs, +recontextualized videos / photographs, blog posts, news articles & most recently — deepfakes.</p> + +<p>Here’s a tweet from <a href="https://twitter.com/thegrugq">the grugq</a>, showing a case of recontextualized +imagery:</p> + +<blockquote class="twitter-tweet" data-dnt="true" data-theme="dark" data-link-color="#00ffff"> +<p lang="en" dir="ltr">Disinformation. +<br><br> +The content of the photo is not fake. The reality of what it captured is fake. The context it’s placed in is fake. The picture itself is 100% authentic. Everything, except the photo itself, is fake. +<br><br>Recontextualisation as threat vector. +<a href="https://t.co/Pko3f0xkXC">pic.twitter.com/Pko3f0xkXC</a> +</p>— thaddeus e. grugq (@thegrugq) +<a href="https://twitter.com/thegrugq/status/1142759819020890113?ref_src=twsrc%5Etfw">June 23, 2019</a> +</blockquote> + +<script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script> + +<h3 id="motivations-behind-an-information-operation">Motivations behind an information operation</h3> + +<p>I like to broadly categorize any info op as either proactive or reactive. +Proactively, disinformation is spread with the desire to influence the target +either before or during the occurrence of an event. 
This is especially observed +during elections.<sup class="footnote-ref" id="fnref-1"><a href="#fn-1">1</a></sup> +In offensive information operations, the target’s psychological state can be affected by +spreading <strong>fear, uncertainty & doubt</strong>, or FUD for short.</p> + +<p>Reactive disinformation is when the actor, usually a nation state in this case, +screws up and wants to cover their tracks. A fitting example of this is the case +of Malaysia Airlines Flight 17 (MH17), which was shot down while flying over +eastern Ukraine. This tragic incident has been attributed to Russian-backed +separatists.<sup class="footnote-ref" id="fnref-2"><a href="#fn-2">2</a></sup> +Russian media is known to have disseminated a number of alternative & some even +conspiratorial theories<sup class="footnote-ref" id="fnref-3"><a href="#fn-3">3</a></sup> in response. The number grew as the JIT’s (Dutch-led Joint +Investigation Team) investigations pointed towards the separatists. +The idea was to <strong>muddle the information</strong> space with these theories, and as a result, +potentially correct information takes a credibility hit.</p> + +<p>Another motive for an info op is to <strong>control the narrative</strong>. This is often seen in use +in totalitarian regimes, where the government decides what the media portrays to the +masses. The ongoing Hong Kong protests are a good example.<sup class="footnote-ref" id="fnref-4"><a href="#fn-4">4</a></sup> According to <a href="https://www.npr.org/2019/08/14/751039100/china-state-media-present-distorted-version-of-hong-kong-protests">NPR</a>:</p> + +<blockquote> + <p>Official state media pin the blame for protests on the “black hand” of foreign interference, + namely from the United States, and what they have called criminal Hong Kong thugs. 
+ A popular conspiracy theory posits the CIA incited and funded the Hong Kong protesters, + who are demanding an end to an extradition bill with China and the ability to elect their own leader. + Fueling this theory, China Daily, a state newspaper geared toward a younger, more cosmopolitan audience, + this week linked to a video purportedly showing Hong Kong protesters using American-made grenade launchers to combat police. + …</p> +</blockquote> + +<h3 id="media-used-to-disperse-disinfo">Media used to disperse disinfo</h3> + +<p>As seen in the above example of totalitarian governments, national TV and newspaper agencies +play a key role in influence ops en masse. They guarantee outreach due to the channel/paper’s +popularity.</p> + +<p>Twitter is another obvious example. Due to the ease of creating accounts and the ability to +generate activity programmatically via the API, Twitter bots are the go-to choice today for +info ops. Essentially, an actor attempts to create “discussions” amongst “users” (read: bots), +to push their narrative(s). Twitter also provides analytics for every tweet, enabling actors to +get real-time insights into what sticks and what doesn’t. +The use of Twitter was seen during the previously discussed MH17 case, where Russia employed its troll +factory — the <a href="https://en.wikipedia.org/wiki/Internet_Research_Agency">Internet Research Agency</a> (IRA) +to create discussions about alternative theories.</p> + +<p>In India, disinformation is often spread via YouTube, WhatsApp and Facebook. Political parties +actively invest in creating group chats to spread political messages and memes. These parties +have volunteers whose sole job is to sit and forward messages. +Apart from political propaganda, WhatsApp finds itself as a medium of fake news. 
In most cases, +this is disinformation without a motive, or the motive is hard to determine simply because +the source is impossible to trace, lost in forwards.<sup class="footnote-ref" id="fnref-5"><a href="#fn-5">5</a></sup> +This is a difficult problem to combat, especially given the nature of the target audience.</p> + +<h3 id="the-actors-behind-disinfo-campaigns">The actors behind disinfo campaigns</h3> + +<p>I doubt this requires further elaboration, but in short:</p> + +<ul> +<li>nation states and their intelligence agencies</li> +<li>governments, political parties</li> +<li>other non/quasi-governmental groups</li> +<li>trolls</li> +</ul> + +<p>This essentially sums up the what, why, how and who of disinformation. </p> + +<h3 id="personal-opsec">Personal OPSEC</h3> + +<p>This is a fun one. Now, it’s common knowledge that +<strong>STFU is the best policy</strong>. But sometimes, this might not be possible, because, +after all, inactivity leads to suspicion, and suspicion leads to scrutiny, which might +lead to your OPSEC being compromised. +So if you really have to, you can feign activity using disinformation. For example, +pick a place, and throw in subtle details pertaining to the weather, local events +or regional politics of that place into your disinfo. Assuming this is Twitter, you can +tweet stuff like:</p> + +<ul> +<li>“Ugh, when will this hot streak end?!”</li> +<li>“Traffic wonky because of the Mardi Gras parade.”</li> +<li>“Woah, XYZ place is nice! Especially the fountains by ABC street.”</li> +</ul> + +<p>Of course, if you’re a nobody on Twitter (like me), this is a non-issue for you.</p> + +<p>And please, don’t do this:</p> + +<p><img src="/static/img/mcafeetweet.png" alt="mcafee opsecfail" /></p> + +<h3 id="conclusion">Conclusion</h3> + +<p>The ability to influence someone’s decisions/thought process in just one tweet is +scary. There is no simple way to combat disinformation. Social media is hard to control. 
+Just like anything else in cyber, this too is an endless battle between social media corps +and motivated actors.</p> + +<p>A huge shoutout to Bellingcat for their extensive research in this field, and for helping +folks see the truth in a post-truth world.</p> + +<div class="footnotes"> +<hr /> +<ol> +<li id="fn-1"> +<p><a href="https://www.vice.com/en_us/article/ev3zmk/an-expert-explains-the-many-ways-our-elections-can-be-hacked">This</a> episode of CYBER talks about election influence ops (features the grugq!). <a href="#fnref-1" class="footnoteBackLink" title="Jump back to footnote 1 in the text.">↩</a></p> +</li> + +<li id="fn-2"> +<p>The <a href="https://www.bellingcat.com/category/resources/podcasts/">Bellingcat Podcast</a>’s season one covers the MH17 investigation in detail. <a href="#fnref-2" class="footnoteBackLink" title="Jump back to footnote 2 in the text.">↩</a></p> +</li> + +<li id="fn-3"> +<p><a href="https://en.wikipedia.org/wiki/Malaysia_Airlines_Flight_17#Conspiracy_theories">Wikipedia section on MH17 conspiracy theories</a> <a href="#fnref-3" class="footnoteBackLink" title="Jump back to footnote 3 in the text.">↩</a></p> +</li> + +<li id="fn-4"> +<p><a href="https://twitter.com/gdead/status/1171032265629032450">Chinese newspaper spreading disinfo</a> <a href="#fnref-4" class="footnoteBackLink" title="Jump back to footnote 4 in the text.">↩</a></p> +</li> + +<li id="fn-5"> +<p>Use an adblocker before clicking <a href="https://www.news18.com/news/tech/fake-whatsapp-message-of-child-kidnaps-causing-mob-violence-in-madhya-pradesh-2252015.html">this</a>. <a href="#fnref-5" class="footnoteBackLink" title="Jump back to footnote 5 in the text.">↩</a></p> +</li> +</ol> +</div> +]]></description><link>https://icyphox.sh/blog/disinfo</link><pubDate>Tue, 10 Sep 2019 00:00:00 +0000</pubDate><guid>https://icyphox.sh/blog/disinfo</guid></item><item><title>Setting up my personal mailserver</title><description>< DMs.
# latest post -2019-08-15 — [Setting up my personal mailserver](/blog/mailserver) +`2019-09-10` — [Disinformation demystified](/blog/disinfo) ([see all](/blog))
@@ -6,13 +6,15 @@ ---
# all posts ([rss](/blog/feed.xml)) -2019-08-15 — [Setting up my personal mailserver](/blog/mailserver) +`2019-09-10` — [Disinformation demystified](/blog/disinfo) -2019-08-06 — [Picking the FB50 smart lock (CVE-2019-13143)](/blog/fb50) +`2019-08-15` — [Setting up my personal mailserver](/blog/mailserver) -2019-06-06 — [Return Oriented Programming on ARM (32-bit)](/blog/rop-on-arm) +`2019-08-06` — [Picking the FB50 smart lock (CVE-2019-13143)](/blog/fb50) -2019-13-05 — [My Setup](/blog/my-setup) +`2019-06-06` — [Return Oriented Programming on ARM (32-bit)](/blog/rop-on-arm) -2019-02-08 — [Python for Reverse Engineering #1: ELF Binaries](/blog/python-for-re-1/) +`2019-05-13` — [My Setup](/blog/my-setup) + +`2019-02-08` — [Python for Reverse Engineering #1: ELF Binaries](/blog/python-for-re-1/)
@@ -0,0 +1,160 @@
+--- +template: text.html +title: Disinformation demystified +subtitle: Misinformation, but deliberate +date: 2019-09-10 +--- + +As with the disambiguation of any word, let's start with its etymology and definition. +According to [Wikipedia](https://en.wikipedia.org/wiki/Disinformation), +_disinformation_ has been borrowed from the Russian word --- _dezinformatsiya_ (дезинформа́ция), +derived from the title of a KGB black propaganda department. + +> Disinformation is false information spread deliberately to deceive. + +To fully understand disinformation, especially in the modern age, we need to understand the +key factors of any successful disinformation operation: + +- creating disinformation (what) +- the motivation behind the op, or its end goal (why) +- the medium used to disperse the falsified information (how) +- the actor (who) + +At the end, we'll also look at how you can use disinformation techniques to maintain OPSEC. + +In order to break monotony, I will also be using the terms "information operation", or the shortened +forms -- "info op" & "disinfo". + +### Creating disinformation + +Crafting or creating disinformation is by no means a trivial task. Often, the quality +of any disinformation sample is a huge indicator of the level of sophistication of the +actor involved, i.e. is it a 12-year-old troll or a nation state? + +Well-crafted disinformation always has one primary characteristic --- "plausibility". +The disinfo must sound reasonable. It must induce the notion it's _likely_ true. +To achieve this, the target --- be it an individual, a specific demographic or an entire +nation --- must be well researched. A deep understanding of the target's culture, history, +geography and psychology is required. It also needs circumstantial and situational awareness +of the target. + +There are many forms of disinformation. 
A few common ones are staged videos / photographs, +recontextualized videos / photographs, blog posts, news articles & most recently --- deepfakes. + +Here's a tweet from [the grugq](https://twitter.com/thegrugq), showing a case of recontextualized +imagery: + +<blockquote class="twitter-tweet" data-dnt="true" data-theme="dark" data-link-color="#00ffff"> +<p lang="en" dir="ltr">Disinformation. +<br><br> +The content of the photo is not fake. The reality of what it captured is fake. The context it’s placed in is fake. The picture itself is 100% authentic. Everything, except the photo itself, is fake. +<br><br>Recontextualisation as threat vector. +<a href="https://t.co/Pko3f0xkXC">pic.twitter.com/Pko3f0xkXC</a> +</p>— thaddeus e. grugq (@thegrugq) +<a href="https://twitter.com/thegrugq/status/1142759819020890113?ref_src=twsrc%5Etfw">June 23, 2019</a> +</blockquote> +<script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script> + +### Motivations behind an information operation + +I like to broadly categorize any info op as either proactive or reactive. +Proactively, disinformation is spread with the desire to influence the target +either before or during the occurrence of an event. This is especially observed +during elections.[^1] +In offensive information operations, the target's psychological state can be affected by +spreading **fear, uncertainty & doubt**, or FUD for short. + +Reactive disinformation is when the actor, usually a nation state in this case, +screws up and wants to cover their tracks. A fitting example of this is the case +of Malaysia Airlines Flight 17 (MH17), which was shot down while flying over +eastern Ukraine. This tragic incident has been attributed to Russian-backed +separatists.[^2] +Russian media is known to have disseminated a number of alternative & some even +conspiratorial theories[^3] in response. 
The number grew as the JIT's (Dutch-led Joint +Investigation Team) investigations pointed towards the separatists. +The idea was to **muddle the information** space with these theories, and as a result, +potentially correct information takes a credibility hit. + +Another motive for an info op is to **control the narrative**. This is often seen in use +in totalitarian regimes, where the government decides what the media portrays to the +masses. The ongoing Hong Kong protests are a good example.[^4] According to [NPR](https://www.npr.org/2019/08/14/751039100/china-state-media-present-distorted-version-of-hong-kong-protests): + +> Official state media pin the blame for protests on the "black hand" of foreign interference, +> namely from the United States, and what they have called criminal Hong Kong thugs. +> A popular conspiracy theory posits the CIA incited and funded the Hong Kong protesters, +> who are demanding an end to an extradition bill with China and the ability to elect their own leader. +> Fueling this theory, China Daily, a state newspaper geared toward a younger, more cosmopolitan audience, +> this week linked to a video purportedly showing Hong Kong protesters using American-made grenade launchers to combat police. +> ... + + +### Media used to disperse disinfo + +As seen in the above example of totalitarian governments, national TV and newspaper agencies +play a key role in influence ops en masse. They guarantee outreach due to the channel/paper's +popularity. + +Twitter is another obvious example. Due to the ease of creating accounts and the ability to +generate activity programmatically via the API, Twitter bots are the go-to choice today for +info ops. Essentially, an actor attempts to create "discussions" amongst "users" (read: bots), +to push their narrative(s). Twitter also provides analytics for every tweet, enabling actors to +get real-time insights into what sticks and what doesn't. 
+The use of Twitter was seen during the previously discussed MH17 case, where Russia employed its troll +factory --- the [Internet Research Agency](https://en.wikipedia.org/wiki/Internet_Research_Agency) (IRA) +to create discussions about alternative theories. + +In India, disinformation is often spread via YouTube, WhatsApp and Facebook. Political parties +actively invest in creating group chats to spread political messages and memes. These parties +have volunteers whose sole job is to sit and forward messages. +Apart from political propaganda, WhatsApp finds itself as a medium of fake news. In most cases, +this is disinformation without a motive, or the motive is hard to determine simply because +the source is impossible to trace, lost in forwards.[^5] +This is a difficult problem to combat, especially given the nature of the target audience. + +### The actors behind disinfo campaigns + +I doubt this requires further elaboration, but in short: + +- nation states and their intelligence agencies +- governments, political parties +- other non/quasi-governmental groups +- trolls + +This essentially sums up the what, why, how and who of disinformation. + +### Personal OPSEC + +This is a fun one. Now, it's common knowledge that +**STFU is the best policy**. But sometimes, this might not be possible, because, +after all, inactivity leads to suspicion, and suspicion leads to scrutiny, which might +lead to your OPSEC being compromised. +So if you really have to, you can feign activity using disinformation. For example, +pick a place, and throw in subtle details pertaining to the weather, local events +or regional politics of that place into your disinfo. Assuming this is Twitter, you can +tweet stuff like: + +- "Ugh, when will this hot streak end?!" +- "Traffic wonky because of the Mardi Gras parade." +- "Woah, XYZ place is nice! Especially the fountains by ABC street." + +Of course, if you're a nobody on Twitter (like me), this is a non-issue for you. 
+ +And please, don't do this: + +![mcafee opsecfail](/static/img/mcafeetweet.png) + +### Conclusion + +The ability to influence someone's decisions/thought process in just one tweet is +scary. There is no simple way to combat disinformation. Social media is hard to control. +Just like anything else in cyber, this too is an endless battle between social media corps +and motivated actors. + +A huge shoutout to Bellingcat for their extensive research in this field, and for helping +folks see the truth in a post-truth world. + +[^1]: [This](https://www.vice.com/en_us/article/ev3zmk/an-expert-explains-the-many-ways-our-elections-can-be-hacked) episode of CYBER talks about election influence ops (features the grugq!). +[^2]: The [Bellingcat Podcast](https://www.bellingcat.com/category/resources/podcasts/)'s season one covers the MH17 investigation in detail. +[^3]: [Wikipedia section on MH17 conspiracy theories](https://en.wikipedia.org/wiki/Malaysia_Airlines_Flight_17#Conspiracy_theories) +[^4]: [Chinese newspaper spreading disinfo](https://twitter.com/gdead/status/1171032265629032450) +[^5]: Use an adblocker before clicking [this](https://www.news18.com/news/tech/fake-whatsapp-message-of-child-kidnaps-causing-mob-violence-in-madhya-pradesh-2252015.html).
@@ -11,7 +11,189 @@ <link>https://icyphox.sh/blog/</link>
</image> <language>en-us</language> <copyright>Creative Commons BY-NC-SA 4.0</copyright> - <item><title>Setting up my personal mailserver</title><description><![CDATA[<p>A mailserver was a long time coming. I’d made an attempt at setting one up + <item><title>Disinformation demystified</title><description><![CDATA[<p>As with the disambiguation of any word, let’s start with its etymology and definition. +According to <a href="https://en.wikipedia.org/wiki/Disinformation">Wikipedia</a>, +<em>disinformation</em> has been borrowed from the Russian word — <em>dezinformatsiya</em> (дезинформа́ция), +derived from the title of a KGB black propaganda department.</p> + +<blockquote> + <p>Disinformation is false information spread deliberately to deceive.</p> +</blockquote> + +<p>To fully understand disinformation, especially in the modern age, we need to understand the +key factors of any successful disinformation operation:</p> + +<ul> +<li>creating disinformation (what)</li> +<li>the motivation behind the op, or its end goal (why)</li> +<li>the medium used to disperse the falsified information (how)</li> +<li>the actor (who)</li> +</ul> + +<p>At the end, we’ll also look at how you can use disinformation techniques to maintain OPSEC.</p> + +<p>In order to break monotony, I will also be using the terms “information operation”, or the shortened +forms – “info op” & “disinfo”.</p> + +<h3 id="creating-disinformation">Creating disinformation</h3> + +<p>Crafting or creating disinformation is by no means a trivial task. Often, the quality +of any disinformation sample is a huge indicator of the level of sophistication of the +actor involved, i.e. is it a 12-year-old troll or a nation state?</p> + +<p>Well-crafted disinformation always has one primary characteristic — “plausibility”. +The disinfo must sound reasonable. It must induce the notion it’s <em>likely</em> true. 
+To achieve this, the target — be it an individual, a specific demographic or an entire +nation — must be well researched. A deep understanding of the target’s culture, history, +geography and psychology is required. It also requires circumstantial and situational awareness +of the target.</p> + +<p>There are many forms of disinformation. A few common ones are staged videos / photographs, +recontextualized videos / photographs, blog posts, news articles & most recently — deepfakes.</p> + +<p>Here’s a tweet from <a href="https://twitter.com/thegrugq">the grugq</a>, showing a case of recontextualized +imagery:</p> + +<blockquote class="twitter-tweet" data-dnt="true" data-theme="dark" data-link-color="#00ffff"> +<p lang="en" dir="ltr">Disinformation. +<br><br> +The content of the photo is not fake. The reality of what it captured is fake. The context it’s placed in is fake. The picture itself is 100% authentic. Everything, except the photo itself, is fake. +<br><br>Recontextualisation as threat vector. +<a href="https://t.co/Pko3f0xkXC">pic.twitter.com/Pko3f0xkXC</a> +</p>— thaddeus e. grugq (@thegrugq) +<a href="https://twitter.com/thegrugq/status/1142759819020890113?ref_src=twsrc%5Etfw">June 23, 2019</a> +</blockquote> + +<script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script> + +<h3 id="motivations-behind-an-information-operation">Motivations behind an information operation</h3> + +<p>I like to broadly categorize any info op as either proactive or reactive. +Proactively, disinformation is spread with the desire to influence the target +either before or during the occurrence of an event. 
This is especially observed +during elections.<sup class="footnote-ref" id="fnref-1"><a href="#fn-1">1</a></sup> +In offensive information operations, the target’s psychological state can be affected by +spreading <strong>fear, uncertainty & doubt</strong>, or FUD for short.</p> + +<p>Reactive disinformation is when the actor, usually a nation state in this case, +screws up and wants to cover their tracks. A fitting example of this is the case +of Malaysia Airlines Flight 17 (MH17), which was shot down while flying over +eastern Ukraine. This tragic incident has been attributed to Russian-backed +separatists.<sup class="footnote-ref" id="fnref-2"><a href="#fn-2">2</a></sup> +Russian media is known to have disseminated a number of alternative & some even +conspiratorial theories<sup class="footnote-ref" id="fnref-3"><a href="#fn-3">3</a></sup> in response. The number grew as the JIT’s (Dutch-led Joint +Investigation Team) investigations pointed towards the separatists. +The idea was to <strong>muddle the information</strong> space with these theories, and as a result, +potentially correct information takes a credibility hit.</p> + +<p>Another motive for an info op is to <strong>control the narrative</strong>. This is often seen in use +in totalitarian regimes, where the government decides what the media portrays to the +masses. The ongoing Hong Kong protests are a good example.<sup class="footnote-ref" id="fnref-4"><a href="#fn-4">4</a></sup> According to <a href="https://www.npr.org/2019/08/14/751039100/china-state-media-present-distorted-version-of-hong-kong-protests">NPR</a>:</p> + +<blockquote> + <p>Official state media pin the blame for protests on the “black hand” of foreign interference, + namely from the United States, and what they have called criminal Hong Kong thugs. 
+ A popular conspiracy theory posits the CIA incited and funded the Hong Kong protesters, + who are demanding an end to an extradition bill with China and the ability to elect their own leader. + Fueling this theory, China Daily, a state newspaper geared toward a younger, more cosmopolitan audience, + this week linked to a video purportedly showing Hong Kong protesters using American-made grenade launchers to combat police. + …</p> +</blockquote> + +<h3 id="media-used-to-disperse-disinfo">Media used to disperse disinfo</h3> + +<p>As seen in the above example of totalitarian governments, national TV and newspaper agencies +play a key role in influence ops en masse. They guarantee outreach due to the channel/paper’s +popularity.</p> + +<p>Twitter is another, obvious example. Due to the ease of creating accounts and the ability to +generate activity programmatically via the API, Twitter bots are the go-to choice today for +info ops. Essentially, an actor attempts to create “discussions” amongst “users” (read: bots), +to push their narrative(s). Twitter also provides analytics for every tweet, enabling actors to +get real-time insights into what sticks and what doesn’t. +The use of Twitter was seen during the previously discussed MH17 case, where Russia employed its troll +factory — the <a href="https://en.wikipedia.org/wiki/Internet_Research_Agency">Internet Research Agency</a> (IRA) — +to create discussions about alternative theories.</p> + +<p>In India, disinformation is often spread via YouTube, WhatsApp and Facebook. Political parties +actively invest in creating group chats to spread political messages and memes. These parties +have volunteers whose sole job is to sit and forward messages. +Apart from political propaganda, WhatsApp finds itself as a medium for fake news. 
In most cases, +this is disinformation without a motive, or the motive is hard to determine simply because +the source is impossible to trace, lost in forwards.<sup class="footnote-ref" id="fnref-5"><a href="#fn-5">5</a></sup> +This is a difficult problem to combat, especially given the nature of the target audience.</p> + +<h3 id="the-actors-behind-disinfo-campaigns">The actors behind disinfo campaigns</h3> + +<p>I doubt this requires further elaboration, but in short:</p> + +<ul> +<li>nation states and their intelligence agencies</li> +<li>governments, political parties</li> +<li>other non/quasi-governmental groups</li> +<li>trolls</li> +</ul> + +<p>This essentially sums up the what, why, how and who of disinformation.</p> + +<h3 id="personal-opsec">Personal OPSEC</h3> + +<p>This is a fun one. Now, it’s common knowledge that +<strong>STFU is the best policy</strong>. But sometimes, this might not be possible, because, +after all, inactivity leads to suspicion, and suspicion leads to scrutiny, which might +lead to your OPSEC being compromised. +So if you really have to, you can feign activity using disinformation. For example, +pick a place, and throw in subtle details pertaining to the weather, local events +or regional politics of that place into your disinfo. Assuming this is Twitter, you can +tweet stuff like:</p> + +<ul> +<li>“Ugh, when will this hot streak end?!”</li> +<li>“Traffic wonky because of the Mardi Gras parade.”</li> +<li>“Woah, XYZ place is nice! Especially the fountains by ABC street.”</li> +</ul> + +<p>Of course, if you’re a nobody on Twitter (like me), this is a non-issue for you.</p> + +<p>And please, don’t do this:</p> + +<p><img src="/static/img/mcafeetweet.png" alt="mcafee opsecfail" /></p> + +<h3 id="conclusion">Conclusion</h3> + +<p>The ability to influence someone’s decisions/thought process in just one tweet is +scary. There is no simple way to combat disinformation. Social media is hard to control. 
+Just like anything else in cyber, this too is an endless battle between social media corps +and motivated actors.</p> + +<p>A huge shoutout to Bellingcat for their extensive research in this field, and for helping +folks see the truth in a post-truth world.</p> + +<div class="footnotes"> +<hr /> +<ol> +<li id="fn-1"> +<p><a href="https://www.vice.com/en_us/article/ev3zmk/an-expert-explains-the-many-ways-our-elections-can-be-hacked">This</a> episode of CYBER talks about election influence ops (features the grugq!). <a href="#fnref-1" class="footnoteBackLink" title="Jump back to footnote 1 in the text.">↩</a></p> +</li> + +<li id="fn-2"> +<p>The <a href="https://www.bellingcat.com/category/resources/podcasts/">Bellingcat Podcast</a>’s season one covers the MH17 investigation in detail. <a href="#fnref-2" class="footnoteBackLink" title="Jump back to footnote 2 in the text.">↩</a></p> +</li> + +<li id="fn-3"> +<p><a href="https://en.wikipedia.org/wiki/Malaysia_Airlines_Flight_17#Conspiracy_theories">Wikipedia section on MH17 conspiracy theories</a> <a href="#fnref-3" class="footnoteBackLink" title="Jump back to footnote 3 in the text.">↩</a></p> +</li> + +<li id="fn-4"> +<p><a href="https://twitter.com/gdead/status/1171032265629032450">Chinese newspaper spreading disinfo</a> <a href="#fnref-4" class="footnoteBackLink" title="Jump back to footnote 4 in the text.">↩</a></p> +</li> + +<li id="fn-5"> +<p>Use an adblocker before clicking <a href="https://www.news18.com/news/tech/fake-whatsapp-message-of-child-kidnaps-causing-mob-violence-in-madhya-pradesh-2252015.html">this</a>. <a href="#fnref-5" class="footnoteBackLink" title="Jump back to footnote 5 in the text.">↩</a></p> +</li> +</ol> +</div> +]]></description><link>https://icyphox.sh/blog/disinfo</link><pubDate>Tue, 10 Sep 2019 00:00:00 +0000</pubDate><guid>https://icyphox.sh/blog/disinfo</guid></item><item><title>Setting up my personal mailserver</title><description><![CDATA[<p>A mailserver was a long time coming. 
I’d made an attempt at setting one up ~4 years ago, and IIRC, I quit when it came to DNS. And I almost did this time too.<sup class="footnote-ref" id="fnref-1"><a href="#fn-1">1</a></sup></p>
@@ -29,9 +29,9 @@ </header>
<body> <div class="content"> <div align="left"> - <p> {{ date }} </p> - <h1> {{ title }} </h1> - <h2> {{ subtitle }} </h2> + <code>{{ date }}</code> + <h1>{{ title }}</h1> + <h2>{{ subtitle }}</h2> {{ body }} </div> <hr />