2020 Deepfakes
Preprint version of doi:10.1109/TTS.2020.3001312
Abstract—The recent practical advances realized by Artificial Intelligence have also given rise to the phenomenon of deepfakes, which can be considered as a form of fake news. Deepfakes is the phenomenon of creation of realistic digital products, and a plethora of videos have emerged over the last two years in social media. Especially the low technical expertise and equipment required to create deepfakes means that such content can be easily produced by anyone and distributed online. The societal implications are significant and far-reaching. This work investigates the deepfakes via multi-angled perspectives that include media and society, media production, media representations, media audiences, gender, law and regulation, as well as politics. Some key implications of these viewpoints are identified and critically discussed. The results indicate that as a society, we are not ready to deal with the emergence of deepfakes at any level. That we have not witnessed any severe impacts so far is due to their early stage of development, which still shows imperfections. To address the issue, a combination of technology, education, training, and governance is urgently needed.

Index Terms—Deepfakes, Artificial Intelligence, Digital Media, Society.

I. INTRODUCTION

Digital media dominance characterizes our era, where digital information can easily be created, communicated, and read globally. While this has increased access to information, its plethora also means that it has become increasingly challenging for citizens to verify and trust such information. The recent practical advances in Artificial Intelligence (AI) have a profound impact on a variety of domains, including that of digital media overall, and with critical implications. AI is considered a paradigm-changing technology, due to its manifold practical applications that have been demonstrated in the last years, most of which are attributed to a sub-field of AI named "Deep Learning" [1].

Some of the most recent tangible applications demonstrated show interdisciplinary and widespread areas of applicability, e.g., self-driving cars, super-human performance in games, human-level spoken interaction, intelligent robots, creation of cures to diseases, and creation of new works such as texts, paintings, films, etc. When the sophisticated capabilities of Deep Learning are applied, images, text, and voices can be created or altered in a highly realistic way [2]. As such, Deep Learning has also enabled the creation of fake texts, fake voices, fake videos, and fake photos, all of which may at first glance appear strikingly genuine and realistic, while they are not. AI has recently demonstrated the capability of creating realistic fake videos, as it can, e.g., take an existing video and superimpose a person's photo on the face of the main character, or alter the voice of someone to say or do things that do not adhere to reality and never were performed [2]–[4]. Such technological capabilities have the potential to reshape digital media, while the societal implications could also be severe, as they undermine the public confidence in what is seen, heard, and eventually believed to be true.

Several popular deepfakes circulate the Internet and can be found on popular websites such as YouTube. One of the first, and probably the most famous deepfake, is the 2018 video of Barack Obama [5] where, ironically, he warns about the dangers of deepfakes, something that Obama never actually did. Some other deepfakes are celebrity porn videos, as well as misrepresentations of politicians, e.g., replacing the face of Angela Merkel with Donald Trump's, or Donald Trump's with that of Mr. Bean. Popular deepfakes at the time of writing include an alternative version of some parts of the 1976 iconic American psychological thriller film "Taxi Driver", where the main actor's face, i.e., Robert De Niro's, has been replaced with that of Al Pacino [6], and real-time creation of avatars for popular teleconferencing systems from a photo [7].

Deepfakes have the capability not only to impose photos in a video but also to generate new content. This, for instance, means realistic human face photos, as shown in [8], as a result of AI studying other human photos [9]. Available software even lets users create in real-time deepfake avatars for Skype and Zoom teleconference tools, simply from a photo [7]. Similar technology (deep learning) can be used to create fake CVs, as shown in [10]. While both of these may still seem like an "innocent" playground, consider the scenario where the human resources department of a major company is overwhelmed with realistically looking CVs and photos of potential hiring candidates, or with fake users impersonating others in real-time in business teleconferences (where the majority of the world conducts business during the COVID-19 pandemic). Today, there is simply no sophisticated defense against such kind of deepfake attacks.

While the technology is complex, its complexity is hidden behind common easy-to-use tools [11] and services that are available to the general public. The tools to create deepfakes have low technical requirements, which usually imply conventional home PCs that are equipped with gaming graphics cards (which have fueled the deep learning advances). For instance, the Taxi Driver deepfake was easily created with low-cost hardware and the publicly available software DeepFaceLab [12] that utilizes AI to replace faces in videos. Because of the low learning curve and public access to the technology, deepfakes can be created easily even by home users and without the need for deep technical expertise. This can lead to the creation of realistic fake content whose authenticity may be hard to verify, and which, in combination with its distribution to social media, can put existing fake news actions on steroids. Deepfakes are not a "one day, this might be possible" technology, but they have already been used in practice. Fabrication of photos and in some cases video is not something new, but the easiness by which this can be achieved, coupled with the realistic results, is something that is new and exciting but also worrying.
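To make the low technical barrier concrete, the core idea behind face-swap tools of this kind can be sketched in a few dozen lines of Python. The sketch below illustrates the commonly described shared-encoder / per-identity-decoder autoencoder approach; the network sizes, training loop, and function names are simplified assumptions for illustration and do not reproduce DeepFaceLab's actual implementation.

# Illustrative sketch (not DeepFaceLab's actual code): the shared-encoder /
# per-identity-decoder autoencoder idea commonly used for face swapping.
# Layer sizes and training details are simplified assumptions.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Maps a 64x64 RGB face crop to a compact latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(128, 256, 4, stride=2, padding=1), nn.ReLU(), # 16 -> 8
            nn.Flatten(),
            nn.Linear(256 * 8 * 8, latent_dim),
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a face image for ONE specific identity from the latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 256 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(256, 128, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )
    def forward(self, z):
        return self.net(self.fc(z).view(-1, 256, 8, 8))

# One shared encoder, one decoder per identity (e.g., identity A and identity B).
encoder, decoder_a, decoder_b = Encoder(), Decoder(), Decoder()
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
    lr=5e-5)
loss_fn = nn.L1Loss()

def train_step(faces_a, faces_b):
    """Each decoder learns to reconstruct its own identity from the shared code."""
    opt.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
           loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    opt.step()
    return loss.item()

def swap_a_to_b(face_a):
    """The 'swap': encode a frame of identity A, decode it with decoder B,
    producing B's face with A's pose and expression."""
    with torch.no_grad():
        return decoder_b(encoder(face_a))

After training on face crops of the two identities, the swap step is what produces the characteristic face replacement seen in examples such as the Taxi Driver clip; production tools additionally wrap this core with face detection, alignment, masking, and blending.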
As with any disruptive technology, it is accompanied by mixed feelings that focus on the strengths or the weaknesses of the technology and often attempt to make predictions about the future and how it will impact individuals and society. However, such views, especially in the media, often focus asymmetrically on some aspects of the technology and its impacts, and as such, may guide the public perception towards supporting or rejecting its applications. Deepfake is also such a technology, for which some of its results have been featured in mainstream media. Therefore, it is of interest to see how AI, its applications, and impacts are communicated via modern media and how these are perceived as a threat or an opportunity by the public.

Digital media have several key characteristics that influence their nature and the way they are utilized. Although they form a continuation of traditional media, their capabilities make information copying and transmission very easy, and as such, the target audiences can be huge masses without limitations on location, time, or content size. The utilization of digital media as a communication medium over the Internet has enabled instant communication, global audiences, and interactivity, which have already been capitalized on, e.g., in social revolutions such as the Arab Spring [13]. Deepfakes are strongly connected with digital media, and especially social media, via which they reach a wide audience. Since text, images, videos, and sound are the key elements of interaction and communication within the public sphere, the implications of deepfakes for affecting it are significant. While the power and capabilities of AI are still to be investigated, and far from being regulated, such efforts are reported in media with diverging views, and their portrayed implications range from the extinction of the human race to the cornerstone of its survivability.

The intersection of digital media and AI is of interest since it binds the cornerstone of media communication in the modern era with the latest technological developments in AI, which blurs the boundaries of reality and information dissemination. The result may be an amplification of digital media impacts on an unprecedented scale, which has the potential to bring new benefits to society but also to be misused to guide public opinion. With deepfakes, "the ability to distort reality has taken an exponential leap forward" [14]. Therefore it is of interest to investigate this intersection from different perspectives.

II. METHODOLOGY

The aim of this work is to better understand the intersection of digital media and AI and its implications in modern society. To that extent, a variety of methodologies [15] can be utilized. Document analysis is selected, and it was verified that sufficient material and access to it was available. This is challenging, considering that the phenomenon of deepfakes is very new (approx. 2-3 years old), and as such, peer-reviewed literature that deals with it is limited. The core literature explicitly dealing with deepfakes was considered, e.g., [1]–[4], [16]–[18], which was complemented with the analysis of several other documents including technical reports, press releases, legislation proposals, and audiovisual material available on freely accessible Internet platforms (e.g., YouTube). The aim is to synthesize findings, views, and examples from different perspectives, as discussed in section III – section IX, and in addition to critically reflect on the phenomenon overall, as discussed in section X. For each perspective, additional literature is considered that reflects key aspects relevant to that viewpoint's context.

The different angles selected to investigate this phenomenon are inspired by intersectionality, an approach that can be used to investigate how categories are inter-/intra-connected and how they interact at different levels. Via intersectional analysis, we can attempt to identify and understand the impact in different domains of digital media and implicated social phenomena such as injustice, inequality, political influence, etc. To investigate a new and complex phenomenon such as deepfakes, multiple perspectives are needed in order to be able to capture its essence. Therefore, it was decided to start from the perspectives that digital media can offer, i.e., media and society, media production, media representation, and media audiences. However, since the societal aspects are in focus, additional perspectives were deemed necessary, i.e., gender, politics, as well as law & regulation. While these may not be exhaustive, we consider that they do provide a sufficient basis for the discussions that shed some light on the phenomenon and enable us to discuss its implications.

It must be pointed out, though, that although inspired by intersectionality, it is followed more freely rather than strictly implemented, and this assists towards the aim of providing a multi-angled view and analysis of an everyday phenomenon from different perspectives. The undertaking of identifying implications in a methodological manner is not trivial, as it is methodologically difficult to isolate and identify the effects of media across the social space [19].

III. MEDIA AND SOCIETY PERSPECTIVE

Culture constantly changes [20], and its impact on society is intertwined with the way people interact and use modern technologies. Mediatization captures such aspects as it "tries to capture long-term interrelation processes between media change, on the one hand, and social and cultural change on the other" [21]. Overall, mediatization is a process but "can also be seen as a container in which observations can be collected" [22], and as "a dynamic process of increasing media influence, cannot be regarded as a deterministic and linear development" [23]. Mass media have evolved from newspapers and TV in the past, to digital networks and platforms that are perceived as a "shared space of increased visibility and connectivity" [24]. Overall, mass media have effects that lead to "change in an outcome within a person or social entity that is due to mass media influence following exposure to a mass media message or series of messages" [25]. As such, content communicated over any kind of mass media, for instance, deepfakes over social media, raises some concerns.
In line with mediatization, the attempts to envision and capture the effect of AI on the life of a community, especially from a social, political, or ethical point of view, have already been an issue in science fiction and novels. There, AI is often embodied in the existence of intelligent machines that autonomously manage trivial tasks and, therefore, free up time for people to pursue more spiritual and fun activities [26]. On the basis of this context, social processes and their regulation need to be seen in a new perspective. Today, regulation of deepfakes hardly exists (see also the discussion in section VIII), while in social processes, they fall under the larger category of Internet memes, fun, or simply fake news. Fiction can be seen as a form of interrogation that can deliver different aspects than traditional journalism or academic writing [27], and in the digital media era, this is well represented in literature, films, games, etc. However, we are now entering a new era, where this interactivity is becoming much more real, as AI systems play and win sophisticated games against humans (e.g., the board game Go, the strategy game StarCraft II), while they are able to achieve super-human performance on specific tasks (e.g., image classification). This implies that soon enough, what is up to now discussed only in fiction may step over to the real-world realm, where it will have an impact on existing societal processes.

The first impacts of AI are already visible, as several processes in economy and business are enhanced via AI, and more advanced products can be offered as companies can manage profitability and risks more efficiently [28]. In modern digital media, large amounts of digital information can be collected and analyzed, while processes can be automatized and highly customized, even towards individuals. For instance, many social media platforms and other service providers feature user-targeted marketing campaigns based on individual user history, actions, and preferences. Such customized messages selected by sophisticated AI algorithms also include political messages as well as views of other like-minded citizens in social media. The utilization of deepfakes in these processes significantly extends the effectiveness and outreach of such actions, as they may reinforce beliefs or provoke actions. While today social platforms engineer sociality by enabling and forging connections, and emerge as active mediators between users, technologies, and content [29], in the future, empowered with AI, they will be even more well-integrated in such processes, eventually creating "individual perception bubbles" that could potentially be misused to manipulate individuals and eventually public opinion. As such, politics and digital media are strongly coupled with AI overall, and deepfakes could be an additional enabler towards manipulating the real world.

Mediatization is "constituted in the mutually influencing and molding relationship between institutions and the actors that reproduce, maintain, and develop them through their agency" [30]. In this context, the impact of new technologies (such as AI) and society influence each other, as "technology also influences the way we think about the social and the political" [31]. In such assessments, the focus is often put on the individual, and as such, only a specific view of the social is evaluated, while for a more holistic viewpoint and understanding of the interplay, it needs to be extended to the community and society [31]. The latter larger context, where the interplay of technology and society evolves, still needs to be better addressed, and this holds true also for deepfakes in digital media. The direct effect of deepfakes is just the first layer that needs to be considered, while the real danger is the distrust of organizations, processes, and people that is induced. Looking at AI from the media and societal perspective is interesting, especially considering that today many situations involving deepfakes are insufficiently addressed due to lack of expertise and difficulty in managing complex social, conflict, and real-world aspects [32]. AI can enable us to understand people and group views better, combine their knowledge in a timely fashion, simulate reactions, and find negotiation win-wins for complex situations that are beneficial for all [32]. However, for that to be materialized, such actions need to be done over a healthy and truthful basis, something that deepfakes could potentially jeopardize.

Not everything is put in a positive light in this interplay between AI and society. Many times new technologies that are not well understood have been misused, as there is a lack of appropriate regulatory frameworks in place, e.g., for robotics [33], [34]. Especially in the press, as well as in social media, articles have often circulated that paint a dystopian future, where AI is controlling everything and enforcing or enabling questionable practices and ethics in humans. Deepfakes enable individuals and organizations to create content that could be utilized for nefarious purposes. The interplay of AI, digital media, and society is complex, and although some of the issues analyzed do provide some insights, it would be of interest to investigate how political-economic structures in industry forge or hinder individual agency around the utilization of AI applications.

IV. MEDIA PRODUCTION PERSPECTIVE

One of the key areas affected by AI in digital media is that of media production, where a bidirectional power relationship between production and consumption exists. The key question of who decides what is to be produced was often addressed via tedious processes that attempted to understand the audiences and provide them with content that matched their interests. In the digital media era, however, such processes are automatized and provide instant insights at unparalleled detail. AI algorithms analyze in real-time massive amounts of data (big data) and derive detailed profiles of users, their interests, their needs, and their satisfaction [35]. This, coupled with recommendation engines, can automatize the profiling efforts [36] in media organizations and enable the generation of content specifically for selected target groups based on very fine and sometimes very personal details. In this manner, deepfakes further amplify this targeting, as the content can be shaped in a way that is more appealing to the specific characteristics of an individual, rather than that of a mass. For instance, a message could be produced in support of a political candidate, featuring a specific actor or leader that the user finds trustworthy and in a language that s/he is susceptible to believe. Nowadays, business organizations rely on social media for their decision-making processes, and when this is coupled with the learning abilities of AI, new media intelligence approaches can emerge that are social and multi-modal [37].
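To illustrate the targeting step described above in its simplest possible form, the sketch below ranks candidate content items by cosine similarity between a user's interest profile and each item's topic vector. The topic names, the toy data, and the scoring rule are invented for illustration only; real platforms rely on far richer behavioral features and models, but the principle of matching content to an individual profile is the same one that deepfake-generated messages can exploit.

# Minimal illustration of profile-based targeting: rank candidate messages by
# cosine similarity to an individual's interest vector. Topic names, weights,
# and data are invented for illustration; real platforms use far richer models.
import numpy as np

TOPICS = ["economy", "security", "environment", "sports", "celebrity"]

def profile_from_history(interactions):
    """Build a normalized interest vector from counts of past interactions per topic."""
    counts = np.array([interactions.get(t, 0) for t in TOPICS], dtype=float)
    norm = np.linalg.norm(counts)
    return counts / norm if norm > 0 else counts

def rank_content(user_vector, candidates):
    """Return candidate items sorted by cosine similarity to the user profile."""
    scored = []
    for item in candidates:
        item_vector = profile_from_history(item["topics"])
        scored.append((float(np.dot(user_vector, item_vector)), item["id"]))
    return sorted(scored, reverse=True)

# Hypothetical user and candidate items for demonstration purposes only.
user = profile_from_history({"economy": 12, "security": 7, "celebrity": 1})
candidates = [
    {"id": "clip-tax-reform", "topics": {"economy": 5, "security": 1}},
    {"id": "clip-match-highlights", "topics": {"sports": 9}},
    {"id": "clip-border-debate", "topics": {"security": 6, "economy": 2}},
]
print(rank_content(user, candidates))  # economy/security clips rank first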
News is a product, and how it is produced can be understood by looking at cultural, economic, political-economic, or other power relationships [38]. The interplay of media and journalism is very complex, with multiple angles, and as such, it may not be easy to understand its full extent. For this to happen, the full nature of journalism needs to be understood, i.e., its focus on making news in the digital era, while also considering the observed trends, e.g., (i) the use of the affordances of news websites, (ii) radical commercialization, (iii) the participating audience, and (iv) the multi-skilling and de-skilling of journalists [30]. To this end, the participation of the audience, as well as the utilization of such journalism over social media platforms, its impacts, and implications are not well understood, as evidenced by the contemporary discussions on fake news and Facebook pertaining to the 2016 US election [39]. Hence, there is a need to approach it at the system level, as "systems theory is analytically powerful in describing the changing social power of journalism and mass media" [40]. Especially since journalism is now highly dependent on public attention [40], and in the era of social media this is done in real-time, the power relationships between media platforms, social platforms, and other stakeholders need to be investigated. This also implies that journalists need to be able to verify the authenticity of claims, and as such, they need to have the necessary tools and potentially the expertise to spot deepfakes.

Another key aspect is the automated production of media that is fueled by AI, which has demonstrated its capability to create text, pictures, and videos, based on what it learns from available sources in digital platforms, e.g., social media such as Facebook, Twitter, Instagram, etc. Deep Learning has enabled significant progress in Natural Language Processing (NLP), including the capability of summarizing texts [41], as well as the reproduction of voices (e.g., narrating) based on limited available samples (e.g., a 5-second speech sample) [42], via which deepfakes can be created at a level that even humans find convincing [43]. As such, in an era where digital information is easily accessible and can be copied, the creation of automated news (including spoken language) is possible. This has significant effects on media production, as complex content can now be created by AI, given adequate sources; e.g., a publicly available voice sample from a speech and a photo from social media can lead to the creation of a realistic deepfake video [44].

With global audiences, the production of news needs to increase. Automatization in production is a step towards this direction, and is also associated with less human involvement. In a recent survey [45], some have a negative feeling that robots are likely to damage journalism's value, while others also see the positive sides. Such robot journalism raises several challenges, including questions of bias and ethical considerations, as there is still no clear understanding of the inner workings of many of the AI algorithms and the decisions taken by them [46]. Deepfake-driven content may easily find its way through an automatized process that has too few "controls" in place to check for veracity and ethical issues that might (still) be detectable by humans. In addition, robot journalism has "significant practical, sociopolitical, psychological, legal and occupational implications for news organizations, journalists and their audiences" [47] as, for instance, there are discrepancies between authorship and credits for the produced material. Although today there is a need for employees that have multi-modal media production skills [48], if such tasks can be automatized and (even partially) delegated to AI-empowered processes, such skills might no longer be necessary or will have to shift focus, e.g., towards collaboration with AI robots on content creation.

Journalism is nowadays produced not only by journalists but also by individual citizens. Citizen-driven journalism is evident in digital media, and even in the mainstream media, where we increasingly see user-generated content (which may include deepfakes). This is often also presented to wider audiences, e.g., via TV or social channels of large organizations, which, however, relaxes the veracity checks that are applied, and the responsibility is shifted to a large extent to those individuals, e.g., with a simple remark such as "citizen captured video or photo". However, democracy needs high-quality investigative journalism, and not only creative variations of text, or even worse, creative untruthful content, both of which can be easily created with deepfake technology. In addition, in the age of market-driven operations, the production of news supported by deepfakes may incentivize unethical and profit-maximization actions that contradict institutional ethics of professionalism [49]. As such, the emergence of deepfakes has a significant impact on media, as its production will certainly be misused according to the agenda of its creators.

V. MEDIA REPRESENTATIONS PERSPECTIVE

The representation of AI overall and its capabilities in digital media is of interest as it shows how such technological advances and their impacts are communicated to the public. It is pointed out that "culture is central to shaping collective perceptions and the dynamics of media representation reproduces forms of symbolic power" [50]. AI has been in most cultures linked to tangible physical objects, mostly as robots or highly advanced computer systems that have human-like logic and conversational capabilities. This is evidenced in media, including fiction, images, and films, eventually creating a robot culture over the last decades. However, with modern AI systems, a physical embodiment is optional, while AI overall is evidenced in everyday life artifacts and processes, even when not directly recognizable, e.g., interaction with voice assistants in smartphones.

The perception of AI in the general public is that of (sometimes humanoid) robots that follow stereotypes, e.g., are either villains or saviors [51]. Such stereotypes provide a way of structuring our understanding of the world, our values, and experiences, and position the robots in a specific context, thereby also shaping our expectations and interactions with them.
How media nowadays represent AI is often biased and unbalanced, and the same holds for deepfakes. General AI is often portrayed in the media as "a mixture of flawed entertainment and fear" [52]. For instance, deepfakes are shown as a result that a machine created, which may raise a wow effect, but may also be coupled with some often unsubstantiated implications, e.g., what would happen when machines could imitate everyone, or how such actions would mean the financial catastrophe of the film and media industry. Such representations, e.g., in popular culture, may have an effect on public attitude and lead towards a specific view on deepfakes and the associated technologies. Such effects have been demonstrated in the past; for instance, while science fiction may "harden anti–killer robot attitudes among that portion of the population who consume a lot of science fiction", the same cannot be claimed for the general public and its opposition to autonomous weapons [53]. The "uncanny valley" hypothesis [54] suggests that end-products that closely resemble human behavior (but not exactly) can evoke uncanny or strangely familiar feelings of eeriness. However, there is an indication that science fiction can reduce the eeriness of robots [55], something that, in the future, we might also witness for deepfake products.

Deepfakes can be generated with the use of AI, and while "robot" or "robotic processes" are sometimes used in popular media, these terms simply piggy-back on the general notion of AI and robotics, denoting non-human intelligence hosted somewhere. Due to the robotic culture developed over the decades, it is easy to associate AI overall and deepfakes with robots or machines that are intelligent enough to produce new content and fool humans. However, deepfakes merely rely on the application of sophisticated algorithms to content, in order to create high-quality video, sound, and images, but do not act autonomously or intelligently, even when they outperform humans in specific tasks, e.g., object detection. Nevertheless, their representation as such in media is done to ease their introduction to the public and due to lack of real expertise in the area.

Deepfakes, similar to fake news, have also been portrayed in digital media, with social media being their prime channel of distribution, but also in traditional news media. How they are represented there, however, differs. In social media, much of the content so far had an entertainment character (e.g., replacing faces of celebrities), and it was easily identified as fake. In traditional media, it has been portrayed as an example of what technology can do, and often put in the context of larger social questions and dilemmas. Media representations play a role in how the phenomenon of deepfakes is perceived, but up to now, this has mostly relied on the fiction and public perception of fake news. However, with the rapid advances in the technology behind deepfakes, such issues will need to be revisited and better understood.

VI. MEDIA AUDIENCES PERSPECTIVE

Digital media have certainly changed the way people interact, and now, with AI on the rise, it seems that such relationships are again going to be significantly affected. A mass media effect is a change in people or overall in society as a result of the exposure to mass media influence [25]. When considering the capabilities of deepfakes, which combine (i) the creation of new content and (ii) its easy dissemination via digital media, it is evident that this will be a game-changer and have an interdisciplinary impact. For instance, people have different behaviors when interacting with AI, e.g., robots [56], and it has also been observed that "social surrogates have the potential to cause psychological harm" [57]. While similar studies for deepfakes are not available, existing ones on the larger domain of fake news [58] also exemplify the potential of deepfakes to cause harm and empower existing fake news techniques.

Social media in the modern context are mass media that have the potential to interfere with societal actions, e.g., social movements [59]. Nowadays, more than ever, a single individual can have a significant impact, since if his/her message goes viral, it gets to be seen by millions of people, which affects them. Deepfakes, especially those portraying "new facts" or controversial issues, have such potential to become viral, as people might be less inclined to check their veracity, in light of the time or subject sensitivity of the issue. As an example, political messages in times of conflict among nations can spread uncontrollably and lead to irrational and emotional reactions.

Fandom may be another aspect that is affected. Today "media convergence, new technologies, and transmedia marketing have all created new types of fans" [60], and deepfakes can potentially be another enabler. By lowering the barrier to creating media products, deepfakes have the potential to enable communities to emerge more easily, and the generated deepfake content could more easily attract fans, something that implies that industry may no longer solely control such spaces. This is an angle that can further act as an enabler in the scope of the "convergence culture", where mass media are seen as a form of it [61].

Since "uses and gratifications is a media-effects perspective", the exposure to a medium is typically captured via "measures of one or more types of audience activity such as selectivity, media and content preferences, level of attention, and involvement with content" [62]. Due to the AI technology advances, two things can be realized which are game-changing: (i) mass access to people and their individual perceptions/habits/views, etc., as these are captured by digital media, and (ii) instant and continuous evaluation of the available data, as well as correlation at global levels. As such, studies may be easier to carry out and may reveal new insights based on the more detailed data as well as their potential for longitudinal realization. The results can then flow from the audiences to the systems and approaches that create, manage, and operate the digital media services and content, therefore providing a better match between user needs and end-products. To this end, deepfakes may be utilized as enablers and multipliers of the efforts done to increase the gratification aspects and engage more with the audiences.
VII. GENDER PERSPECTIVE

Deepfakes have a gendered angle. Feminism can be seen as "an emancipatory, transformational movement aimed at undoing domination and oppression" [63]. In the modern era, different forms of feminism have emerged that utilize modern media [24]. When looking collectively towards media effects and feminism, the existence of "feminist philosophies, concepts, and logics articulating feminist principles and concepts to media processes such as hiring, production, and distribution; to patterns of representation in news and entertainment across platforms; and to reception" can be evidenced [63].

The question that is raised is what may be the interplay of deepfakes and feminism. Since contemporary feminism, also known as "hashtag feminism", "takes place online and, at times, exclusively through social media platforms" [59], it is heavily susceptible to deepfakes. It should not be forgotten that the initial deepfake material that appeared portrayed predominantly fake female celebrity pornographic videos and revenge porn on females [2], [4]. Although it was often argued in media that this was created for "entertainment" and was addressing mainly the male audience, such videos were also made available to the wider public on well-known pornographic sites, effectively attacking the identity and moral standing of those targeted. While such content is illegal, and various websites make active efforts to remove it once detected, often such actions come too late or are not efficient.

Of particular importance to the gendered angle of deepfakes is revenge porn, which is an evolution of existing non-consensual image sharing (e.g., nude photos and videos). Now realistic videos with matching voices can be created and distributed to online audiences easily. To exemplify the issue, already one in twenty-five Americans has been a victim of "revenge porn" [58], something that is expected to increase considering the high quality as well as the easiness that deepfakes bring to the table.

In the literature, factors have been observed that limit the benefits for feminists; more specifically: "feminists experience new forms of exclusion of access to publicity and recognition, as digital networks can be, at the same time, spaces of uncertainty and empowerment, depending on skills, resources, and age" [24]. Deepfakes have the potential to increase such uncertainty and limit empowerment, as they can easily create and propagate discriminatory content.

It should be pointed out that the gendered aspects do not mean that women are targeted and men are not; similar actions can, of course, be realized against men; however, the majority of cases so far have been against women and in specific roles. Overall, gender should be approached as a social construct, and gendered aspects, in conjunction with deepfakes, should be looked upon in the wider area of feminist theory and gender studies.

VIII. LAW AND REGULATION PERSPECTIVE

Several countries have laws and a regulatory framework dealing with digital media and their processes. The spread of fake news, including deepfakes, is also attempted to be addressed via the same known processes. For instance, in the US, the proposed Malicious Deep Fake Prohibition Act of 2018 "establishes a new criminal offense related to the creation or distribution of fake electronic media records that appear realistic" [64]. In addition, an accountability act followed in 2019, in order to "combat the spread of disinformation through restrictions on deep-fake video alteration technology" [65].

However, even with such legislative actions in place, while the problem is recognized, addressing it effectively is challenging. AI and its implications are not well understood, and therefore assumptions are made and decisions are taken whose applicability to AI is questionable. For instance, current laws cannot handle the complexity of AI.

In the case of the exemplified prohibition act [64], what is proposed is to toughen the consequences, at the federal level, for a practice that is already unlawful. As such, a traditional approach is taken, which, however, does not lower the risks associated with deepfakes. The accountability act [65] goes a step further, provides a better understanding of the area, and lays out potential actions that need to be undertaken. However, some of these actions are unrealistic. For instance, it would require watermarks and clear labeling on deepfake content, something that the creators of deepfakes, especially those with nefarious intents, will surely not abide by. As such, its effectiveness is seen as limited. In addition, there are also some concerns raised for some of its exclusion aspects, as, for instance, these conditions would not apply in specific cases of public safety or national security if the content is government-generated, i.e., "produced by an officer or employee of the United States, or under the authority thereof, in furtherance of public safety or national security" [65].

The discrepancy between deepfakes and its specific audience of regulators has tangible impacts on society, as specific actions may be very difficult to enforce. Without proper legislative capturing of deepfakes, both the executive branch that carries out the law and the judicial branch that interprets the laws will face challenges. As can be seen, the full context of deepfakes is not well understood, and therein lies the danger of (i) not addressing it in an effective manner but only superficially, and (ii) introducing actions whose implications are not well understood and weaken civil rights, which may lead to long-term societal impacts.

IX. POLITICAL PERSPECTIVE

Technology influences the way we think about the social and the political [31]. Especially with the latest advances in AI, not only fake news, photos, and videos (deepfakes) can be created, but also fake reviews, convincingly realistic text, and even conversations in real-time [41]. The political mobilization of the masses is now possible via social media such as Facebook [66], and instant messaging applications such as WhatsApp make global audiences reachable around the clock and in a personalized manner. The power of these media has already been shown in their role in recent social movements such as the Arab Spring in some developing countries [67]. Disinformation in such media, e.g., via deepfake-generated political videos [68], has the potential to raise uncertainty and reduce the trust placed in the news on social media.
Both social media and instant messaging applications are seen as "fertile ground for circulating deepfakes with explosive implication for politics" [1]. The fake news issue becomes imminent as, in some developing countries, social media penetration is so high that it is often considered the main source of information as well as of fact-checking. The problem is that fake news reduces the trust in even legitimate sources and enables misinformation or disinformation tactics, which can have detrimental effects on societal operations [68], [69]. People may react emotionally or be guided to actions that are entirely the creation of a computer program and may pose a distorted view of reality.

Actions that lead to the weakening of critical functions are opposed to the fundamental rights in the social welfare state [70], and deepfakes do pose such a threat, as they have the potential to contribute to this weakening by lowering the trust in public bodies and entities as well as the associated processes.

X. DISCUSSION

The era of deepfakes, where sensational, dishonest, or even fabricated content propagates mostly through social media, is already here. With some healthy portion of skepticism and cross-referencing, one might still be able to navigate through it. Traditional ways of thinking, captured via popular sayings such as "seeing is believing", "I trust what I see", and "a picture is worth a thousand words", are going to be increasingly challenged. Fabrication of photos and videos has always been challenging and could be realized only with significant efforts and expertise, but this is not the state of the art anymore. Deepfakes have demonstrated convincingly the easy access to the capability of creating realistic fake videos [11], and as such, media production aspects are significantly affected both in "how" as well as in "what" is produced.

To deal with the deepfakes phenomenon, we need to position it in the public sphere, where it mostly takes place and also where its effects can be observed. As the public sphere, we consider the realm of social life where public opinion can be formed [70], which is the normative basis for a deliberative democracy. In this context, media is seen as a platform for inclusive discussions and is linked to democracy and society, since individuals and groups mobilize via it their support for their perspectives. The role of media is crucial to governance and democratization, since aspects such as community and social media (where deepfakes might be utilized) affect key areas such as poverty, inequality, and society overall [71]. Social media platforms are where deepfakes are predominantly distributed, and as such, they become an integral part of their complex dynamics. As such, deepfakes have large implications, since social platforms engineer sociality [29]. One can approach such dynamics via "connecting ANT's [Actor Network Theory] recognition of the interdependence of technical, social and cultural aspects, and Castells' political analysis of the economic-legal-political stratum" [29].

There have been different media-technological innovations, of which social media and smartphones are key in the digital era [72]. In this era, media falsification is not new [3], but the fact that anyone with low technical skills can do it easily poses new challenges and affects the media and societal interplay. It directly impacts the classical sociological dichotomy [73] between individual agency (freedom and creativity) and structure (technological interface and peer group/community norms and expectations) for online content creation.

Deepfakes constitute a new technology-empowered manifestation of the well-known phenomenon of fake content creation. Technology-savvy citizens and experts in specific domains can be used to evaluate the realness of photographs [74]. Forensic technology tools may be developed that help the effort of identification of deepfakes [16], [75]. While the threat can potentially be mitigated with current legal and technological approaches or new ones that may be developed, none of them will solve it [1], [18]. Furthermore, while the Global North may be more familiar with cutting-edge technologies and have more independent media, many developing countries in the Global South, due to the digital divide or the lack of independent media plurality, have citizens that often fall prey to fake news.

The inquisitive nature of the user who is skeptical of the communication s/he receives is seen as potentially beneficial in the effort of fighting deepfakes. A survey [76] identified that a big part of falling for fake news is due to the "general tendency to be overly accepting of weak claims", as public susceptibility and lack of awareness are seen as a problem in the identification of fake news overall [77]. Fact-checking is considered critical in the identification of fake news [78], and people who cross-check their sources have fewer chances of falling for fake material and further propagating it. There are several websites that do fact-checking [79], and people with sufficient skepticism could verify the information received prior to trusting it or propagating it in social media. However, with deepfakes, things are more challenging and complicated, especially due to their realistic nature.

Social media literacy so far, with fake news overall, may enable users to not fall for deepfakes. If they are critical of texts, photographs, and other material already, the presence of deepfakes, although more challenging, could still be addressed. News literacy [80], [81] and overall technology literacy of the citizens are seen as fundamental, and reasoning in social media environments is seen as a critical skill that, e.g., students should be taught [82]. Digital literacy [83] can help to adopt a healthier approach in social media and act as an enabler for combating the propagation of fake information, including deepfakes. Therefore, there is an imminent need for education and training so that digital media literacy increases. However, even technology-savvy and social media literate people lose confidence when they are confronted with the results of deepfakes.

Education and upskilling of the citizens is not the only potential line of action. In addition, news agencies must adhere to high-quality standards, and this may also be enforced via legislative actions to penalize media misinformation distribution. Transparency of online news media and source checking should be the norm [80]. In addition, new tools are needed that can enable both citizens and journalists to identify and check the authenticity of information and pinpoint its source. There needs to be a combination of human-driven analysis [79] as well as automation of it, e.g., via AI approaches [16] that can act in real-time.
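As an indication of what such automated, near real-time screening could look like, the sketch below samples frames from a clip and averages the output of a binary real/fake image classifier. The model file, threshold, and sampling strategy are placeholders and assumptions; operational detectors, such as those discussed in [16], [75], combine many more signals (temporal consistency, audio, physiological cues, metadata) and require prior training on labeled real and fake footage.

# Illustrative frame-level screening for deepfake video (not a production detector):
# sample frames, score each with a binary real/fake CNN, and average the scores.
# "detector.pt" and the 0.5 threshold are placeholders.
import cv2
import torch
import torch.nn.functional as F

def sample_frames(video_path, num_frames=16, size=224):
    """Grab evenly spaced frames and convert them to normalized CHW tensors."""
    cap = cv2.VideoCapture(video_path)
    total = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
    step = max(total // num_frames, 1)
    frames = []
    for idx in range(0, max(total, 1), step):
        cap.set(cv2.CAP_PROP_POS_FRAMES, idx)
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        frame = cv2.resize(frame, (size, size))
        frames.append(torch.from_numpy(frame).permute(2, 0, 1).float() / 255.0)
    cap.release()
    return torch.stack(frames) if frames else torch.empty(0, 3, size, size)

def screen_video(video_path, model, threshold=0.5):
    """Return (is_suspect, mean_fake_probability) for a clip."""
    frames = sample_frames(video_path)
    if len(frames) == 0:
        return False, 0.0
    with torch.no_grad():
        logits = model(frames)                 # shape: [num_frames, 2] real/fake logits
        fake_prob = F.softmax(logits, dim=1)[:, 1].mean().item()
    return fake_prob > threshold, fake_prob

# Usage sketch (hypothetical model and file names): the classifier would have to
# be trained on labeled real/deepfake footage beforehand.
# model = torch.load("detector.pt"); model.eval()
# suspect, score = screen_video("incoming_clip.mp4", model)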
Deepfakes have far-reaching implications, especially in an era where information propagates with great easiness through social networks, and gets communicated, read, and acted upon all over the world. While social media are fertile ground for circulating deepfakes [1], utilizing them in a specific context, e.g., politics, has the potential to disrupt societal processes. The misuse potential becomes evident if one couples a deepfake photo with an appropriately crafted message, in order to create ripples in society. Deepfakes could pose as a new form of contemporary psychological warfare [84] and a tool for individual or group manipulation.

In political news journalism, i.e., news media coverage of politics, journalists try to be in control of political stories rather than passively report on what is promoted by political actors [23]. To do so, however, they must be able not to fall for deepfake content that is produced by stakeholders that favor such political actors directly or indirectly. For instance, in the context of social movements and activism [67], such new media technology could be used maliciously [84]. Combining a deepfake video with a serious message that fits a specific political agenda will have an impact on people. In the heat of a movement's actions, such videos can be used to discredit the opposition, create reality-near photos that would outrage citizens, and even create a fake temporal reality with events and images that support it, and by the time it is revealed to be fake, it would have served its purpose.

Trust in media, processes, and people is a major challenge, as removing trust from the news, images, and videos basically removes trust in social interactions and structures, and effectively leaves the door open for doubt everywhere, even in legitimate cases. Therefore, misinformation or disinformation tactics can impact society and its processes [68], [69]. Truth decay coincides with trust decay [14], which raises new concerns, since society can no longer share and act on accurate perceptions of reality.

Capturing the long-term implications of deepfakes, and comparing them to those of fake news, may also need to be addressed. Also, since not only people have agency, but objects (e.g., algorithms) do also [29], it is relevant to investigate how technology is part of the process and how it influences and gets influenced by the ongoing processes in the area of digital media and deepfakes. While deepfakes at this early stage are targeted directly towards humans, in the future, this might not be the case. The increasing reliance on AI-empowered cyber-physical systems, e.g., self-driving cars, may prompt the creation of deepfakes targeting the machines and indirectly the humans. Unlocking a self-driving car with a deepfake voice, altering its behavior by projecting deepfake images to its sensors, etc., could affect the designed behavior of the car and lead it towards unpredictable decisions (e.g., sudden braking) which could harm humans. More empirical research is needed that is also bound to the appropriate theoretical frameworks and provides support for them or not.

Deepfakes are a demonstration of what is possible even by home users with moderate means, and one can only imagine what can be done at a larger scale, where the resources of large enterprises are available (not to mention nation-wide resources). Oppressive or manipulative governments and organizations can utilize practices such as astroturfing [85] to tighten their grip and shape public opinion. Fighting misinformation may also be used as a Trojan horse to bypass user privacy and undermine freedom of speech [18].

The threats posed by deepfakes need to be addressed via a combination of technology, regulation, and education. An informed and intellectual citizenry is required [86] in order to push for reform-based political and social changes. Recent research [75] portrays AI-based solutions that can detect deepfakes, including in many cases the software that was used to create them. Others have proposed complementary technologies such as blockchain [17] in order to link videos to trusted/reputable entities. While technical solutions for identification, verification, and removal of such fakes are underway [16], [17], [75], the problem is not strictly a technological one, but one of trust in processes and stakeholders, e.g., in journalism that operates responsibly and provably.

Deepfakes and their underlying technology pose not only threats but also opportunities. AI algorithms utilized in deepfakes have a wide range of capabilities and can create new text, voice, video, works of art, etc. In addition, even the core deepfake technology can be seen as having beneficial effects in simulation and training of personnel in customized/personalized realistic scenarios that would have been otherwise too costly or impossible to realize. There are also potential uses, e.g., in the lawful provision of deception content and tactics against criminals, terrorists, and other adversaries acting against the public good. The latter was attempted to be captured in the US accountability act [65], as discussed in section VIII.

Deepfakes demonstrate a powerful technology in an emerging AI era, and as is the case with all paradigm-shifting technologies, its use is not determined only by its capabilities, but also by the regulatory framework, ethics, culture, and other societal norms [46]. Therefore, as avenues of future research, one can consider several aspects that have been indicated in this work, which, however, need to be addressed more in-depth. Such aspects include a diligent approach to the relationship between deepfakes and society, as well as its impacts. This should include how they manifest as well as their behavior over time. In addition, a detailed intersectional approach would be beneficial that covers in detail the identities, e.g., gender, race, class, sexuality, disability, and their role and impact for discrimination and social injustice. The interplay with media, culture, and society is challenging, and at this stage, empirical research, in combination with good positioning in theoretical frameworks, is lacking. Empirical research is necessary that links concrete theoretical frameworks with the utilization of deepfakes and its impact in representative use cases, and verifies or disproves proposed theoretical contexts. Efforts should also be directed to technology-driven identification of deepfake materials, e.g., video, voice, text, and how these efforts may result in tools that can be utilized by the stakeholders, e.g., journalists, citizens, etc. Approaches that enhance trust in digital media sources and dependent processes are also needed, and research could be devoted to constructing real-world platforms, services, and tools that enable it.
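One elementary building block for such trust-enhancing services, in the spirit of the provenance ideas mentioned above [17], is registering a cryptographic fingerprint of each published video together with its publisher, so that any later copy can be checked against that record. The sketch below is illustrative only: the in-memory dictionary stands in for a tamper-resistant store (e.g., a blockchain ledger or a trusted database), the field names are assumptions, and exact hashing only verifies bit-identical copies, so practical systems would add digital signatures or perceptual hashing to survive re-encoding.

# Illustrative content-provenance check: register a SHA-256 fingerprint of a
# published video and later verify whether a circulating copy matches it.
# The dict-based registry is a stand-in for a tamper-resistant store.
import hashlib
import time

REGISTRY = {}  # fingerprint -> provenance record

def fingerprint(path, chunk_size=1 << 20):
    """Compute the SHA-256 hash of a file without loading it all into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def register(path, publisher):
    """Called by the original publisher (e.g., a newsroom) at publication time."""
    record = {
        "fingerprint": fingerprint(path),
        "publisher": publisher,
        "registered_at": time.time(),
    }
    REGISTRY[record["fingerprint"]] = record
    return record

def verify(path):
    """Called by anyone who receives a copy: does it match a registered original?"""
    record = REGISTRY.get(fingerprint(path))
    if record is None:
        return {"status": "unknown", "note": "no provenance record; treat with caution"}
    return {"status": "verified", "record": record}

# Usage sketch (file and publisher names are hypothetical):
# register("interview_master.mp4", publisher="Example News")
# print(verify("copy_received_via_social_media.mp4"))

Such a verification step is only one piece of the larger platforms envisioned in the text; it addresses provenance, not whether the original content itself was truthful.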
devoted to constructing real-world platforms, services, and [12] “DeepFaceLab: the leading software for creating deepfakes,” GitHub,
tools that enable it. Since deepfakes intersect with several areas 2019. [Online]. Available: https://github.com/iperov/DeepFaceLab
[13] A. Smidi and S. Shahin, “Social Media and Social Mobilisation in
of modern life, it is also important to investigate the ethical the Middle East: A Survey of Research on the Arab Spring,” India
side of it, as well as the areas related to safety and security in the societal context. Finally, research needs to be devoted to the training and education of affected stakeholders, from citizens to those involved in governance across the legislative, executive, and judicial branches.

XI. CONCLUSIONS

Understanding the intersection of digital media and AI in the case of deepfakes, and its effect on modern society, is imperative if this technology is to be properly put in context and the challenges it raises are to be effectively addressed. From the discussions, it is evident that the intersection of digital media and deepfakes has several impacts on individuals as well as on society overall. Understanding deepfakes in modern digital media, the processes they affect, and their overall implications is challenging, and they should therefore be investigated from multiple angles, including a temporal one. To do so, however, appropriate dimensions need to be defined (which is largely not the case today), and these should be sufficient to capture all the factors involved in the interplay. This work has revealed only some high-level aspects, and a much deeper investigation is needed. There is an inherent danger that society will no longer be able to credibly recognize, in a timely fashion, what is true and what is fake, which might lower trust in stakeholders, processes, and journalism, and an "everything is fake" attitude may prevail. While technical solutions for the identification, verification, and removal of such fakes are needed, the problem is not strictly technological; it also calls for regulatory measures as well as the education of users.
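To make the notion of technical identification more concrete, the following minimal sketch (not part of this work) illustrates one common strategy in the detection literature, e.g., FaceForensics++ [75]: treating deepfake identification as binary image classification over face crops extracted from videos. The backbone choice (ResNet-18), the hypothetical real_vs_fake/ dataset layout, and all hyperparameters are illustrative assumptions only.

# Illustrative sketch only: frame-level deepfake detection as binary
# image classification. The "real_vs_fake/train" folder (containing
# "real/" and "fake/" subfolders of face crops) and all settings are
# hypothetical placeholders, not part of the paper.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Face crops extracted from genuine and manipulated videos.
train_set = datasets.ImageFolder("real_vs_fake/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Reuse a generic image backbone and replace its head with a 2-class output.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # two classes: fake vs. real

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):  # a few epochs, for illustration only
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

In practice, such frame-level scores would be aggregated per video, and published detectors rely on dedicated forensic datasets and face-extraction pipelines far beyond this sketch; the point here is only that identification tooling of this kind is technically feasible, while the regulatory and educational dimensions discussed above remain open.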
REFERENCES

[1] R. Chesney and D. Citron, "Deepfakes and the new disinformation war: The coming age of post-truth geopolitics," Foreign Affairs, vol. 98, no. 1, pp. 147–153, 2019.
[2] M. Westerlund, "The emergence of deepfake technology: A review," Technology Innovation Management Review, vol. 9, no. 11, pp. 39–52, Jan. 2019.
[3] J. Fletcher, "Deepfakes, artificial intelligence, and some kind of dystopia: The new faces of online post-fact performance," Theatre Journal, vol. 70, no. 4, pp. 455–471, 2018.
[4] J. Kietzmann, L. W. Lee, I. P. McCarthy, and T. C. Kietzmann, "Deepfakes: Trick or treat?" Business Horizons, vol. 63, no. 2, pp. 135–146, Mar. 2020.
[5] "You Won't Believe What Obama Says In This Video," YouTube, 2018. [Online]. Available: https://www.youtube.com/watch?v=cQ54GDm1eL0
[6] "Taxi Driver starring Al Pacino [DeepFake]," YouTube, 2019. [Online]. Available: https://www.youtube.com/watch?v=9NkKj0aNB0s
[7] "Avatarify: Avatars for Zoom, Skype and other video-conferencing apps," GitHub, 2020. [Online]. Available: https://github.com/alievk/avatarify
[8] "This person does not exist," website, 2019. [Online]. Available: https://thispersondoesnotexist.com
[9] T. Karras, S. Laine, and T. Aila, "A style-based generator architecture for generative adversarial networks," IEEE Transactions on Pattern Analysis and Machine Intelligence, 2020.
[10] "This resume does not exist," website, 2019. [Online]. Available: https://thisresumedoesnotexist.com
[11] A. Siarohin, S. Lathuilière, S. Tulyakov, E. Ricci, and N. Sebe, "First order motion model for image animation," in Advances in Neural Information Processing Systems 32. Curran Associates, Inc., 2019, pp. 7137–7147.
Quarterly: A Journal of International Affairs, vol. 73, no. 2, pp. 196–209, Jun. 2017.
[14] R. Chesney and D. K. Citron, "Deep fakes: A looming challenge for privacy, democracy, and national security," SSRN Electronic Journal, 2018.
[15] L. Given, The SAGE Encyclopedia of Qualitative Research Methods. SAGE Publications, Inc., 2008.
[16] M.-H. Maras and A. Alexandrou, "Determining authenticity of video evidence in the age of artificial intelligence and in the wake of deepfake videos," The International Journal of Evidence & Proof, vol. 23, no. 3, pp. 255–262, Oct. 2018.
[17] H. R. Hasan and K. Salah, "Combating deepfake videos using blockchain and smart contracts," IEEE Access, vol. 7, pp. 41596–41606, 2019.
[18] J. Pitt, "Deepfake videos and DDoS attacks (deliberate denial of satire) [editorial]," IEEE Technology and Society Magazine, vol. 38, no. 4, pp. 5–8, Dec. 2019.
[19] N. Couldry and A. Hepp, "Conceptualizing mediatization: Contexts, traditions, arguments," Communication Theory, vol. 23, no. 3, pp. 191–202, Jul. 2013.
[20] S. Schech, Companion to Development Studies. Hoboken: Taylor and Francis, 2014, ch. Culture and development, pp. 42–46.
[21] A. Hepp, S. Hjarvard, and K. Lundby, "Mediatization: Theorizing the interplay between media, culture and society," Media, Culture & Society, vol. 37, no. 2, pp. 314–324, Feb. 2015.
[22] D. Deacon and J. Stanyer, "Mediatization: Key concept or conceptual bandwagon?" Media, Culture & Society, vol. 36, no. 7, pp. 1032–1044, Aug. 2014.
[23] K. Falasca, "Political news journalism: Mediatization across three news reporting contexts," European Journal of Communication, vol. 29, no. 5, pp. 583–597, Jul. 2014.
[24] A. Fotopoulou, "Digital and networked by default? Women's organisations and the social imaginary of networked feminism," New Media & Society, vol. 18, no. 6, pp. 989–1005, Sep. 2014.
[25] W. J. Potter, "Conceptualizing mass media effect," Journal of Communication, vol. 61, no. 5, pp. 896–915, Oct. 2011.
[26] Y. Rumpala, "Artificial intelligences and political organization: An exploration based on the science fiction work of Iain M. Banks," Technology in Society, vol. 34, no. 1, pp. 23–32, Feb. 2012.
[27] O. Hemer, Fiction and truth in transition: Writing the present past in South Africa and Argentina, ser. Freiburg studies in social anthropology: Band 34. Wien: LIT, 2012.
[28] C. Dirican, "The impacts of robotics, artificial intelligence on business and economics," Procedia - Social and Behavioral Sciences, vol. 195, pp. 564–573, Jul. 2015.
[29] J. van Dijck, "Facebook and the engineering of connectivity," Convergence: The International Journal of Research into New Media Technologies, vol. 19, no. 2, pp. 141–155, Sep. 2012.
[30] A. Kammer, "The mediatization of journalism," MedieKultur: Journal of media and communication research, vol. 29, no. 54, p. 18, Jun. 2013.
[31] M. Coeckelbergh, "Technology and the good society: A polemical essay on social ontology, political principles, and responsibility for technology," Technology in Society, vol. 52, pp. 4–9, Feb. 2018.
[32] D. J. Olsher, "New artificial intelligence tools for deep conflict resolution and humanitarian response," Procedia Engineering, vol. 107, pp. 282–292, 2015.
[33] C. Holder, V. Khurana, F. Harrison, and L. Jacobs, "Robotics and law: Key legal and regulatory implications of the robotics age (part I of II)," Computer Law & Security Review, vol. 32, no. 3, pp. 383–402, Jun. 2016.
[34] C. Holder, V. Khurana, J. Hook, G. Bacon, and R. Day, "Robotics and law: Key legal and regulatory implications of the robotics age (part II of II)," Computer Law & Security Review, vol. 32, no. 4, pp. 557–576, Aug. 2016.
[35] R. Alharthi, B. Guthier, and A. E. Saddik, "Recognizing human needs during critical events using machine learning powered psychology-based framework," IEEE Access, vol. 6, pp. 58737–58753, 2018.
[36] P. Ducange, R. Pecori, and P. Mezzina, "A glimpse on big data analytics in the framework of marketing strategies," Soft Computing, vol. 22, no. 1, pp. 325–342, Mar. 2017.
[37] L. Degerstedt and S. Pelle, "More media, more people – on social & multimodal media intelligence," Human IT, vol. 13, no. 3, pp. 54–84, 2017.
[38] M. Schudson, "The sociology of news production," Media, Culture & Society, vol. 11, no. 3, pp. 263–282, Jul. 1989.
[39] M. J. Kushin, M. Yamamoto, and F. Dalisay, "Societal majority, Facebook, and the spiral of silence in the 2016 US presidential election," Social Media + Society, vol. 5, no. 2, p. 205630511985513, Apr. 2019.
[40] R. Kunelius and E. Reunanen, "Changing power of journalism: The two phases of mediatization," Communication Theory, vol. 26, no. 4, pp. 369–388, Jul. 2016.
[41] P. Janaszkiewicz, J. Krysińska, M. Prys, M. Kieruzel, T. Lipczyński, and P. Różewski, "Text summarization for storytelling: Formal document case," Procedia Computer Science, vol. 126, pp. 1154–1161, 2018.
[42] Y. Jia, Y. Zhang, R. J. Weiss, Q. Wang, J. Shen, F. Ren, Z. Chen, P. Nguyen, R. Pang, I. L. Moreno, and Y. Wu, "Transfer learning from speaker verification to multispeaker text-to-speech synthesis," CoRR, 2018.
[43] I. Solaiman et al., "Release strategies and the social impacts of language models," OpenAI, Tech. Rep., Nov. 2019. [Online]. Available: https://d4mucfpksywv.cloudfront.net/papers/GPT_2_Report.pdf
[44] J. Thies, M. Elgharib, A. Tewari, C. Theobalt, and M. Nießner, "Neural voice puppetry: Audio-driven facial reenactment," arXiv, 2019.
[45] D. Kim and S. Kim, "Newspaper journalists' attitudes towards robot journalism," Telematics and Informatics, vol. 35, no. 2, pp. 340–357, May 2018.
[46] S. Karnouskos, "Self-driving car acceptance and the role of ethics," IEEE Transactions on Engineering Management, vol. 67, no. 2, pp. 252–265, May 2020.
[47] T. Montal and Z. Reich, "I, robot. You, journalist. Who is the author?" Digital Journalism, vol. 5, no. 7, pp. 829–849, Aug. 2016.
[48] T. Hopp and H. Gangadharbatla, "Examination of the factors that influence the technological adoption intentions of tomorrow's new media producers: A longitudinal exploration," Computers in Human Behavior, vol. 55, pp. 1117–1124, Feb. 2016.
[49] E. Freidson, Professionalism: The third logic. University of Chicago Press, 2001.
[50] S. Hall, Representation: Cultural representations and signifying practices, ser. Culture, media, and identities. London; Thousand Oaks, Calif.: Sage in association with the Open University, 1997.
[51] C. Bartneck, "Robots in the theatre and the media," in Design & Semantics of Form & Movement (DeSForM), Wuxi, China, 2013, pp. 64–70.
[52] S. L. Epstein, "Wanted: Collaborative intelligence," Artificial Intelligence, vol. 221, pp. 36–45, Apr. 2015.
[53] K. L. Young and C. Carpenter, "Does science fiction affect political fact? Yes and no: A survey experiment on 'killer robots'," International Studies Quarterly, vol. 62, no. 3, pp. 562–576, Aug. 2018.
[54] K. F. MacDorman and H. Ishiguro, "The uncanny advantage of using androids in cognitive and social science research," Interaction Studies, vol. 7, no. 3, pp. 297–337, Nov. 2006.
[55] M. Mara and M. Appel, "Science fiction reduces the eeriness of android robots: A field experiment," Computers in Human Behavior, vol. 48, pp. 156–162, Jul. 2015.
[56] Y. Mou and K. Xu, "The media inequality: Comparing the initial human-human and human-AI social interactions," Computers in Human Behavior, vol. 72, pp. 432–440, Jul. 2017.
[57] K. Nash, J. M. Lea, T. Davies, and K. Yogeeswaran, "The bionic blues: Robot rejection lowers self-esteem," Computers in Human Behavior, vol. 78, pp. 59–63, Jan. 2018.
[58] A. Lenhart, M. Ybarra, and M. Price-Feeney, "Nonconsensual image sharing: One in 25 Americans has been a victim of 'revenge porn'," Data & Society Research Institute, and Center for Innovative Public Health Research (CiPHR), USA, Tech. Rep., 2016. [Online]. Available: https://datasociety.net/pubs/oh/Nonconsensual_Image_Sharing_2016.pdf
[59] R. Clark, "'Hope in a hashtag': The discursive activism of #WhyIStayed," Feminist Media Studies, vol. 16, no. 5, pp. 788–804, Feb. 2016.
[60] K. Busse and J. Gray, "Fan cultures and fan communities," in The Handbook of Media Audiences. Wiley-Blackwell, Apr. 2011, pp. 425–443.
[61] J. A. Brown, "#wheresRey: Feminism, protest, and merchandising sexism in Star Wars: The Force Awakens," Feminist Media Studies, vol. 18, no. 3, pp. 335–348, Apr. 2017.
[62] P. Haridakis, The International Encyclopedia of Media Studies. Blackwell Publishing Ltd, 2013, ch. Uses and Gratifications: A Social and Psychological Perspective of Media Use and Effects.
[63] L. Steiner, "Feminist media theory," in The Handbook of Media and Mass Communication Theory. John Wiley & Sons, Inc., Mar. 2014, pp. 359–379.
[64] B. Sasse, "Malicious deep fake prohibition act of 2018," Dec. 2018. [Online]. Available: https://www.congress.gov/115/bills/s3805/BILLS-115s3805is.pdf
[65] Y. D. Clarke, "H. R. 3230, To combat the spread of disinformation through restrictions on deep-fake video alteration technology," Jun. 2019. [Online]. Available: https://www.congress.gov/116/bills/hr3230/BILLS-116hr3230ih.pdf
[66] V. P. Miletskiy, D. N. Cherezov, and E. V. Strogetskaya, "Transformations of professional political communications in the digital society (by the example of the fake news communication strategy)," in 2019 Communication Strategies in Digital Society Workshop (ComSDS). IEEE, Apr. 2019.
[67] M. Ancelovici, P. Dufour, and H. Nez, Eds., Street Politics in the Age of Austerity: From the Indignados to Occupy. Amsterdam University Press, Dec. 2016.
[68] C. Vaccari and A. Chadwick, "Deepfakes and disinformation: Exploring the impact of synthetic political video on deception, uncertainty, and trust in news," Social Media + Society, vol. 6, no. 1, p. 205630512090340, Jan. 2020.
[69] Z. Tufekci, Twitter and tear gas: The power and fragility of networked protest. New Haven; London: Yale University Press, 2017.
[70] J. Habermas, S. Lennox, and F. Lennox, "The public sphere: An encyclopedia article (1964)," New German Critique, no. 3, pp. 49–55, 1974.
[71] M. Scott, Media and development. Zed Books, 2014.
[72] N. Couldry and A. Hepp, The Mediated Construction of Reality: Society, culture, mediatization. Polity Press, 2016.
[73] V. Kalmus, P. Pruulmann-Vengerfeldt, A. Siibak, and P. Runnel, "Mapping the terrain of 'Generation C': Places and practices of online content creation among Estonian teenagers," Journal of Computer-Mediated Communication, vol. 14, no. 4, pp. 1257–1282, 2009.
[74] M. Gates, "Is seeing still believing: Factors that allow humans and machines to discriminate between real and generated images," SMPTE Motion Imaging Journal, vol. 127, no. 9, pp. 70–78, Oct. 2018.
[75] A. Rossler, D. Cozzolino, L. Verdoliva, C. Riess, J. Thies, and M. Niessner, "FaceForensics++: Learning to detect manipulated facial images," in 2019 IEEE/CVF International Conference on Computer Vision (ICCV). IEEE, Oct. 2019.
[76] G. Pennycook and D. G. Rand, "Who falls for fake news? The roles of bullshit receptivity, overclaiming, familiarity, and analytic thinking," Journal of Personality, vol. 88, no. 2, pp. 185–200, Apr. 2019.
[77] K. Sharma, F. Qian, H. Jiang, N. Ruchansky, M. Zhang, and Y. Liu, "Combating fake news," ACM Transactions on Intelligent Systems and Technology, vol. 10, no. 3, pp. 1–42, Apr. 2019.
[78] L. Whitney, "How to spot fake news online," PC Magazine, pp. 155–159, 2019.
[79] J. Hyman, "Addressing fake news: Open standards & easy identification," in 2017 IEEE 8th Annual Ubiquitous Computing, Electronics and Mobile Communication Conference (UEMCON). IEEE, Oct. 2017.
[80] J. M. Pérez Tornero, S. Samy Tayie, S. Tejedor, and C. Pulido, "How to confront fake news through news literacy? State of the art," Doxa Comunicación, pp. 211–235, 2018.
[81] G. Lotero-Echeverri, L. M. Romero-Rodríguez, and M. A. Pérez-Rodríguez, "'Fact-checking' vs. 'fake news': Periodismo de confirmación como recurso de la competencia mediática contra la desinformación," Index Comunicación, vol. 8, no. 2, pp. 295–316, May 2018.
[82] S. McGrew, T. Ortega, J. Breakstone, and S. Wineburg, "The challenge that's bigger than fake news: Civic reasoning in a social media environment," American Educator, vol. 41, no. 3, pp. 4–9, 2017.
[83] S. Horn and K. Veermans, "Critical thinking efficacy and transfer skills defend against 'fake news' at an international school in Finland," Journal of Research in International Education, vol. 18, no. 1, pp. 23–41, Feb. 2019.
[84] K. A. Pantserev, "The malicious use of AI-based deepfake technology as the new threat to psychological security and political stability," in Advanced Sciences and Technologies for Security Applications. Springer International Publishing, 2020, pp. 37–55.
[85] C. H. Cho, M. L. Martens, H. Kim, and M. Rodrigue, "Astroturfing global warming: It isn't always greener on the other side of the fence," Journal of Business Ethics, vol. 104, no. 4, pp. 571–587, Jul. 2011.
[86] B. Mutsvairo, Digital activism in the social media era: Critical reflections on emerging trends in Sub-Saharan Africa. Cham, Switzerland: Palgrave Macmillan, 2016.