Tuesday, October 1, 2024

Reflections on Deep Fakes


From watching the presentations, one topic that specifically stood out to me was deep fakes. According to Shruti Agarwal and Penny Ferret from the University of California, Berkeley, deep fakes are digital media that have been manipulated with AI to create synthesized content. During the presentation, it was also discussed that there are three types of deep fakes: Face Swap, Lip Sync, and Puppet Master. Personally, I was unaware of how widespread deep fakes had become until recently, and I was surprised to learn that this technology was used in a Star Wars film to digitally recreate Princess Leia, originally played by the late Carrie Fisher.

Another example that resonated with me was the deep fake of Joaquin Oliver, a victim of the 2018 Parkland shooting. Using AI, his parents were able to create a video that replicated his voice and likeness to promote gun safety. What stood out to me was the emotional impact. Even though it was a digital recreation, the fact that his parents brought him back to deliver a message so personal to them made it incredibly powerful. Seeing him "speak" again gave the message a weight it wouldn’t have carried coming from a typical public service announcement.

While it’s fascinating that advancements in AI are producing more realistic deep fakes and opening up new forms of creativity, they also raise serious concerns. As these AI-generated videos become more lifelike, the line between what’s real and what’s fake begins to blur, making it easier for scammers and other bad actors to exploit the technology. From further research, I found a report on Stanford’s website that warns individuals to “be wary of phone calls or videos that appear to come from trusted colleagues or senior executives when they involve unexpected demands or requests for financial transactions.” This highlights the darker side of deep fakes and their potential for fraud and manipulation.


In the presentation, the speaker mentioned how Penny Ferret found that, in 2022, there were over 1,000 LinkedIn accounts with profile pictures that were synthetically generated. These accounts would message users, trying to promote products or services, all while hiding behind AI-generated faces. This really stood out to me because I’ve seen it happen on my own LinkedIn—accounts with sketchy profiles reaching out with offers that just didn’t feel right.

It is clear that deep fake technology isn’t just about videos; it’s also being used to create fake identities on social media. The mix of AI and social engineering makes it hard to know who or what to trust online. As these fake profiles get more convincing, it becomes easier for people to get tricked into scams or misinformation.



Propaganda

Propaganda in the modern day is much more subtle and pervasive than the examples we associate with historical events like World War II. It is not only used by governments but by corporations, political parties, and media outlets, leveraging modern technology to shape public opinion.

While propaganda is typically viewed negatively, it can also be used for good when it promotes positive social change. This includes things like public health campaigns, environmental awareness, or human rights initiatives. Propaganda is effective at getting people on board with a cause and moving them quickly. An example of this would be the campaigns during the COVID-19 pandemic that encouraged safety precautions like mask use, social distancing, and vaccinations. The image below references positive propaganda related to COVID-19, drawing on the iconic WWII image of Rosie the Riveter.


However, modern propaganda can also have harmful effects. Misinformation (false or misleading information spread unintentionally) and disinformation (deliberately false information) are key tools in harmful propaganda. During the COVID-19 pandemic, for example, anti-vaccine propaganda falsely claimed vaccines were dangerous. This led to widespread vaccine hesitancy, contributing to the virus's continued spread and to preventable deaths.

At the same time, NetReputation notes that modern propaganda leverages social media algorithms to shape perceptions on a broader scale. This can bring people together around important causes like human rights, but it can also create division when misinformation spreads. The way it affects society really depends on how it’s used, highlighting just how powerful it can be in shaping our beliefs and actions.

Propaganda can affect different segments of society in distinct ways. One divide is between the rich and the poor. Wealthier individuals may be less influenced by economic scare tactics, while poorer populations might be more vulnerable to propaganda that exploits financial fears or job insecurity. Wealthier people are also more likely to be highly educated, which can make them less susceptible to buying into negative propaganda.


Older adults, especially those over 65, are more vulnerable to misinformation on social media. During the 2016 U.S. presidential election, older Twitter users encountered more political fake news than younger users. A study indexed by the National Library of Medicine found that this group not only saw more misinformation but also shared it more often. Factors such as cognitive decline, reduced media literacy, and a lack of familiarity with new digital tools all play a role in this heightened vulnerability.

Younger people, though more familiar with technology, face different challenges. They are often targeted by propaganda through memes, influencers, and viral content that appeals to emotions rather than facts, which makes it easier to influence their opinions without them realizing they're seeing biased information. Studies also show that younger people may be less likely to critically assess the credibility of sources, especially when the content is shared by friends, leaving emotionally charged posts to sway opinions without much scrutiny (American Psychological Association).

Gender-targeted propaganda can also reinforce traditional stereotypes, influencing men and women differently. For example, research highlights how marketing often taps into distinct emotional triggers for each gender. According to an article from OptiMonk, women are more likely to be influenced by emotionally driven messages and content focusing on community and relationships, whereas men tend to respond to messages emphasizing individualism and achievement. This dynamic can perpetuate outdated gender norms, with propaganda reinforcing traditional roles for women, such as homemaking, while men are targeted with messages promoting authority and success.

Propaganda also affects majority and minority groups differently, particularly in relation to sexual orientation. LGBTQ+ communities might face messages that undermine their rights or promote harmful stereotypes, while straight individuals often see narratives that reinforce traditional family roles. Majorities, such as heterosexuals or specific racial groups, are often targeted with content that reinforces their privileged status. In contrast, minorities often face propaganda that is divisive or dehumanizing, pushing them further to the margins. Digital tactics like targeted ads and influencer campaigns amplify these effects for both groups.



As a Jewish woman in the LGBTQ+ community, I have witnessed firsthand how propaganda can shape perceptions and experiences within my communities.

For instance, the phrase "From the river to the sea, Palestine will be free" has often been used in antisemitic contexts, implying the eradication of the Jewish state of Israel. While advocating for Palestinian statehood is not inherently antisemitic, calls for the destruction of Israel certainly are. This kind of phrasing can fuel hatred and division, contributing to a toxic environment for Jewish individuals. For more information click here.

Since the events of October 7th, there has been a staggering 337% increase in antisemitic incidents, fueled largely by negative propaganda and misinformation. I have personally observed a rise in hate speech online, even before Israel's military response.

Propaganda doesn’t just impact me; it also affects my family and friends, often filling us with fear and vulnerability. For my generation, the real challenge is dealing with a flood of misinformation that can widen divides and keep cycles of hate going.


Anti-War

Anti-war views have consistently been overlooked in mainstream media, both in the past and today. Back in 1919, figures like Charles Schenck, Eugene V. Debs, Jacob Frohwerk, and Jacob Abrams were jailed simply for expressing their anti-war opinions. Schenck's case even led to the establishment of the "clear and present danger" test, which defined when speech isn’t protected by the First Amendment. He was convicted for distributing pamphlets urging people to resist the draft during World War I. Debs, a well-known socialist leader, was also imprisoned for his anti-war stance, with the government claiming it posed a threat to national security. Frohwerk faced similar charges under the Espionage Act for publishing anti-war materials, reinforcing the idea that certain speech could be restricted if deemed dangerous. In Abrams v. United States, Justice Oliver Wendell Holmes famously dissented, arguing for free speech even when it challenges government policies. He introduced the concept of the "marketplace of ideas," emphasizing that the truth is best tested through open discussion.

Fast forward to today, and the U.S. government is still involved in military operations around the globe. If you explore sites like Antiwar.com and The American Conservative, you’ll find strong anti-war voices that rarely make it into mainstream news. I wonder why that is. Before today, I wasn’t even aware of these websites, which makes me question why anti-war views are often frowned upon.

Part of the reason is that anti-war sentiments often get labeled as unpatriotic or disconnected from national interests. During conflicts, strong feelings of patriotism can make any opposition to war seem like it undermines national unity. For instance, during the early days of the Iraq War, large protests against the conflict received minimal media coverage because opposing the war was perceived as going against the country’s interests. This reluctance to showcase anti-war opinions likely stems from larger media outlets wanting to avoid being seen as unpatriotic or out of touch with national priorities.

Another reason anti-war perspectives are often missing from mainstream news is political and economic pressure, which significantly shapes how war and anti-war views are covered. Many major media outlets are owned by large corporations with business interests linked to government policies or military spending, such as defense contracts or rebuilding infrastructure after conflicts, meaning they could profit financially from military actions. Critical coverage of those actions might threaten their business interests, creating a conflict of interest and a bias against anti-war views. This makes it less likely for anti-war voices to receive the attention they deserve in the media. For more information click here.






Diffusion of Innovation Theory

In 1962, E.M. Rogers developed the Diffusion of Innovation Theory, one of the oldest social science theories. It explains how, over time, an idea or product gains momentum and spreads through a specific population or social system. The theory breaks adopters into five categories: Innovators (or Pioneers), Early Adopters, Early Majority, Late Majority, and Laggards.

In this blog post, we will explore the Diffusion of Innovation Theory concerning the invention of the first iPhone.

Innovators are the people who obtain a new product at its initial launch. This group consists of 2.5% of the population; they are highly knowledgeable, typically have a high social status, and are risk-takers. Innovators are motivated by a desire to be at the forefront of innovation, often seeking novelty and uniqueness. In the iPhone's case, loyal Apple customers were informed about the launch by email before the general public. This group also plays a key role in providing early feedback and shaping the evolution of the product. For more insight into the role of innovators, check out this article on the product adoption lifecycle.

Early Adopters follow innovators but are still relatively early in the product's lifecycle, making up 13.5% of the population. They are slightly more risk-averse and have above-average social status. This group includes tech enthusiasts, professionals, and individuals interested in cutting-edge gadgets. They were drawn to the iPhone’s sleek design and groundbreaking combination of phone, internet browser, and iPod functionality. Many of these early users had higher disposable incomes, such as business executives and creatives, eager to integrate the iPhone into their professional lives. Their positive experiences and recommendations helped propel the iPhone into mainstream visibility.

The Early Majority, representing about 34% of the population, played a crucial role in the success of the first iPhone. This group typically has an above-average social status and is more cautious than innovators and early adopters. Rather than jumping on the latest tech trend, they prefer to wait until a product like the iPhone has proven itself to be stable, successful, and widely accepted. The early majority relies heavily on social proof and feedback from early adopters. They tend to purchase only after seeing positive experiences from others, ensuring the product meets their expectations for performance and reliability.

The Late Majority, also representing 34% of the population, is even more skeptical of new technologies and waits until a product is well-established and commonplace. This group is likely motivated by trends or peer pressure to purchase the new product. They are also more price-sensitive, often waiting for price drops or improvements in technology before committing to buy.

Laggards represent the final 16% of the population. These individuals tend to be traditionalists, skeptical of innovation, and stick with older products for as long as possible. They are often older and least likely to rely on social proof or influence from early adopters. Laggards typically adopt new technology out of necessity or a lack of alternatives.
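To make the category sizes concrete, here is a minimal Python sketch of my own (not from Rogers or the presentation) that applies the percentages above to a hypothetical pool of 10,000 potential iPhone buyers; the market size is purely illustrative.

```python
# Illustrative sketch only: Rogers' adopter-category shares applied to a
# hypothetical population of potential iPhone buyers (numbers are made up).
ADOPTER_SHARES = {
    "Innovators": 0.025,
    "Early Adopters": 0.135,
    "Early Majority": 0.34,
    "Late Majority": 0.34,
    "Laggards": 0.16,
}

def adopters_by_category(population: int) -> dict[str, int]:
    """Split a population across the five adopter categories."""
    return {name: round(population * share) for name, share in ADOPTER_SHARES.items()}

if __name__ == "__main__":
    hypothetical_market = 10_000  # assumed market size, not real sales data
    for name, count in adopters_by_category(hypothetical_market).items():
        print(f"{name:>15}: {count:>5}")
```

Under these assumptions, the 10,000 hypothetical buyers split into 250 innovators, 1,350 early adopters, 3,400 in each of the early and late majorities, and 1,600 laggards.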

While not part of Rogers' original theory, Non-Adopters are individuals who do not engage with new technologies at all, often due to personal beliefs, financial constraints, or a general preference for traditional methods. For example, some people may resist adopting smartphones, including the iPhone, due to concerns about privacy or a belief that their current devices still meet their needs. This group can impact the overall market by creating a segment of consumers who remain loyal to older technologies, which can influence how companies market new innovations.

In the end, while the iPhone changed how we communicate, it also brought along new issues like reliance on technology, privacy worries, and screen addiction. By understanding the Diffusion of Innovation Theory, we can better navigate these challenges and consider the trade-offs between the benefits and drawbacks of such innovations.

Used or Abused? Navigating My Relationship with Technology

My connection with technology is pretty nuanced, much like many people's experience. One thing I really appreciate is how it helps me st...