What the dead internet theory means for Gen Z

It’s impossible to browse the internet in any capacity these days without encountering a bot, but in recent months (especially on X) there has been a noticeable wave of bot-generated posts: they link to articles that don’t exist, reference videos that were clearly fabricated or use artificially generated images of things that never happened, and yet they go viral. Examine the replies and nearly all of them repeat the same two or three general ideas, sometimes using the same words and phrases (or the exact same response, verbatim). You may also notice floods of automated replies from accounts advertising adult content, under viral and non-viral posts alike, most of them obvious scams. Bots clutter the reply sections of sites like Reddit and YouTube, while AI-generated images clutter Facebook. How do we explain this influx of bot-generated replies and, more novel, bot-generated viral content? And what does it mean for the user experience on social media?

In recent years, “dead internet theory” has circulated on internet forums, pointing to a sudden increase in bot activity across the web. The theory holds that social media is now sustained primarily by interactions between algorithmically driven bot accounts. Any mention of the theory usually comes with a half-ironic acknowledgment of its absurd scale, but given the current apparent scale of bot activity, the broader idea that bots produce and sustain viral content for general consumption is less far-fetched, and less easily dismissed. While browsing social media, we are inundated with posts whose language appears algorithmically optimized for profit, contained within the norms of each site’s discourse: engagement farming through artificial virality, inflammatory content, false emergencies and anything else that can turn views and advertising into revenue.

The emergence of technologies like ChatGPT further muddies this already distorted picture. LLMs, or “large language models,” are essentially generators that predict the next token and can be trained on human writing styles; paired with image generators like Bing AI, they allow anyone to create bots. These bots can be trained on viral “styles,” predicting which phrases, and even how many words, are most likely to go viral. Images created to grab attention, from Pentagon explosions to imaginary fashion, often posted without any disclosure of their AI generation, spread like wildfire. And the technology is only getting better and more seamless; while AI-generated video clips and images drew derision a year ago, they now reliably fool even the younger generations who grew up on the internet. From dramatic, provocative events to everyday life, being surrounded by layers of unreality (images, replies, posters, events) can create a sense of displacement, of having nowhere real to return to, a feeling only heightened by the internet’s immaterial nature.

It’s also worth noting that this generated content, textual, visual or whatever form it takes, relies on the algorithms of the various platforms. Generated content that goes viral on X looks different from what goes viral on Facebook, and even within a single platform, content targeting different audiences can vary significantly. Concern about the positive feedback loops these algorithms create is best known from the oft-discussed “alt-right pipeline,” the phenomenon in which a social media site’s algorithm exposes users to ever more extreme (and therefore compelling) versions of the kind of content, in that case ideologically loaded, that they have consumed before.

Harnessing algorithmic power means generating viral content that responds to the anxieties or desires of the wider culture while also accounting for the niche each bot aims to serve, leveraging its perception of its audience. Without any journalistic integrity in the way, whatever generates traffic, attention and revenue from views and advertising gets prioritized. Return to the fake images of the Pentagon explosion: the choice of subject was not random. A visually striking threat with immediate consequences for the highest levels of government is more attention-grabbing than a similarly disruptive incident; contrast the Pentagon explosion with a massive oil spill in a freshwater body such as the Great Lakes. Even without faking a disaster of that magnitude, generated posts are typically designed to rack up a burst of engagement in a short window of time; it doesn’t matter if the message is quickly debunked, so long as it reaches as many people as possible. The overabundance of these images breeds fatigue: next to such heavy fabrication, other generated posts seem less offensive by comparison. If it doesn’t rise to the level of an immediate threat, it isn’t that interesting. In engagement bait, the subterfuge instead becomes something to expect.

The internet and social media, like language itself, were developed to bring people together and make the world a smaller place. But generating advertising revenue through gamified engagement creates a haven for bot-created images, videos and short posts whose origins are ever harder to see. Browsing the internet knowing that much of what you see is probably fake can obscure the very connections social media was meant to foster. The core of dead internet theory, a semi-ironic observation only a few years ago, is borne out in the endless churn of bot-on-bot interactions. What was meant to be a conduit for information and interaction instead reflects viral chaos, profit motives, existing anxieties and desires, and engagement-seeking dynamos. It’s hollow, disingenuous and disorienting, and it has shaped the way an entire generation interacts with media and the internet: either with ignorance and confusion, or with suspicion and indifference, as the vast, fast-moving river grows murky and it becomes all too easy to slip entirely beneath its surface.

Daily Arts Writer Nat Johnson can be reached at nataljo@umich.edu.
