How Social Media Owns Your Present and Future Thoughts

By Betuel Gag

A Real-Life Social Dilemma

We live in a world where fake news spreads six times faster than the truth. Advertisers are the customers and the users are the product being sold. And it’s “free.” You’re being sold every single day.

Let’s see the full picture.

Firstly, there are the so-called massive-scale emotional contagion experiments. In other words, emotional states are transferred between users via emotional contagion, leading people to experience the same emotions without being aware of it. Gregory S. McNeal, an entrepreneur and tech-and-policy expert, has described the scheme in which Facebook manipulated users’ news feeds to provoke emotional responses. Experiments like this were run well before Facebook added an official clause to its policies.

Secondly, notification design is probably the least obvious element to social media users, but in reality it is one of the most powerful pieces of user-oriented, tech-designed machinery: algorithms that predict our behaviour and emotions so that we stay hooked on small bits of information 24/7.

This process is like a digital pacifier with on-screen notifications.

Tristan Harris worked at Google for years, researching the effects of tech design and what makes us dependent on beeps. He said, “If you see a notification, it schedules you to have thoughts that maybe you didn't intend to have. If you swipe over that notification, it schedules you into spending a little bit of time getting sucked into something that maybe you didn't intend to get sucked into.”

Lastly, polarization shows us a version of our “ideal world” rather than the real world: a false sense that everyone agrees with you. Polarization takes place everywhere on social media because your feed’s algorithm shows you what you already want to see, reinforcing “yes, I agree” reactions and “everybody thinks like me” posts.

As a result, we are witnessing an era of persuasion-and-manipulation technology: your brain versus 1000+ engineers and a supercomputer.

Good luck.

Social media algorithms are really a set of opinions embedded in computer code for a company’s profit.

In other words, social media algorithms use your content to manipulate other people’s behaviour and destabilize the fabric of society. It’s not your fault, but be aware that it is happening.

Did you know that whales are now more financially valuable dead than alive?

We are becoming like whales. We become more valuable online than offline.

Think of this example.

Money is a tool that can be used with both good and bad intentions, but social media is a tool that demands interaction. It demands attention as its currency and, therefore, embodies an internal war between your true values and your sense of popularity.

Is this a problem that social media companies will solve for you?

Not really. 

Let me tell you more.


Content You May Like

In March, I was looking for a job in London and got invited to an interview at Taboola, which right now has a near-monopoly worldwide on “content you may like” recommendations across the web, with offices in Tel Aviv and London and headquarters in New York. I passed all my interviews and had a final meeting with one of the directors. After diving deeper into the subject, I asked him, “How does your algorithm know what to recommend to a user you know nothing about? You’re not Facebook, right?”

He said, “We don’t have to know everything about the user. The algorithm will analyse the user’s behaviour and the user’s intent. So, based on that, we get accurate recommendations for each user on the world wide web. If, for example, you type in ‘coronavirus,’ you will see here every source on the internet where this word was mentioned. From that point on, the algorithm will do its work.”
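The director’s coronavirus example describes simple intent matching: index the web’s content by word, then recommend whatever mentions the user’s current query. Here is a minimal sketch of that idea; all function and variable names are illustrative, not Taboola’s actual system:

```python
# Minimal sketch of intent-based content matching: no user profile,
# just the current query matched against an index of content.

def build_index(articles):
    """Map each lowercase word to the set of articles that mention it."""
    index = {}
    for article_id, text in articles.items():
        for word in set(text.lower().split()):
            index.setdefault(word, set()).add(article_id)
    return index

def recommend(index, query):
    """Return every indexed article that mentions the query term."""
    return sorted(index.get(query.lower(), set()))

articles = {
    "a1": "Coronavirus updates from health officials",
    "a2": "Ten travel destinations for summer",
    "a3": "How the coronavirus changed remote work",
}
index = build_index(articles)
print(recommend(index, "coronavirus"))  # ['a1', 'a3']
```

From that keyword match onward, the real algorithm refines its picks by watching which recommendations each user actually clicks.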

What is Marketing Modelling?

I think everyone will eventually start doing marketing modelling, whether their starting point is “content you may like,” social media, SEO, web design, or any other form of digital marketing. These digital cash machines require lots of data from social media companies. Here is the short version.

You need highly accurate certainty about everything around, under, and above you.

Great certainty leads to great predictions. Great predictions lead to lots of data, a googol amount of data. Lots of data leads to models that can predict behaviour. Behaviour prediction leads to building the habits a third party desires. Habits influence emotions and naturally capture attention.
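The “data leads to models that predict behaviour” step can be shown with a toy example: from a log of (hour, action) events, predict what a user is most likely to do at a given hour. Real marketing models are vastly larger, but the principle, past behaviour predicting future behaviour, is the same; the names below are hypothetical:

```python
# Toy behaviour-prediction model: count past actions per hour of day,
# then predict the most frequent action for a given hour.
from collections import Counter, defaultdict

def fit(events):
    """Count how often each action occurred at each hour."""
    model = defaultdict(Counter)
    for hour, action in events:
        model[hour][action] += 1
    return model

def predict(model, hour):
    """Return the most frequent past action at that hour, or None."""
    counts = model.get(hour)
    return counts.most_common(1)[0][0] if counts else None

events = [(8, "news"), (8, "news"), (8, "email"),
          (22, "video"), (22, "video")]
model = fit(events)
print(predict(model, 8))   # 'news'
print(predict(model, 22))  # 'video'
```

Once a model like this is confident about what you will do at 10 p.m., a platform can schedule notifications and content for exactly that moment, which is how prediction turns into habit-building.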

Attention is time.

My Chinese Confession 

During my time in China, I studied how ads work there, and it’s natural to ask this question: Is China pumping fake engagement into its platforms, like TikTok?

Dennis Yu, a world-renowned digital marketing entrepreneur, recently said the following in his interviews:

“Where do Google and Facebook invest most of their money? It’s in advertising. Why? Because that’s where the analytics are, where the algorithm is actually located.

There were 14 great powers in history, starting with the Egyptians, Greeks, Romans and so on. And there were 14 transitions. Out of those 14, 12 happened through war.

Now the question is, ‘Is this social media manipulation the digital war that transitions the power from the USA to China?’ Because Chinese officials do the same thing in China. What if this is the war where the super-powers are fighting with people as products in the digital space?

And ultimately, what is the outcome of this in the following decade?”

Now, for me, everything reduces to simple concepts such as Cialdini’s principles of influence.

Persuasion versus manipulation.

Scarcity and reciprocity.

This set of psychological elements is embedded in tricks like those of street magicians.

We should have realized by now that these principles are also part of the Facebook algorithm.


Question.

What is the difference between persuasion and manipulation to you?

For me, persuasion means convincing someone to change their emotions and behaviour while they are aware that they are making the decision themselves, whereas manipulation means changing someone’s emotions and behaviour without them being aware of that fact.

Conclusion

The goals of Facebook and the other social media tycoons have changed. With user-tailored marketing models and emotional/behavioural moulding machines, it is no longer relevant for social platforms to show the whole picture of engagement; the aim is to create the perfect digital utopia for each user on the planet.

This digital utopia, though, may create a real-life dystopia…

They follow this course because more third-party control over users means more power for the third-party data collector. Their algorithm was originally designed to create digital connection between users, but like any other algorithm on the planet, it is not trained to understand that when it creates digital connection, it also creates real disconnection. It’s a computing problem that Facebook and the other social media platforms don’t address, because it is their global power machine.

As a result, we are experiencing our own digital utopia which gives birth to our own real-life dystopia.

This is a problem that should be addressed by the public, through the public, and for the public.

When Cambridge Analytica’s illegal data collection was exposed by the public in 2018, things improved for a short period of time. Now we should look at what we, as the public, can do to build a safer digital future for the next generation.
