OPINION

Should We Regulate Social Media?


By Michael Laitman | Friday, 14 October 2022 10:59 AM EDT

After years of avoiding this sensitive issue, the U.S. Supreme Court finally agreed this month “to decide whether social media platforms may be sued despite a law that shields the companies from legal responsibility for what users post on their sites,” writes The New York Times.

“The case, brought by the family of a woman killed in a terrorist attack, argues that YouTube’s algorithm recommended videos inciting violence. … Section 230 of the Communications Decency Act, a 1996 law intended to nurture … the internet. … The law said that online companies are not liable for transmitting materials supplied by others. Section 230 also helped enable the rise of huge social networks like Facebook and Twitter by ensuring that the sites did not assume new legal liability with every new tweet, status update and comment.”

However, the freedom from liability seems to have been abused.

“A growing group of bipartisan lawmakers, academics and activists have grown skeptical of Section 230,” continues the story, “and say that it has shielded giant tech companies from consequences for disinformation, discrimination and violent content that flows across their platforms.”

The plaintiffs, for their part, argue that "the platforms forfeit their protections when their algorithms recommend content, target ads or introduce new connections to their users."

This may sound like a legal battle over power and control, but Section 230 can cost lives.

“In one case,” continues the newspaper story, “the family of an American killed in a terrorist attack sued Facebook, claiming that its algorithm had bolstered the reach of content produced by Hamas.” The lawsuit was rejected but one judge said “Facebook’s algorithmic suggestions should not be protected by Section 230.”

The freedom of the internet is itself part of the problem. Because human nature pushes us to exploit whatever we can for our own gain, when tech giants can profit by promoting certain content on their platforms, no morals will restrain them.

As a result, platforms have promoted videos of ISIS beheadings and other gruesome acts of terror to people their algorithms identified as potential sympathizers. The lawsuit claims that promoting such content not only boosts revenue for the tech giants, but also encourages potential terrorists to act.

There is certainly a need to restrain the circulation of violent videos and of content that incites violence. Moreover, one of the arguments against social media companies is that when they target specific content to specific people, they are no longer the uninvolved "billboards" they claim to be, but active players in shaping the minds of those who use their platforms.

On the one hand, it is impossible to return to the days when there was no targeting. On the other hand, who will decide to what extent to target and by what criteria?

After all, we are all subject to the same weaknesses that tempt social media giants to misuse their platforms. How, then, can we guarantee that whoever is in charge of monitoring content will not fall prey to the same errors as the owners of social media platforms?

The only solution I can see is to launch a comprehensive, thorough, and long-term educational process that will make us aware of our interconnectedness. Only if we realize, at the deepest level of our being, that when we hurt others, we hurt ourselves, will we stop exploiting, oppressing, bullying, and otherwise harming one another.

At the moment, we are nowhere near the understanding that we need this process. We are persistently and insistently pushing ourselves down a tunnel that will end in a nuclear world war.

If we launch this educational process in time, we will reverse the trend that we are on. If we do not, we will inflict inconceivable suffering on each other until we realize that we are dependent on each other.

Michael Laitman is a global thinker living in Israel with a Ph.D. in Philosophy and Kabbalah and an MS in Medical Bio-Cybernetics. He has published more than 40 books. Laitman believes that only through unity and connection can we solve our problems, creating a better world for our children. Visit www.MichaelLaitman.com for more info.

© 2025 Newsmax. All rights reserved.

