• The Fee Is Free Unless You Win®.
  • America's Largest Injury Law Firm™
  • Protecting Families Since 1988
  • $20 Billion+ Won
  • 1,000+ Lawyers Nationwide

Free Case Evaluation

Tell us about your situation so we can get started fighting for you. We tailor each case to meet our clients' needs.
Results may vary depending on your particular facts and legal circumstances. ©2024 Morgan & Morgan, P.A. All rights reserved.
Our results speak for themselves

The attorneys featured above are licensed in Florida. For a full list of attorneys in your state, please visit our attorney page.

How Social Media Is Harmful

Social media has changed our society in many ways, from allowing us to stay updated in real-time with friends and family to putting us in touch with ideologies and information that we might never have come across otherwise.

It can even help people find support in communities when they are struggling with issues they wouldn't feel comfortable discussing with their closest friends in real life. Still, social media can be and is used to harm consumers and amplify damaging content, from promoting fabricated, unattainable standards of attractiveness and out-of-reach lifestyles to turning vulnerable youths into extremists.

However, when it comes to the law, social media platforms enjoy exceptional immunity from liability, even when published studies and leaked internal documents show that these companies know how their products harm users and sometimes act with reckless disregard. That may change at any time, though. To understand why, we have to look at the cases currently before the Supreme Court, as well as the laws on the books that dictate how courts handle social media lawsuits.

The lawyers at Morgan & Morgan understand that when you or someone you love is hurt by social media, you want answers and accountability. We believe that when social media platforms use algorithms and designs constructed to exploit and manipulate, they should bear responsibility when that design inevitably leads to harm.

If you’ve been affected by the dangers of social media, contact us today to learn more about your ability to take action.


How it works

It's easy to get started.
The Fee Is Free™. Only pay if we win.

Results may vary depending on your particular facts and legal circumstances.

  • Step 1

    Submit your claim

    With a free case evaluation, submitting your case is easy with Morgan & Morgan.

  • Step 2

    We take action

    Our dedicated team gets to work investigating your claim.

  • Step 3

    We fight
    for you

    If we take on the case, our team fights to get you the results you deserve.

Client success stories that inspire and drive change

Explore over 55,000 5-star reviews and 800 client testimonials to discover why people trust Morgan & Morgan.

Results may vary depending on your particular facts and legal circumstances. Based on select nationwide reviews.

[Client testimonial videos]


Get answers to commonly asked questions about our legal services and learn how we may assist you with your case.

  • What Is Section 230 of the Communications Decency Act?

    To understand why it's so hard to hold social media platforms responsible for hosting, and even feeding, harmful content to consumers, you first have to look at Section 230 of the Communications Decency Act of 1996. When it passed nearly three decades ago, its purpose was to protect website operators from being sued when users posted material that others might consider objectionable.

    For example, if a news site that allowed comments on its content ran an article about a local businessperson, and one commenter wrote, "that guy is a pedophile," the businessperson wouldn't be able to hold the news site liable because it didn't create the content. However, the businessperson might be able to sue the person who wrote the comment for defamation, that is, of course, if the statement was untrue.

    The second part of Section 230 allows website platforms to remove content they consider "obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable," regardless of constitutional protections. Conversely, if a platform chooses to leave such content up, it still cannot be held liable for any harm the content causes.

    When the Communications Decency Act passed, the world was a different place. In 1996, only about 9.5 million Americans used the Internet; today, the number is over 300 million, which is just about all of us. Still, while our focus here is how social media is harmful and how to hold social media companies accountable for damage to families and individuals, what is being argued at the Supreme Court will likely determine the path forward.

    Likewise, politicians will have a say in the future of Internet platforms and how much accountability will lie at the feet of giants like Google, Meta, TikTok, Twitter, and others. According to a former Federal Communications Commission official, lawmakers have three options: leave Section 230 as it is, repeal it, or replace it. Section 230 has come under fire from both Democrats and Republicans in recent years: Democrats believe it has allowed hate speech and misinformation to flourish, while Republicans object to the alleged suppression of conservative content.

  • What Is the U.S. Justice System Considering Regarding Free Speech and the Internet in 2023?

    This year is shaping up to be pivotal for free expression on the Internet. The courts are expected to hear cases that will help establish new boundaries for how Internet platforms conduct business. Questions under consideration include how much responsibility platforms will have to police content that incites terrorist acts and to ensure their own algorithms aren't promoting it, whether the government should be able to intervene and lay down safety guidelines, and whether social media sites can remove content based on viewpoint.

    The potential ramifications of changes to our current system are not simple or clear-cut. Anytime you tweak an established system of operation, there will be winners and losers. Still, there are compelling arguments on both sides that must be considered. Here is what the U.S. Justice System is evaluating on Section 230 matters this year:

    Gonzalez v. Google - The Supreme Court has elected to hear this case, which has the potential to alter how Internet platform business models work. Currently, there is minimal interference with what kind of content can be published because of the protections provided under Section 230. This case was brought by the family of an American citizen who was murdered during the 2015 terrorist attacks in Paris. It seeks to establish whether Section 230 shields Google from liability under the Anti-Terrorism Act (ATA). The family alleges Google aided and abetted the terrorist organization ISIS by promoting its videos through algorithms that recommend content to users.

    If you've ever spent time on YouTube, you may have noticed that once you watch a video on a certain topic, similar videos will appear that follow the theme of what you watched. If the court sides with the plaintiffs, it could increase platforms' liability for using recommendation algorithms, and companies like Google would have to rethink the risks.

    Twitter v. Taamneh - Another Supreme Court case isn't directly related to Section 230 but may still affect how platforms decide to police content. Also brought under the ATA, it concerns whether Twitter, since it already moderates content published on its site, should have been more assertive in monitoring for content that advocated terrorism. Depending on the outcome, the legal question for Internet platforms may become whether they should moderate at all, because acknowledging that the content exists could be used against them in later court cases.

    Florida and Texas social media laws - The Supreme Court has not yet decided whether to hear this case. It concerns two tech industry groups that sued the states of Florida and Texas over laws that would interfere with their member platforms' policies and terms of service. The laws would prevent online platforms from deciding what kind of content to allow on their services. NetChoice and the Computer & Communications Industry Association argue that the laws would violate platforms' First Amendment rights.

    NetChoice challenge to California law - In a separate lawsuit currently at the district court level, NetChoice is challenging a new California law intended to make the Internet safer for children. The Age-Appropriate Design Code Act requires Internet platforms likely to be visited by children to minimize risks to those users. NetChoice argues that the law would restrict free speech guarded by the Constitution, that its language is vague and subject to the attorney general's interpretation of what is acceptable, and that it would restrict access to essential resources for vulnerable children already facing discrimination.

    While it's understandable to want someone held accountable for the harm caused by dangerous content, it will be very difficult for the U.S. Justice System to carve out solutions that protect people from unsafe content without interfering with constitutional rights. One area where liability could attach is algorithms that push dangerous content to users without warning or permission.

  • Why Are People Bringing Lawsuits Against Social Media Platforms?

    Various lawsuits allege that social media platforms like Instagram, YouTube, and TikTok purposefully cause and contribute to mental health issues affecting vulnerable youth. These issues include eating disorders, low self-esteem, suicidal thoughts, anxiety, depression, and body dysmorphia. Some of the lawsuits suggest that companies like Meta, which owns Facebook and Instagram, created unreasonably dangerous products that result in addiction.

    It doesn't help that internal memos, presentations, and documents show these companies were aware of the addictive nature of their products, specifically designed them to be so, and created dangerous algorithms that could feed harmful content to their users.

    Meta invested millions in designing its platforms to appeal to youth in particular. Its algorithms are built to amplify content that gets the most likes, shares, and comments, which can be detrimental because this content often isn't based on reality. For example, the top posts on Instagram are usually heavily filtered images of people, creating impossible standards for physical beauty. Likewise, since these algorithms award visibility based on engagement, users are likely to see posts featuring divisive content and dangerous stunts, which can lead others to imitate them for similar results.

    Other lawsuits against Meta allege the platform encourages underage children to open multiple secret accounts without parental consent. Based on the gender identity a user picks when opening an account, they might get suggestions to follow accounts that can be harmful, especially for young girls who are acutely concerned with body image. Meta's own studies showed that its product harmed one in three girls who used it, yet the company took no action. Instead, it buried the research and left harmful content on the site, making it easy for people with body image issues to find more and more dangerous material.

    While social media can be particularly bad for children, it can have adverse effects on people of any age. Our brains naturally compare, and when we're constantly bombarded with the highlights of other people's lives, it's easy to feel down when our own lives don't measure up. We generally see only the "best of" people's lives because everyone wants to put their best foot forward on social media.

    Likewise, since social media is designed for endless scrolling, you may end up watching how wonderfully everyone else in the world is doing and forget to lead your own life. The addictive pull of social media is not an accident: social media companies have purposefully and artfully fine-tuned their platforms to keep users on their sites with exploitative features and algorithms, with no heed to the consequences for society.

  • Have You or a Loved One Been Harmed by Social Media?

    If you're researching how social media is harmful, you've likely had an experience that drove you to seek legal guidance. Morgan & Morgan is here to help. While the law is still evolving, legal options may be available to you now, and we'd like to hear about your concerns. We believe in protecting First Amendment rights, but we also believe in protecting the rights of innocent victims who may have been exploited by social media. It's not right for billion-dollar companies to face no accountability, especially when they create products that are harmful by design and use algorithms to amplify, glorify, and perpetuate destructive content.

    We're not afraid to take on challenging issues with giant corporations. Morgan & Morgan has been fighting to recover compensation for injured parties for more than 35 years and, in doing so, has successfully won against several mega-corporations.

    Contact us today for a free case evaluation.
