{"id":88762,"date":"2026-02-14T15:04:52","date_gmt":"2026-02-14T18:04:52","guid":{"rendered":"https:\/\/tech.einnews.com\/article\/892413922"},"modified":"2026-02-14T15:04:52","modified_gmt":"2026-02-14T18:04:52","slug":"politically-unstable-section-230-and-holding-big-tech-accountable","status":"publish","type":"post","link":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/2026\/02\/14\/politically-unstable-section-230-and-holding-big-tech-accountable\/","title":{"rendered":"Politically Unstable: Section 230 and holding Big Tech accountable"},"content":{"rendered":"<p>In the early days of the internet, Section 230 was passed by Congress as part of the Communications Decency Act of 1996 to help protect platforms and shield them from liability.<\/p>\n<p>With the massive growth of social media and AI, however, the landscape has changed a lot since its passage. 
Now, many are considering whether Section 230 should be reformed or even done away with entirely due to the harms it can cause, especially to our children.<\/p>\n<p>Brad Carson, a former congressman and former Under Secretary of the Army who now serves as president of the University of Tulsa, sits down with Washington Times Commentary Editor <a href=\"https:\/\/www.washingtontimes.com\/staff\/kelly-sadler\/\" rel=\"noopener\" target=\"_blank\">Kelly Sadler<\/a> on <a href=\"https:\/\/www.washingtontimes.com\/specials\/politically-unstable-podcast\/\" rel=\"noopener\" target=\"_blank\">Politically Unstable<\/a> to take a deep dive into Section 230 and what could be done to hold tech companies accountable.<\/p>\n<div>\n<p><strong>[SADLER] <\/strong>We\u2019re going to talk Section 230. It was the 30th anniversary of that law\u2019s passage this week, and we\u2019ll discuss the implications it has in basically shielding big tech from any liability for the harms it does to our youth and children. Brad, could you tell our audience what Section 230 is and what its implications are?<\/p>\n<p><strong>[CARSON] <\/strong>Back in the early days of the Internet, Congress passed the Communications Decency Act. Most of that law was actually struck down by the Supreme Court, but Section 230 survived that judicial review. And Section 230 does two things that are very important even today.<\/p>\n<p>One is, it defined the big companies like Facebook, now Google, others, as not being publishers. Publishers \u2014 if you\u2019re the Washington Times or the New York Times, and someone were to say or write something that was defamatory, you could be sued for libel, as an example. But it protected the platforms from those kinds of claims. And so the notion was that these were platforms where lots of people were just posting things. It was hopefully going to grow. 
And if you required platforms to moderate all that kind of content, then they would never actually expand. It would be very expensive and litigation could ensue. And so it immunized them from the kinds of claims that most other publishers, whether newspapers or other media outlets, were subject to.<\/p>\n<p>The second thing it did that\u2019s also important is it defined content moderation as something that you could do without being liable for some kind of discriminatory activity, or for privileging one viewpoint over another. So you could engage in good-faith content moderation without worrying in some way that you were prejudicing one particular group or the other. You could, for example, get rid of all conservative speech or all liberal speech if you wanted to. Those are the two big parts of it.<\/p>\n<p><strong>[SADLER] <\/strong>Back during COVID, when a lot of conservatives were being censored or monitored or cut out of different chats, conservatives argued for getting rid of Section 230 because of the censorship implications. But there are greater implications today when we look at our youth and social media and AI, and becoming addicted to these platforms, basically leading to suicide or depression. The ill effects of that \u2014 we have Meta that\u2019s currently going through a court proceeding with a young girl who describes how basically the algorithm sucked her in and led to suicidal thoughts. How are companies like Meta using Section 230 to shield themselves from this accountability?<\/p>\n<p><strong>[CARSON] <\/strong>So you used a key word there: algorithm. And things have changed in 30 years. You know, when Congress passed the Communications Decency Act, they were envisioning hobbyist chat rooms where people might go in, you\u2019re a ham radio operator and, you know, you\u2019re responding to one another. 
If you\u2019re old enough, you can kind of remember the mid-1990s, and they didn\u2019t want people to be liable if some person posted something that was defamatory or that otherwise got you in trouble.<\/p>\n<p>Today, though, social media works very, very differently. It\u2019s an algorithmically curated environment, and what we see is often driven by artificial intelligence now. A couple of years ago, courts held that that algorithm was actually also protected by Section 230. And so Meta will raise that in their defense.<\/p>\n<p>I think the criticism from really the right and the left \u2014 no one is satisfied with Section 230 today \u2014 is that in this algorithmically curated environment the companies should be liable, right? They\u2019re weighing in heavily. And it\u2019s no longer like the hobbyist chat board it once was, where you wanted to protect just kind of hobbyists who might post something.<\/p>\n<p><strong>[SADLER] <\/strong>Yeah, and they have an economic incentive driving that algorithm. They want to keep our youth addicted, keep them scrolling and devoting more time to their sites, because they can sell more ads. The algorithms become very valuable in being able to identify people and target them with specific products or specific content to sell to other vendors. Where do we go from here? Because there have been efforts to repeal Section 230. What can Congress do?<\/p>\n<p>As a parent of three young boys, I\u2019m concerned that big tech is getting into their minds. There are safeguards out there, but parental controls can only do so much. Ultimately, these companies should bear some responsibility for the content they\u2019re putting out there.<\/p>\n<p><strong>[CARSON] <\/strong>That\u2019s exactly right. As I mentioned, both the left and the right despise Section 230 these days. They get kind of bedeviled by trying to find what a new Section 230 could look like. 
But at a minimum, we should make sure that Section 230 is not extended into the realm of artificial intelligence. Because just like Facebook, which is engagement farming, AI companies are tuning algorithms and responses to maximize engagement. We know this from OpenAI, which has publicly reported that some models have more engagement than others, and that when they put certain safeguards on ChatGPT, it leads to less engagement.<\/p>\n<p>And so it\u2019s kind of terrifying in some ways to think about, but there\u2019s almost a dial in Sam Altman\u2019s hands that can determine how much engagement you\u2019re going to have and what the model should look like.<\/p>\n<\/div>\n<p><em>Watch the video for the full conversation.<\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>&#8230; could be done to hold tech companies accountable. 
[SADLER] We\u2019re &#8230; has on basically shielding big tech from any liability of the &#8230; , I\u2019m concerned that big tech is getting into their minds &#8230;<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"fifu_image_url":"","fifu_image_alt":"","footnotes":""},"categories":[1],"tags":[],"class_list":["post-88762","post","type-post","status-publish","format-standard","hentry","category-news","wpcat-1-id"],"_links":{"self":[{"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/posts\/88762","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/comments?post=88762"}],"version-history":[{"count":0,"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/posts\/88762\/revisions"}],"wp:attachment":[{"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/media?parent=88762"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/categories?post=88762"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/tags?post=88762"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}