News Peppermint is a foreign-media curation outlet that carefully selects and delivers "news that is not in Korea, but that Koreans need." News Peppermint

translates New York Times columns and provides detailed commentary on their background and context. Drawing on my experience eagerly reading, unraveling, and delivering events, news, and discussions from the United States and beyond Korea, I will write diligently so that even stories from distant places are easy and fun to read. (By Song, Editor-in-Chief of News Peppermint)

Gonzalez v. Google.

On Feb. 21, 2023, the U.S. Supreme Court heard oral arguments in a high-profile case. The plaintiffs are the bereaved family of Nohemi Gonzalez, a 23-year-old victim killed in the 2015 coordinated terrorist attacks in Paris, France, including the one at the Bataclan theater. The defendant is Google, which we all know well; to be precise, Google is named in the lawsuit as the parent company of YouTube.

At the time, the Islamic extremist terrorist group ISIS claimed responsibility for the attacks, and the Gonzalez family sued Google, arguing that YouTube bore responsibility. In their view, YouTube failed to filter ISIS-related videos and other content inciting extremist terrorism, and instead allowed them to spread across the Internet, thereby contributing to the cultivation of terrorists. While there is no direct testimony that the attackers decided to take part after watching YouTube videos, the plaintiffs argue that, given the circumstances, the many such videos circulating on YouTube clearly contributed to the spread of extremist ideas.

The defendant, Google, countered that Internet companies are not responsible for content such as posts, comments, or videos that users upload to their websites or platforms. That is precisely what Section 230 of the Communications Decency Act, enacted by Congress in 1996, provides. The issue has long been discussed not only in this case but in connection with the responsibility of Internet platforms and Big Tech companies generally. Some say the future of the Internet depends on the outcome of this debate.

Today, I first translated a column by investigative journalist Julia Angwin arguing for major reform, or even repeal, of Section 230 of the Communications Decency Act.

▶Read the New York Times column: The law should not be used as a one-size-fits-all shield for Big Tech companies

Is the Internet platform a bookstore? Is it the press?

New York Times Supreme Court reporter Adam Liptak, who covered the oral arguments himself, gave a vivid account of the scene on the New York Times podcast The Daily. To sum up Liptak's explanation in one line, this is the question that strikes at the core:

Is YouTube a bookstore, or is it a newspaper?

What is a bookstore and what is a newspaper?

A bookstore carries a huge selection of books. No matter how much a bookstore owner loves books, there is probably no owner in the world who reads every book in the store and stocks only the ones they like. So even if a bookstore sells a book with potentially problematic material, the bookstore is not responsible for an incident or accident caused by that book. The key text of Section 230 of the Communications Decency Act is a short sentence of 26 words:

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

In line with today's online environment, which is very different from 1996, this article can be paraphrased as follows:

Companies that provide Internet platform services shall not be treated as publishers or speakers of any information posted by other content providers on the website or platform.

Internet companies that use this provision as a shield compare platforms like YouTube and social media to bookstores. Whether content on a platform promotes extremism or contains hate speech, the fault lies with the person who posted it, and the platform should not be punished for it. If bookstore owners were punished for the content of the books they sell, the bookstore business would shrink dramatically. Likewise, tech companies warn, repealing Section 230 would greatly diminish the Internet itself and would violate freedom of expression, one of American society's most sacred rights.

The plaintiffs' lawyers see it differently. YouTube, they argue, is closer to a news organization than to a bookstore. Media organizations are responsible for the articles, videos, and content they produce. To some extent, this applies even to columns written by outside contributors. A newspaper usually prints a disclaimer such as "the views of this column do not reflect the editorial position of this paper," but when a piece with absurd content is published, the paper, not just the author, is criticized: "How did this get past the editors?"

The plaintiffs compared YouTube to a news organization because of its algorithms. With millions of videos uploaded every day, it makes no sense to hold YouTube accountable for every single one of them. However, the plaintiffs note that YouTube uses algorithms to curate and display videos that users may be interested in. The videos surfaced by the recommendation algorithm are effectively hand-picked by YouTube, and this algorithm is essentially the same as the editorial function by which a media company takes responsibility for its articles and columns and decides the front page of a newspaper or the top story of a broadcast. Therefore, the plaintiffs argue, the content on YouTube is not only the users' speech but also YouTube's own speech, selected, filtered, and edited by YouTube. If so, it is only natural to hold YouTube accountable for that content as well.
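To make the plaintiffs' point concrete, here is a minimal, purely illustrative sketch of what a recommendation step looks like. Everything in it (the function name, the tag-overlap scoring, the data) is hypothetical and has nothing to do with YouTube's actual system; it only shows that ranking and truncating a list of videos is a selection decision made by the platform, which is the step the plaintiffs liken to an editor choosing the front page.

```python
# Illustrative sketch only: a toy "recommendation" step. All names,
# scoring rules, and data here are hypothetical, not YouTube's system.

def recommend(videos, user_interests, top_n=3):
    """Rank videos by overlap between their tags and the user's interests."""
    def score(video):
        return len(set(video["tags"]) & set(user_interests))
    # Sorting and truncating is a choice made by the platform, not the user.
    # This selection step is what the plaintiffs liken to editorial judgment.
    return sorted(videos, key=score, reverse=True)[:top_n]

videos = [
    {"title": "A", "tags": ["cooking", "travel"]},
    {"title": "B", "tags": ["politics", "news"]},
    {"title": "C", "tags": ["news", "travel"]},
]

# The user never asked for this ordering; the platform produced it.
print(recommend(videos, ["news", "politics"], top_n=2))
```

The disputed legal question is, in effect, whether that final `sorted(...)[:top_n]` line is a neutral mechanical act (the bookstore shelf) or an act of editorial speech (the newspaper front page).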

Plaintiffs' attorney Eric Schnapper accepts the comparison of Internet platforms to bookstores, but says that YouTube's recommended videos are not just books on the shelves; they are more like the books a bookstore has singled out as its "Book of the Month."

"We're not internet experts, are we?"

The Communications Decency Act was enacted in 1996, one year before the era of the drama "Reply 1997." Given the incredible pace of technological progress, can a nearly 30-year-old law properly regulate today's Internet environment? It is also true that Big Tech platforms now serve users customized content, filtered and selected by algorithms running on enormous volumes of user data. The 1996 law took neither big data nor algorithms into account. In the meantime, Internet companies have kept stretching the liability shield in Section 230. Avoiding legal liability fueled their remarkable growth into "Big Tech."

The way to get what you want at the Supreme Court is simple, but very difficult: you just need the support of five of the nine justices. Lawyers on both sides do everything they can to persuade the justices, but because each justice weighs different considerations and understands the case differently, an argument that satisfies everyone is seldom found. That was the case this time.

Justice Clarence Thomas asked whether YouTube's video recommendation algorithm is ultimately just a value-neutral, mechanical automated system, and whether it can really be seen as active conduct by YouTube. Justice Thomas, one of the most conservative of the nine justices, values freedom of expression and is likely to vote to uphold Section 230 of the Communications Decency Act. But in a question-and-answer exchange with liberal Justice Elena Kagan, Schnapper stumbled somewhat.

Justice Kagan gave Schnapper a chance to offer solid evidence for the argument that the Section 230 liability shield should not apply without limit. Schnapper replied that if not only YouTube videos but also tweets on Twitter and even Google search results are found to have contributed to something as horrific as terrorism, then the platform that applied the algorithm should be held accountable. An uncompromising stance may win applause from like-minded people, but it is not a good posture when trying to persuade a swing justice. Justice Kagan seemed a little perplexed and frustrated by the answer.

"The defense's response now sounds a bit extreme. I can't agree to that extent. This is the courthouse. In fact, our judges don't know much about what we're talking about. The nine of us are not here now because we are the best experts in the United States on the Internet, are we?"

There was laughter from the gallery, and even without video, one can guess the expression on attorney Schnapper's face.

Are algorithms really neutral?

Attorney Lisa Blatt, who represented Google, spoke plainly about the need to uphold Section 230 of the Communications Decency Act. She said Section 230 does not give Internet companies an unlimited shield. Internet companies naturally do their best to monitor and filter obviously criminal or illegal content on their platforms, but completely monitoring and filtering all content is impossible. Punishing companies for failing to do the impossible, she argued, would cause the Internet itself to atrophy; just imagine, she said, a world without Section 230 of the Communications Decency Act.