Fighting online censorship, or, encryption's latest surprise use-case, with Mallory Knodel: Lock and Code S04E05

Categories: Podcast

This week on Lock and Code, we speak with Mallory Knodel about the stories that countries tell their people to fear encryption, and why the most recent threats to encryption are different.




Government threats to end-to-end encryption—the technology that secures your messages and shared photos and videos—have been around for decades, but the most recent threats to this technology are unique in how they intersect with a broader, sometimes-global effort to control information on the Internet.

Take two efforts in the European Union and the United Kingdom. New proposals there would require companies to scan any content that their users share with one another for Child Sexual Abuse Material, or CSAM. If a company offers end-to-end encryption to its users, effectively locking itself out of the content its users share, then it’s tough luck for that company. It will still be required to find a way to do the essentially impossible: build a system that keeps everyone else out while letting the company and the government in.

While these government proposals may sound similar to previous global efforts to weaken end-to-end encryption, like the United States’ prolonged attempt to tarnish the technology by linking it to terrorist plots, they differ in how easily they could become tools for censorship.

Today, on the Lock and Code podcast with host David Ruiz, we speak with Mallory Knodel, chief technology officer for the Center for Democracy and Technology, about new threats to encryption, old and bad proposals that keep getting repeated, who encryption benefits (everyone), and how building a tool to detect one genuine harm could, in turn, create a tool to detect all sorts of legal content that other governments simply do not like.

“In many places of the world where there’s not such a strong feeling about individual and personal privacy, sometimes that is replaced by an inability to access mainstream media, news, accurate information, and so on, because there’s a heavy censorship regime in place,” Knodel said. “And I think that drawing that line between ‘You’re going to censor child sexual abuse material, which is illegal and disgusting and we want it to go away,’ but it’s so very easy to slide that knob over into ‘Now you’re also gonna block disinformation,’ and you might at some point, take it a step further and block other kinds of content, too, and you just continue down that path.”

Knodel continued:

“Then you do have a pretty easy way of mass-censoring certain kinds of content from the Internet that probably shouldn’t be censored.”

Tune in today.

You can also find us on Apple Podcasts, Spotify, and Google Podcasts, plus whatever podcast platform you prefer.

Show notes and credits:

Intro Music: “Spellbound” by Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 4.0 License
http://creativecommons.org/licenses/by/4.0/
Outro Music: “Good God” by Wowa (unminus.com)

