Automated copyright takedown bot goes haywire (2018)

Summary:

For years, Google and YouTube have run a trusted flagger program through which certain entities that have shown they “are particularly effective at notifying YouTube” of content violations are given more powerful tools with which to do so.

This is used often in the copyright context, and companies with a good history may be given access to things like bulk flagging tools and priority review of flagged content. One such trusted flagger for copyright was a company called Topple Track, which offered an automated service for musicians, searching the internet for infringing works and dashing off automated DMCA notices.

In May of 2015, the digital music distribution company Symphonic Distribution purchased Topple Track, but appeared to keep the service running under the Topple Track brand.

In the summer of 2018, people noticed that Topple Track’s automated system appeared to have gone haywire, sending DMCA notices over all kinds of perfectly legitimate content. Among those targeted were the Electronic Frontier Foundation (EFF), the American Bar Association, NYU’s Law Review, the Crunchbase article about the company MP3Tunes, and many, many more, including many artists’ own web stores. EFF’s summary of the wild takedowns gives a sample:

Among others, these notices improperly target: […] Other targets include an article about the DMCA in the NYU Law Review, an NBC News article about anti-virus scams, a Variety article about the Drake-Pusha T feud, and the lyrics to ‘Happier’ at Ed Sheeran’s official website. It goes on and on.

EFF published an article about this and noted that it seemed to be yet another example of an automated DMCA reporting bot “running amok.” The group also questioned why such a company was in Google’s “trusted flagger” program.

Decisions to be made by Google / YouTube:

  • What qualifications are there for a partner to be considered a “trusted flagger”?
  • How often are trusted flaggers reviewed to make sure they still belong in the program?
  • What does it take to get a trusted flagger removed from the program?

Questions and policy implications to consider:

  • With more emphasis on the speed of removals, it is often tempting for regulators to promote “trusted flagging” or “priority” accounts that are able to get content removed at a much quicker pace. What are the benefits and risks of such programs?
  • Automated flagging and now AI/Machine Learning flagging are increasingly a part of the content moderation landscape. How are they calibrated? How frequently are they reviewed?
  • What should the response be when an automated bot is flagging many accounts mistakenly?

Resolution: 

After the EFF published its article about Topple Track, the parent company Symphonic Distribution apologized to the organization, blaming “bugs within the system that resulted in many whitelisted domains receiving these notices unintentionally.” As EFF pointed out in response, this explanation seemed difficult to believe: the problem was not merely that domains that should not have been scanned were scanned, but that the notices claimed material that had nothing to do with the underlying copyrighted works.
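To make EFF’s point concrete, here is a minimal, purely hypothetical sketch of a takedown bot in which domain whitelisting and catalog matching are separate steps; the names, data, and logic are illustrative assumptions, not Topple Track’s actual code. A bug in the whitelist step alone could not explain notices against pages that never matched a represented work in the first place.

    # Hypothetical two-stage takedown pipeline; all names and data are illustrative.
    WHITELIST = {"eff.org", "americanbar.org"}   # domains that should never receive notices
    CATALOG = {"happier ed sheeran lyrics"}      # stand-in fingerprints of represented works

    def matches_catalog(page_text: str) -> bool:
        # Stage 1: does the page actually contain a represented work?
        return any(work in page_text.lower() for work in CATALOG)

    def should_send_notice(domain: str, page_text: str) -> bool:
        # Stage 2: suppress notices aimed at whitelisted domains.
        return matches_catalog(page_text) and domain not in WHITELIST

    # Skipping the whitelist check (stage 2) would only ever misdirect notices for
    # pages that still matched the catalog. Notices against unrelated pages, such as
    # a law review article about the DMCA, imply that stage 1 was misfiring too.
    assert not should_send_notice("nyulawreview.org", "An article about DMCA safe harbors")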

A few weeks after the article, YouTube also told EFF that Topple Track had been removed from its Trusted Flagger program, “due to a pattern of problematic notices.”

Some time after this, Topple Track appears to have disappeared as a distinct brand, and the service and its technology have apparently been subsumed into Symphonic Distribution’s catalog of services.


Written by The Copia Institute, February 2021
