This also reads to me like the songwriters are pissed that they aren't the beneficiaries of YouTube, but other companies are. I dunno...
Thanks CBS for highlighting the biggest issue for artists & songwriters today. https://t.co/z8dI0OVxvo
— Irving Azoff (@irvingazoff) June 22, 2016
This is historic. 500+ creators and 20+ companies united. pic.twitter.com/zqeahdq2T2
— Irving Azoff (@irvingazoff) June 22, 2016
Labels made deals with @YouTube out of desperation. It’s pennies & whack-a-mole. How can labels renew YT deals with so much artist pressure?
— Irving Azoff (@irvingazoff) June 22, 2016
.@YouTube can stop leaks of their own programs and keep them behind the paid wall….
— Irving Azoff (@irvingazoff) June 22, 2016
…but @YouTube won’t keep music that leaks or demos that artists don’t want released off @YouTube.
— Irving Azoff (@irvingazoff) June 22, 2016
If selling ads doesn’t make enough money to pay artists, then it’s bad business. US vinyl sales generated more revenue than YT advertising.
— Irving Azoff (@irvingazoff) June 22, 2016
This isn’t about @YouTube.  It’s about safe harbor. The playing field should be even for all digital players…
— Irving Azoff (@irvingazoff) June 22, 2016
…Apple, Spotify & Pandora don’t hide behind safe harbor.
— Irving Azoff (@irvingazoff) June 22, 2016
The future is streaming. The future is also contingent on paying artists and writers fairly.
— Irving Azoff (@irvingazoff) June 22, 2016
Yeah, I think that's it. This pic really doesn't say anything, just that they want change which I can definitely understand. But... how? What do they propose? I very seriously doubt they want to take their music off youtube completely because that's one main way they get their music out there to make what money they do get from elsewhere. I don't know. I'm kinda confused to be honest.
I think this is the request that she should be involved in... not killing youtube
http://consequenceofsound.net/2016/06/thom-yorke-trent-reznor-eddie-vedder-sign-open-letter-to-congress-to-stop-gun-violence/
I agree Tracy. I wouldn't know half the artists I do if it wasn't for the 'Tube. It would be increasingly difficult to follow even Jewel without finding stuff about her on the internet. Radio just isn't cutting it anymore.
My understanding is they want to eliminate the "Safe Harbour" provision of the DMCA. Basically, Safe Harbour says that internet companies can't get sued for stuff their customers upload and aren't responsible for monitoring it all. But they still have to make reports to law enforcement in special cases (child porn has to be reported within 72 hours, for example).
If HBO and Netflix and movie companies can figure out models to make intellectual-property-based entertainment work, so can musical artists. I think it's just that they're trying to use big government to force people back into the past instead of evolving. That is just not going to work.
Times are changin'. Notice that there are no young artists in that list of signatures.
The letter offers several criticisms attributed to the DMCA safe harbor that don’t hold up.
“This law was written and passed in an era that is technologically out-of-date compared to the era in which we live.”
Far from being out-of-date, the 1998 DMCA is the most recent substantive revision of the Copyright Act. By this logic, the entire 1976 Copyright Act is technologically out-of-date too — some parts of which date all the way back to 1909. Even if we accept the dubious proposition that laws have a shelf life and an expiration date, then the DMCA is one of the freshest parts of the Copyright Act.
“Music consumption has skyrocketed, but the monies earned by individual writers and artists for that consumption has plummeted.”
Recent industry data and analysis reflect a growing music sector. PROs have announced growing, record payouts to songwriters, topping $1 billion annually. And consumers are spending more on music each year. It is true that some stakeholders, like the RIAA and music labels, are losing out as consumers spend a growing share of that money on streaming and live events. However, the DMCA is not responsible for consumers’ shifting preferences.
“The tech companies who benefit from the DMCA today were not the intended protectorate when it was signed into law nearly two decades ago.”
This is a strange view of how laws should work, that only the specific interests who were connected enough to have had lobbyists at the table when a law was enacted should be able to benefit from it. Sadly, this is how much of the Copyright Act is written. But the DMCA isn’t an example of that. Courts have noted that Congress explicitly intended the DMCA to apply broadly, beyond those who existed in 1998. This makes sense when considered in light of Congress’ intention that the safe harbors provide legal certainty to an entire industry. It’s also somewhat disingenuous to suggest that only companies that existed in 1998 can benefit from the DMCA, since most of the Internet post-dates 1998. That interpretation would ensure the law applies to virtually no one, which might be the desired outcome, but isn’t consistent with what Congress intended.
Undermining the DMCA would represent bad political judgment for several reasons. First, a vast number of platforms and users rely on the DMCA in their daily business. As the SOPA debacle indicated, reopening the DMCA to upset this reliance — particularly to mandate that everyone’s postings are affirmatively surveilled — is a “third rail” in IP politics.
The U.S. Copyright Office — which has a history of favoring the interests of Hollywood and record companies over Silicon Valley technologists — recently solicited public comments and held hearings on the notice-and-takedown provisions. The Copyright Office is widely expected in the next year to recommend that Congress make changes to the DMCA, possibly adopting Hollywood’s preferred approach: “notice and staydown.”
The idea of “notice and staydown” is that when an ISP receives a notice of copyright infringement it would then search out and delete all copies of that work and, more importantly, block that work from ever being uploaded again.
That sounds good in theory. But consider it in just a bit of depth and its appeal quickly falls apart. First, just because one user is infringing on a copyright doesn’t mean that a second user who posts the same content is also infringing. The second person may be licensed or making a sort of use — for example, a non-profit educational use — that the law often treats as permissible. Notice and staydown would guarantee that such perfectly legitimate uses would get blocked.
But there’s a worse problem. Notice and staydown effectively kills the chance of any startup or entrepreneur to compete with established players such as YouTube and Facebook.
It’s impossible to enforce any “staydown” without technologies that mark and identify copyrighted material. And that sort of technology is extremely expensive. YouTube has what is considered the most sophisticated system out there, called Content ID. It takes digital “fingerprints” of copyrighted works and checks all new uploads against those fingerprints. Something even more elaborate than Content ID would be required to make notice and staydown work.
Content ID cost Google more than $50 million to build. Not many startups can replicate that.
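To make the matching step concrete, here is a minimal sketch, assuming a hypothetical StaydownFilter class, of what a "staydown" check amounts to. The exact SHA-256 hash is only a stand-in for the perceptual audio and video fingerprinting a real system would need; none of these names come from YouTube's actual Content ID.

```python
# Minimal sketch of a "staydown" check, NOT YouTube's Content ID.
# A real system needs perceptual fingerprints that survive re-encoding,
# cropping, and pitch shifts; the exact hash below is only a stand-in.
import hashlib


class StaydownFilter:
    def __init__(self) -> None:
        # Fingerprints of works named in takedown notices.
        self.blocked: set[str] = set()

    def fingerprint(self, media: bytes) -> str:
        # Stand-in for a perceptual audio/video fingerprint.
        return hashlib.sha256(media).hexdigest()

    def register_takedown(self, media: bytes) -> None:
        # Remember the noticed work so future uploads can be blocked.
        self.blocked.add(self.fingerprint(media))

    def allow_upload(self, media: bytes) -> bool:
        # Every new upload is checked against every registered work.
        return self.fingerprint(media) not in self.blocked


# The toy filter blocks byte-identical re-uploads, but it cannot tell a
# licensed or fair-use copy from an infringing one, and a single
# re-encode defeats the exact hash entirely.
f = StaydownFilter()
f.register_takedown(b"leaked-demo-audio")
print(f.allow_upload(b"leaked-demo-audio"))   # False
print(f.allow_upload(b"re-encoded version"))  # True
```

The gap between this toy and a matcher that survives re-encoding at YouTube's upload volume is roughly where that $50 million went.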
“Notice-and-Stay-Down” Is Really “Filter-Everything”
The Proposal Is Unfair to Both Users and Media Platforms
There’s a debate happening right now over copyright bots, programs that social media websites use to scan users’ uploads for potential copyright infringement. A few powerful lobbyists want copyright law to require platforms that host third-party content to employ copyright bots, and require them to be stricter about what they take down. Big content companies call this nebulous proposal “notice-and-stay-down,” but it would really keep all users down, not just alleged infringers. In the process, it could give major content platforms like YouTube and Facebook an unfair advantage over competitors and startups (as if they needed any more advantages). “Notice-and-stay-down” is really “filter-everything.”
At the heart of the debate sit the “safe harbor” provisions of U.S. copyright law (17 U.S.C. § 512), which were enacted in 1998 as part of the Digital Millennium Copyright Act. Those provisions protect Internet services from monetary liability based on the allegedly infringing activities of their users or other third parties.
Section 512 lays out various requirements for service providers to be eligible for safe harbor status—most significantly, that they comply with a notice-and-takedown procedure: if you have a reasonable belief that I’ve infringed on your copyright, you (or someone acting on your behalf) can contact the platform and ask to have my content removed. The platform removes my content and notifies me that it’s removed it. I have the opportunity to file a counter-notice, indicating that you were incorrect in your assessment: for example, that I didn’t actually use your content, or that I used it in a way that didn’t infringe your copyright. If you don’t take action against me in federal court, my content is restored within 10 to 14 business days.
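As a rough illustration of that sequence, here is a sketch of the takedown and counter-notice steps in Python. The function and field names are hypothetical and the timing is simplified; it paraphrases the procedure described above and is not a real platform API or legal advice.

```python
# Sketch of the Section 512 notice-and-takedown / counter-notice flow
# described above. Names are hypothetical; timing is simplified.
from dataclasses import dataclass


@dataclass
class Upload:
    visible: bool = True


def on_takedown_notice(upload: Upload) -> None:
    # The rightsholder sends a notice: the platform removes the content
    # and tells the uploader it has done so.
    upload.visible = False
    print("Removed; uploader notified and may file a counter-notice.")


def on_counter_notice(upload: Upload, rightsholder_filed_suit: bool) -> None:
    # The uploader files a counter-notice. Unless the rightsholder takes
    # the dispute to federal court within the statutory window (roughly
    # 10 to 14 business days), the content is restored.
    if not rightsholder_filed_suit:
        upload.visible = True
        print("No lawsuit filed in time; content restored.")
```

Note that the copyright holder's say-so is enough to take the content down; a court only enters the picture if the dispute escalates.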
The DMCA didn’t do much to temper content companies’ accusations that Internet platforms enable infringement. In 2007, Google was facing a lot of pressure over its recent acquisition, YouTube. YouTube complied with all of the requirements for safe harbor status, including adhering to the notice-and-takedown procedure, but Hollywood wanted more.
Google was eager to court those same companies as YouTube adopters, so it unveiled Content ID. Content ID lets rightsholders submit large databases of video and audio fingerprints. YouTube’s bot scans every new upload for potential matches to those fingerprints. The rightsholder can choose whether to block, monetize, or monitor matching videos. Since the system can automatically remove or monetize a video with no human interaction, it often removes videos that make lawful fair uses of audio and video.
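To show how little human judgment sits in that loop, here is a hedged sketch of the policy step: once the scanner reports a match, the action the rightsholder pre-selected is applied automatically. The dictionary "database" and the names are made up for illustration and are not YouTube's actual Content ID API.

```python
# Sketch of the rightsholder-policy step described above. The reference
# "database" and names are hypothetical, not YouTube's Content ID API.
BLOCK, MONETIZE, MONITOR = "block", "monetize", "monitor"

# Reference fingerprint -> action chosen in advance by the rightsholder.
reference_policies = {
    "fp-of-label-track-001": BLOCK,
    "fp-of-label-track-002": MONETIZE,
    "fp-of-concert-footage": MONITOR,
}


def handle_match(upload_id: str, matched_fp: str) -> str:
    # Applied automatically, with no human reviewing whether the upload
    # is licensed, transformative, or otherwise a fair use.
    policy = reference_policies[matched_fp]
    if policy == BLOCK:
        return f"{upload_id}: taken down"
    if policy == MONETIZE:
        return f"{upload_id}: ads enabled, revenue routed to the claimant"
    return f"{upload_id}: left up, viewing statistics reported to the claimant"


print(handle_match("video-42", "fp-of-label-track-001"))  # video-42: taken down
```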
Now, some lobbyists think that content filtering should become a legal obligation: content companies are proposing that once a takedown notice goes uncontested, the platform should have to filter and block any future uploads of the same allegedly infringing content. In essence, content companies want the law to require platforms to develop their own Content-ID-like systems in order to enjoy safe harbor status.
For the record, the notice-and-takedown procedure has its problems. It results in alleged copyright infringement being treated differently from any other type of allegedly unlawful speech: rather than wait for a judge to determine whether a piece of content is in violation of copyright, the system gives the copyright holder the benefit of the doubt. You don’t need to look far to find examples of copyright holders abusing the system, silencing speech with dubious copyright claims.
That said, safe harbors are essential to the way the Internet works. If the law didn’t provide a way for web platforms to achieve safe harbor status, services like YouTube, Facebook, and Wikipedia could never have been created in the first place: the potential liability for copyright infringement would be too high. Section 512 provides a route to safe harbor status that most companies that host user-generated content can reasonably comply with.
A filter-everything approach would change that. The safe harbor provisions let Internet companies focus their efforts on creating great services rather than spend their time snooping their users’ uploads. Filter-everything would effectively shift the burden of policing copyright infringement to the platforms themselves, undermining the purpose of the safe harbor in the first place.
That approach would dramatically shrink the playing field for new companies in the user-generated content space. Remember that the criticisms of YouTube as a haven for infringement existed well before Google acquired it. The financial motivators for developing a copyright bot were certainly in place pre-Google too. Still, it took the programming power of the world’s largest technology company to create Content ID. What about the next YouTube, the next Facebook, or the next SoundCloud? Under filter-everything, there might not be a next.
Here’s something else to consider about copyright bots: they’re not very good. Content ID routinely flags as infringement videos that don’t copy from another work at all. Bots also don’t understand the complexities of fair use. In September 2015, a federal appeals court confirmed that copyright holders must consider fair use before sending a takedown notice. Under the filter-everything approach, legitimate uses of works wouldn’t get the reasonable consideration they deserve. Even if content-recognizing technology were airtight, computers would still not be able to consider a work’s fair use status.
Again and again, certain powerful content owners seek to brush aside the importance of fair use, characterizing it as a loophole in copyright law or an old-fashioned relic. But without fair use, copyright isn’t compatible with the First Amendment. Do you trust a computer to make the final determination on your right to free speech?