Archive for the ‘Intellectual Property (IP)’ Category

Do You Feel Chilled? W3C and DRM

Monday, February 13th, 2017

Indefensible: the W3C says companies should get to decide when and how security researchers reveal defects in browsers by Cory Doctorow.

From the post:

The World Wide Web Consortium has just signaled its intention to deliberately create legal jeopardy for security researchers who reveal defects in its members’ products, unless the security researchers get the approval of its members prior to revealing the embarrassing mistakes those members have made in creating their products. It’s a move that will put literally billions of people at risk as researchers are chilled from investigating and publishing on browsers that follow W3C standards.

It is indefensible.

I enjoy Cory’s postings and fiction but I had to read this one more than once to grasp the nature of his complaint.

As I understand it the argument runs something like this:

1. The W3C is creating a “…standardized DRM system for video on the World Wide Web….”

2. Participants in the W3C process must “…surrender the right to invoke their patents in lawsuits as a condition of participating in the W3C process….” (The keyword here is participants. No non-participant waives their patent rights as a result of W3C policy.)

3. The W3C isn’t requiring waiver of DMCA 1201 rights as a condition for participating in the video DRM work.

All true, but I don’t see how Cory gets to the conclusion:

…deliberately create legal jeopardy for security researchers who reveal defects in its members’ products, unless the security researchers get the approval of its members prior to revealing the embarrassing mistakes those members have made in creating their products.

Whether the W3C requires participants in the DRM system for video to waive DMCA 1201 rights or not, the W3C process has no impact on non-participants in that process.

Secondly, security researchers are in jeopardy if and only if they incriminate themselves when publishing defects in DRM products. As security researchers, they are capable of anonymously publishing any security defects they find.

Third, legal liability flows from statutory law and not the presence or absence of consensual agreement among a group of vendors. Private agreements can only protect you from those agreeing.

I don’t support DRM and never have. Personally I think it is a scam and a tax on content creators. It’s unfortunate that the fear that someone, somewhere, might not be paying full rate is enough for content creators to tax themselves with DRM schemes and software. None of which is free.

Rather than arguing about W3C policy, why not point to the years of wasted effort and expense by content creators on DRM? With no measurable return. That’s a plain ROI question.

DRM software vendors know the pot of gold content creators are chasing is at the end of an ever receding rainbow. In fact, they’re counting on it.

Court: Posting Standards Online Violates Copyright Law [+ solution]

Wednesday, February 8th, 2017

Court: Posting Standards Online Violates Copyright Law by Trey Barrineau.

From the post:

Last week, the U.S. District Court for the District of Columbia ruled that public-records activist Carl Malamud’s organization, Public.Resource.Org, violated copyright law by publicly sharing standards that are used in laws such as building codes. It also said organizations that develop these standards, including those used in the fenestration industry, have the right to charge reasonable fees to access them. Malamud told DWM in an e-mail that he’ll appeal the ruling.
… (emphasis in original)

I was working on a colorful rant, invoking Mr. Bumble in Charles Dickens’s Oliver Twist:

“If the law supposes that,” said Mr. Bumble, squeezing his hat emphatically in both hands, “the law is a ass- a idiot.”

based on the report of the decision when I ran across the full court opinion:

AMERICAN SOCIETY FOR TESTING AND MATERIALS, et al., Plaintiffs, v. PUBLIC.RESOURCE.ORG, INC., Defendant. Case No. 13-cv-1215 (TSC)

The preservation of copyright despite being referenced in a law and/or regulation (pages 19-24) is one of the stronger parts of the decision.

In part it reads:


Congress was well aware of the potential copyright issue posed by materials incorporated by reference when it crafted Section 105 in 1976. Ten years earlier, Congress had extended to federal agencies the authority to incorporate private works by reference into federal regulations. See Pub. L. No. 90-23, § 552, 81 Stat. 54 (1967) (codified at 5 U.S.C. § 552) (providing that “matter reasonably available to the class of persons affected thereby is deemed published in the Federal Register when incorporated by reference therein with the approval of the Director of the Federal Register”). However, in the Copyright Act of 1976, Congress made no mention of these incorporated works in § 105 (no copyright for “any work of the United States Government”) or any other section. As the House Report quoted above indicates, Congress already carefully weighed the competing policy goals of making incorporated works publicly available while also preserving the incentives and protections granted by copyright, and it weighed in favor of preserving the copyright system. See H.R. Rep. No. 94-1476, at 60 (1976) (stating that under § 105 “use by the Government of a private work would not affect its copyright protection in any way”); see also M.B. Schnapper v. Foley, 667 F.2d 102, 109 (D.C. Cir. 1981) (analyzing Copyright Act and holding that “we are reluctant to cabin the discretion of government agencies to arrange ownership and publication rights with private contractors absent some reasonable showing of a congressional desire to do so”).

However, recognizing the importance of public access to works incorporated by reference into federal regulations, Congress still requires that such works be “reasonably available.” 5 U.S.C. § 552(a)(1). Under current federal regulations issued by the Office of the Federal Register in 1982, a privately authored work may be incorporated by reference into an agency’s regulation if it is “reasonably available,” including availability in hard copy at the OFR and/or the incorporating agency. 1 C.F.R. § 51.7(a)(3). Thirteen years later, Congress passed the National Technology Transfer and Advancement Act of 1995 (“NTTAA”) which directed all federal agencies to use privately developed technical voluntary consensus standards. See Pub. L. No. 104-113, 110 Stat. 775 (1996). Thus, Congress initially authorized agencies to incorporate works by reference, then excluded these incorporated works from § 105 of the Copyright Act, and, nearly twenty years later, specifically directed agencies to incorporate private works by reference. From 1966 through the present, Congress has remained silent on the question of whether privately authored standards and other works would lose copyright protection upon incorporation by reference. If Congress intended to revoke the copyrights of such standards when it passed the NTTAA, or any time before or since, it surely would have done so expressly. See Whitman v. Am. Trucking Ass’ns, Inc., 531 U.S. 457, 468 (2001) (“Congress . . . does not alter the fundamental details of a regulatory scheme in vague terms or ancillary provisions—it does not . . . hide elephants in mouseholes.”); United States v. Fausto, 484 U.S. 439, 453 (1988) (“[It] can be strongly presumed that Congress will specifically address language on the statute books that it wishes to change.”). Instead, Congress has chosen to maintain the scheme it created in 1966: that such standards must simply be made reasonably available. See 5 U.S.C. § 552(a)(1).
… (emphasis in original, pages 21-23)

A finding to the contrary, that is, that referencing a privately authored standard in a law or regulation terminates the rights of the copyright holder, would create obvious due process problems.

Some copyright holders, ASTM for example, report sales as a substantial portion of their yearly income. ASTM International 2015 Annual Report gives an annual operating income of $72,543,549, of which, $48,659,345 was from publications. (page 24)

Congress could improve both the “reasonable access” for citizens and the lot of standard developers by requiring:

  • for works incorporated by reference into federal regulations, agencies must secure a license renewable without time limit for unlimited digital reproduction of that work by anyone
  • digital reproductions of such works, whether by the licensing agency or others, must reference the work’s publisher for obtaining a print copy

That gives standard developing organizations a new source of revenue, increases the “reasonable access” of citizens, and if past experience is any guide, digital copies may drive print sales.

Any takers?

Expiring Patents

Tuesday, January 3rd, 2017

Expatents returns a list of patents expiring that day and you can sign up for a weekly digest of expiring patents.

The site claims that over 80% of patents are never commercially exploited.

Are expired patents that were never commercially exploited like articles that are never cited by anyone?

Potential shareholder litigation over the not-so-trivial cost of patents that never resulted in commercial exploitation?

Was it inside or outside counsel that handled the patent filings?

There’s an interesting area for tracing relationships (associations) and expenses.

U.S. Navy As Software Pirates

Monday, November 14th, 2016

Navy denies it pirated 558K copies of software, says contractor consented by David Kravets.

From the post:

In response to a lawsuit accusing the US Navy of pirating more than 558,000 copies of virtual reality software, the Navy conceded Monday that it had installed the software on “hundreds of thousands of computers within its network” without paying the German software maker for it. But the Navy says it did so with the consent of the software producer.

I suspect that “consent” here means that Bitmanagement Software modified its product to remove installation restrictions in hopes the U.S. Navy would become utterly dependent upon the software and only then “notice” the Navy had licensed only 38 copies.

Nice try but sovereigns have been rolling citizens for generations.

The complaint and the government’s answer are both amusing reads.

The lesson here is you are responsible for protecting your property. Especially when exposing it to potential thieves.

The TPP Is Dead! Really Most Sincerely Dead! (Celebration Is In Order!)

Friday, November 11th, 2016

Obama Administration Gives Up on Pacific Trade Deal by William Mauldin.

From the post:

The Obama administration on Friday gave up all hope of enacting its sweeping Pacific trade agreement, a pact designed to preserve U.S. economic influence in fast-growing Asia that was buried by a wave of antitrade political sentiment that culminated with Tuesday’s presidential election….

Yes!

I have ranted about the largely secret Trans-Pacific Partnership (TPP) trade agreement on several occasions.

Negotiated entirely in secret and, even worse, designed to be kept secret from the citizens of signing countries, the TPP contained several terrible provisions, among the worst of which were those enabling investors to sue sovereign countries if laws diminished their investments.

I don’t know, like health warnings on cigarettes for example.

With the election of Donald Trump, I should say president-elect Donald Trump, the TPP is dead. (full stop)

As the proverb says:

It’s an ill wind that blows nobody any good.

Whatever your feelings about president-elect Donald Trump and any of his decisions/policies as president, the defeat of the TPP is one for the win column.

Hazards and dangers lie ahead, just as they would for any presidency, but take a moment to appreciate this win.

Copyright Office Opens Up 512 Safe Harbor ($105 Fee Reduced To $6)

Tuesday, November 1st, 2016

After reading the Copyright Office explanation for the changes Elliot Harmon complains of in Copyright Office Sets Trap for Unwary Website Owners, I see the Copyright Office as opening up the 512 safe harbor to more people.

In his rush to criticize the Copyright Office for not taking EFF advice, Elliot forgets to mention:


Transitioning to the electronic system has allowed the Office to substantially reduce the fee to designate an agent with the Office, from $105 (plus an additional fee of $35 for each group of one to ten alternate names used by the service provider) to $6 (with no additional fee for alternate names).

Copyright Office Announces Electronic System for Designating Agents under DMCA

Wow! Government fees going down?

Going from $105 (plus $35 for alternate names) to $6 with no additional fee for alternate names opens up the 512 safe harbor to small owners/sites.

True enough, the new rule requires you to renew every three years but given the plethora of renewals we all face, what’s one more? Especially an important one.

The Copyright Office has prepared videos (with transcripts) to guide you to the new system.

A starting point for further reading: Copyright Office Reviews Section 512 Safe Harbor for Online User-Generated Content – The Differing Perceptions of Musicians and Other Copyright Holders and Online Service Providers on the Notice and Take-Down Process by David Oxenford. Just a starting point.

If you have or suspect you have copyright issues, consult an attorney. Law isn’t a safe place for self-exploration.

PS: I understand that EFF must write for its base, but closer attention to the facts of rules and changes would be appreciated.

Elsevier Awarded U.S. Patent For “Online Peer Review System and Method” [Sack Pool?]

Tuesday, August 30th, 2016

Elsevier Awarded U.S. Patent For “Online Peer Review System and Method” by Gary Price.

Gary quotes the abstract:

An online document management system is disclosed. In one embodiment, the online document management system comprises: one or more editorial computers operated by one or more administrators or editors, the editorial computers send invitations and manage peer review of document submissions; one or more system computers, the system computers maintain journals, records of submitted documents and user profiles, and issue notifications; and one or more user computers; the user computers submit documents or revisions to the document management system; wherein one or more of the editorial computers coordinate with one or more of the system computers to migrate one or more documents between journals maintained by the online document management system.

Is there a pool on whether the staff who recommended and pursued that patent will get the sack in the next week?

The Court That Rules The World

Sunday, August 28th, 2016

The Court That Rules The World by Chris Hamby.

If the Trans-Pacific Partnership (TPP) and investor-state dispute settlement (ISDS) don’t sound dangerous to you, this series will change your mind.

Imagine a private, global super court that empowers corporations to bend countries to their will.

Say a nation tries to prosecute a corrupt CEO or ban dangerous pollution. Imagine that a company could turn to this super court and sue the whole country for daring to interfere with its profits, demanding hundreds of millions or even billions of dollars as retribution.

Imagine that this court is so powerful that nations often must heed its rulings as if they came from their own supreme courts, with no meaningful way to appeal. That it operates unconstrained by precedent or any significant public oversight, often keeping its proceedings and sometimes even its decisions secret. That the people who decide its cases are largely elite Western corporate attorneys who have a vested interest in expanding the court’s authority because they profit from it directly, arguing cases one day and then sitting in judgment another. That some of them half-jokingly refer to themselves as “The Club” or “The Mafia.”

And imagine that the penalties this court has imposed have been so crushing — and its decisions so unpredictable — that some nations dare not risk a trial, responding to the mere threat of a lawsuit by offering vast concessions, such as rolling back their own laws or even wiping away the punishments of convicted criminals.

This system is already in place, operating behind closed doors in office buildings and conference rooms in cities around the world. Known as investor-state dispute settlement, or ISDS, it is written into a vast network of treaties that govern international trade and investment, including NAFTA and the Trans-Pacific Partnership, which Congress must soon decide whether to ratify.

These trade pacts have become a flashpoint in the US presidential campaign. But an 18-month BuzzFeed News investigation, spanning three continents and involving more than 200 interviews and tens of thousands of documents, many of them previously confidential, has exposed an obscure but immensely consequential feature of these trade treaties, the secret operations of these tribunals, and the ways that business has co-opted them to bring sovereign nations to heel.

The BuzzFeed News investigation explores four different aspects of ISDS. In coming days, it will show how the mere threat of an ISDS case can intimidate a nation into gutting its own laws, how some financial firms have transformed what was intended to be a system of justice into an engine of profit, and how America is surprisingly vulnerable to suits from foreign companies.

(emphasis in original)

Read carefully and take names.

Few, if any, are beyond one degree of separation from the Internet.

Do Your Part! Illegally Download Scientific Papers

Sunday, August 28th, 2016

[Poster image: Do Your Part! Illegally Download Scientific Papers]

From Rob Beschizza’s post at: Do Your Part! Illegally Download Scientific Papers, which has a poster-size version at 1940 x 2521 pixel resolution.

Digital Rights – Privacy – Video Conference – Wednesday, June 29, 2016

Sunday, June 26th, 2016

Video conference for campus and community organizers (June 2016)

From the webpage:

[Image: student organizing graphic]

Are you part of a campus or community organization concerned about digital rights?

If not, do you want to raise a voice in your community for privacy and access to the intellectual commons?

We'd like to help! EFF will host a video conference to highlight opportunities for grassroots organizers on Wednesday, June 29, 2016 at 3pm PST / 6pm EST.

We'll hear from speakers describing campaigns and events available for your group's support, as well as best practices that you might consider emulating with your friends and neighbors. We're also eager to hear from you about any digital rights campaigns on which you're working in your community, and to expose others in this growing grassroots network to social media opportunities to support your activism and organizing.

Please register to receive the link through which to participate using an open, encrypted, video chat platform.

No word on removing the tape from your video camera for this event. 😉

Spread the word about this video conference!

Speaking of Wasted Money on DRM / WWW EME Minus 2 Billion Devices

Friday, June 24th, 2016

Just earlier today I was scribbling about wasting money on DRM saying:


I feel sorry for content owners. Their greed makes them easy prey for people selling patented DRM medicine for the delivery of their content. In the long run it only hurts themselves (the DRM tax) and users. In fact, the only people making money off of DRM are the people who deliver content.

This evening I ran across: Chrome Bug Makes It Easy to Download Movies From Netflix and Amazon Prime by Michael Nunez.

Nunez points out that an exploit in the open source Chrome browser enables users to save movies from Netflix and Amazon Prime.

Even once a patch appears, others can compile the code without it and continue to illegally download movies from Netflix and Amazon Prime.

Even more amusing:


Widevine is currently used in more than 2 billion devices worldwide and is the same digital rights management technology used in Firefox and Opera browsers. Safari and Internet Explorer, however, use different DRM technology.

Widevine plus properly configured device = broken DRM.

When Sony and others calculate their ROI from DRM, be sure to subtract 2 billion+ devices that probably won’t honor the no-record DRM setting.

Pride Goeth Before A Fall – DMCA & Security Researchers

Friday, June 24th, 2016

Cory Doctorow has written extensively on the problems with present plans to incorporate DRM in HTML5:

W3C DRM working group chairman vetoes work on protecting security researchers and competition – June 18, 2016.

An Open Letter to Members of the W3C Advisory Committee – May 12, 2016.

Save Firefox: The W3C’s plan for worldwide DRM would have killed Mozilla before it could start – May 11, 2016.

Interoperability and the W3C: Defending the Future from the Present – March 29, 2016.

among others.

In general I agree with Cory’s reasoning but I don’t see:

…Once DRM is part of a full implementation of HTML5, there’s a real risk to security researchers who discover defects in browsers and want to warn users about them…. (from Cory’s latest post)

Do you remember the Sony “copy-proof” CDs? (Sony “copy-proof” CDs cracked with a marker pen.) Then, just as now, Sony was about to hand over bushels of cash to the content delivery crowd.

When security researchers discover flaws in the browser DRM, what prevents them from advising users?

Cory says the anti-circumvention provisions of the DMCA prevent security researchers from discovering and disclosing such flaws.

That’s no doubt true, if you want to commit a crime (violate the DMCA) and publish evidence of that crime with your name attached to it on the WWW.

Isn’t that a case of pride goeth before a fall?

If I want to alert other users to security defects in their browsers, possibly equivalent to the marker pen for Sony CDs, I post that to the WWW anonymously.

Or publish code to make that defect apparent to even a casual user.

What I should not do is put my name on either a circumvention bug report or code to demonstrate it. Yes?

That doesn’t answer Cory’s points about impairing innovation, etc. but once Sony realizes it has been had, again, by the content delivery crowd, what’s the point of more self-inflicted damage?

I feel sorry for content owners. Their greed makes them easy prey for people selling patented DRM medicine for the delivery of their content. In the long run it only hurts themselves (the DRM tax) and users. In fact, the only people making money off of DRM are the people who deliver content.

Should DRM appear as proposed in HTML5, any suggestions for a “marker pen” logo to be used by hackers of a Content Decryption Module?

PS: Another approach to opposing DRM would be to inform shareholders of Sony and other content owners they are about to be raped by content delivery systems.

PPS: In private email Cory advised me to consider the AACS encryption key controversy, where public posting of an encryption key was challenged with takedown requests. However, in the long run, such efforts only spread the key more widely, not the effect intended by those attempting to limit its spread.

And there is the Dark Web, ahem, where it is my understanding that non-legal content and other material can be found.

A Plausible Explanation For The EC Human Brain Project

Saturday, June 18th, 2016

I have puzzled for years over how to explain the EC’s Human Brain Project. See The EC Brain if you need background on this ongoing farce.

While reading Reject Europe’s Plans To Tax Links and Platforms by Jeremy Malcolm, I suddenly understood the motivation for the Human Brain Project!

From the post:

A European Commission proposal to give new copyright-like veto powers to publishers could prevent quotation and linking from news articles without permission and payment. The Copyright for Creativity coalition (of which EFF is a member) has put together an easy survey and answering guide to guide you through the process of submitting your views before the consultation for this “link tax” proposal winds up on 15 June.

Since the consultation was opened, the Commission has given us a peek into some of the industry pressures that have motivated what is, on the face of it, otherwise an inexplicable proposal. In the synopsis report that accompanied the release of its Communication on Online Platforms, it writes that “Right-holders from the images sector and press publishers mention the negative impact of search engines and news aggregators that take away some of the traffic on their websites.” However, this claim is counter-factual, as search engines and aggregators are demonstrably responsible for driving significant traffic to news publishers’ websites. This was proved when a study conducted in the wake of introduction of a Spanish link tax resulted in a 6% decline in traffic to news websites, which was even greater for the smaller sites.

There is a severe shortage of human brains at the European Commission! The Human Brain Project is a failing attempt to remedy that shortage of human brains.

Before you get angry, Europe is full of extremely fine brains. But that isn’t the same thing as saying they are found at the European Commission.

Consider, for example, the farcical request for comments on an outcome decided in advance, as cited above. Customary EC favoritism and heavy-handedness.

I would not waste electrons submitting comments to the EC on this issue.

Spend your time mining EU news sources and making fair use of their content. Every now and again, gather up your links and send them to the publications and copy the EC. So publications can see the benefits of your linking versus the overhead of the EC.

As the Spanish link tax experience proves, link taxes may deceive property cultists into expecting a windfall; in truth their revenue will decrease, and what revenue is collected will go to the EC.

There’s the mark of a true EC solution:

The intended “beneficiary” is worse off and the EC absorbs what revenue, if any, results.

Doctorow on Encrypted Media Extensions (EME) @ W3C and DRM

Tuesday, June 7th, 2016

Cory Doctorow outlines the important public policy issues semi-hidden in W3C efforts to standardize Encrypted Media Extensions (EME).

I knew I would agree with Cory’s points, more or less, before even reading the post. But I also knew that many of his points, if not all, wouldn’t be persuasive to some in the DRM discussion.

If you already favor reasonable accommodation between consumers of content and rightsholders, recognition of “fair use,” and allowances for research and innovation, enjoy Cory’s post and do what you can to support the EFF and others in this particular dispute.

If you are currently a rightsholder and strong supporter of DRM, I don’t think Cory’s post is going to be all that persuasive.

Rather than focusing on public good, research, innovation, etc., I have a very different argument for rightsholders, who I distinguish from people who will profit from DRM and its implementations.

I will lay out all the nuances either tomorrow or the next day, but the crux of my argument is the question: “What is the ROI for rightsholders from DRM?”

You will be able to satisfy yourself of my analysis, using your own confidential financial statements. The real ones, not the ones you show the taxman.

To be sure, someone intends to profit from DRM and its implementation, but it isn’t who you think it is.

In the meantime, enjoy Cory’s post!

Pamela Samuelson on Java and Fair Use – Test For Prospective Employers

Saturday, May 28th, 2016

Pamela Samuelson has posted a coherent and compelling narrative on why the Java API victory of Google over Oracle is a very good thing.

Here’s where she comes out:


Developers of software need some simple norms to live by. One such norm is that independent reimplementation of an API in one’s own original code does not infringe copyright. That’s the law as well as good public policy. The public has greatly benefited by the existence of this norm because anyone with a creative software idea can write programs that will run on existing platforms. The software industry has thrived under this norm, and the public has a wide array of choices of innovative programs in a competitive marketplace.

Put Pamela’s analysis to good use.

Ask at your next interview if the prospective employer agrees with Pamela’s post.

It’s 877 words and can double as an attention span test for the interviewer.

Ask before you leap.

Danger! Danger! Oracle Attorney Defends GPL

Saturday, May 28th, 2016

Op-ed: Oracle attorney says Google’s court victory might kill the GPL by Annette Hurst.

From the header:

Annette Hurst is an attorney at Orrick, Herrington & Sutcliffe who represented Oracle in the recent Oracle v. Google trial. This op-ed represents her own views and is not intended to represent those of her client or Ars Technica.

The Oracle v. Google trial concluded yesterday when a jury returned a verdict in Google’s favor. The litigation began in 2010, when Oracle sued Google, saying that the use of Java APIs in Android violated copyright law. After a 2012 trial, a judge held that APIs can’t be copyrighted at all, but that ruling was overturned on appeal. In the trial this month, Google successfully argued that its use of Java APIs, about 11,500 lines of code in all, was protected by “fair use.”

I won’t propagate Annette’s rant but you can read it for yourself at: http://arstechnica.com/tech-policy/2016/05/op-ed-oracle-attorney-says-googles-court-victory-might-kill-the-gpl/.

What are free software supporters to make of their longtime deranged, drooling critic expressing support for the GPL?

Should they flee as pursued by wraiths on wings?

Should they stuff their cloaks in their ears?

Are these like the lies of Saruman?

Or perhaps better, Wormtongue?

My suggestion? Point to Annette’s rant to alert others but don’t repeat it, don’t engage it, just pass over it in silence.

Repeating evil counsel gives it legitimacy.

Yours.

Reimplementation of an API is FAIR USE!

Thursday, May 26th, 2016

Google wins Oracle copyright fight over Android code by Russell Brandom.

Just one civil jury’s opinion but a major one considering there was $9 billion at stake.

Not a precedent for other cases but it may discourage this type of over-reaching.

Every now and again, even random dice roll a 7 for the good guys.

See Russell’s post for the details.

Thoughts On How-To Help Drown A Copyright Troll?

Thursday, May 19th, 2016

Copyright Trolls Rightscorp Are Teetering On The Verge Of Bankruptcy, a riff on an arstechnica.com report.

Suggestions?

Think of it as a service to the entire community, including legitimate claimants to intellectual property.

I tried to think of any methods I would exclude and came up empty.

You?

Colleges Shouldn’t Have to Deal With Copyright Monitoring [Broods of Copyright Vipers]

Wednesday, May 18th, 2016

Colleges Shouldn’t Have to Deal With Copyright Monitoring by Pamela Samuelson.

From the post:

Colleges have a big stake in the outcome of the lawsuit that three publishers, Cambridge University Press, Oxford University Press, and Sage Publications, brought against Georgia State University officials for copyright infringement. The lawsuit, now in its eighth year, challenged GSU’s policy that allowed faculty members to upload excerpts (mainly chapters) of in-copyright books for students to read and download from online course repositories.

Four years ago, a trial court held that 70 of the 75 challenged uses were fair uses. Two years ago, an appellate court sent the case back for a reassessment under a revised fair-use standard. The trial court has just recently ruled that of the 48 claims remaining in the case, only four uses, each involving multiple chapters, infringed. The question now is, What should be the remedy for those four infringements?

Sage was the only publisher that prevailed at all, and it lost more infringement claims than it won. Cambridge and Oxford came away empty-handed. Despite the narrowness of Sage’s win, all three publishers have asked the court for a permanent injunction that would impose many new duties on GSU and require close monitoring of all faculty uploads to online course repositories.

I expected better out of Cambridge and Oxford, especially Cambridge, which has in recent years allowed free electronic access to some printed textbooks.

Sage and the losing publishers, Cambridge and Oxford, seek to chill the exercise of fair use by not only Georgia State University but universities everywhere.

Pamela details the outrageous nature of the demands made by the publishers and concludes that she is rooting for GSU on appeal.

We should all root for GSU on appeal but that seems so unsatisfying.

It does nothing to darken the day for the broods of copyright vipers at Cambridge, Oxford or Sage.

In addition to creating this money pit for their publishers, the copyright vipers want to pad their nests by:


As if that were not enough, the publishers want the court to require GSU to provide them with access to the university’s online course system and to relevant records so the publishers could confirm that the university had complied with the record-keeping and monitoring obligations. The publishers have asked the court to retain jurisdiction so that they could later ask it to reopen and modify the court order concerning GSU compliance measures.

I don’t know how familiar you are with academic publishing but every academic publisher has a copyright department that shares physical space with acquisitions and publishing.

Whereas acquisitions and publishing are concerned with the collection and dissemination of knowledge, while recovering enough profit to remain viable, the copyright department could just as well be employed by Screw.

Expanding the employment rolls of copyright departments to monitor fair use at universities is another drain on their respective publishers.

If you need proof of copyright departments being a dead loss for their publishers, consider the most recent annual reports for Cambridge and Oxford.

Does either one highlight their copyright departments as centers of exciting development and income? Do they tout this eight year long battle against fair use?

No? I didn’t think so but wanted your confirmation to be sure.

I can point you to a history of Sage, but as a privately held publisher, it has no public annual report. Even that history, over changing economic times in publishing, finds no space to extol its copyright vipers and their role in the GSU case.

Beyond rooting for GSU, work with the acquisitions and publication departments at Cambridge, Oxford and Sage, to help improve their bottom line profit and drown their respective broods of copyright vipers.

How?

Before you sign a publishing agreement, ask your publisher for a verified statement of the ROI contributed by their copyright office.

If enough of us ask, the question will resonate across the academic publishing community.

Elsevier – “…the law is a ass- a idiot.”

Friday, May 6th, 2016

Elsevier Complaint Shuts Down SCI-HUB Domain Name by Ernesto.

From the post:


However, as part of the injunction Elsevier is able to request domain name registrars to suspend Sci-Hub’s domain names. This happened to the original .org domain earlier, and a few days ago the Chinese registrar Now.cn appears to have done the same for Sci-hub.io.

The domain name has stopped resolving and is now listed as “reserved” according to the latest WHOIS info. TorrentFreak reached out to Sci-Hub founder Alexandra Elbakyan, who informed us that the registrar sent her a notice referring to a complaint from Elsevier.

In addition to the alternative domain names users can access the site directly through the IP-address 31.184.194.81, or its domain on the Tor-network, which is pretty much immune to any takedown efforts.

Meanwhile, academic pirates continue to flood to Sci-Hub, domain seizure or not.

The best response to Elsevier is found in Oliver Twist by Charles Dickens, Chapter 52:

“If the law supposes that,” said Mr. Bumble, squeezing his hat emphatically in both hands, “the law is a ass- a idiot.”

I do disagree with Ernesto’s characterization of users of Sci-Hub as “academic pirates.”

Elsevier and others have fitted their business model to a system of laws that exploits the unpaid labor of academics, based on research funded by the public, profiting from sales to libraries and preventing wider access out of spite.

There is piracy going on in academic publishing but it isn’t on the part of those seeking to access published research.

Please share access points for Sci-Hub widely and often.

Indigo is the new Blue

Wednesday, April 20th, 2016

Letter from Carl Malamud to Mr. Michael Zuckerman, Harvard Law Review Association.

You can read Carl’s letter for yourself.

Recommend to law students, law professors, judges, lawyers, people practicing for Jeopardy appearances, etc., The Indigo Book: An Open and Compatible Implementation of A Uniform System of Citation.

In any pleadings, briefs, essays, cite this resource as:

Sprigman et al., The Indigo Book: A Manual of Legal Citation, Public Resource (2016).

Every download of the Indigo Book saves someone $25.89 over a competing work on Amazon, which I won’t name for copyright reasons.

Takedowns Hurt Free Expression

Friday, April 1st, 2016

EFF to Copyright Office: Improper Content Takedowns Hurt Online Free Expression.

From the post:

Content takedowns based on unfounded copyright claims are hurting online free expression, the Electronic Frontier Foundation (EFF) told the U.S. Copyright Office Friday, arguing that any reform of the Digital Millennium Copyright Act (DMCA) should focus on protecting Internet speech and creativity.

EFF’s written comments were filed as part of a series of studies on the effectiveness of the DMCA, begun by the Copyright Office this year. This round of public comments focuses on Section 512, which provides a notice-and-takedown process for addressing online copyright infringement, as well as “safe harbors” for Internet services that comply.

“One of the central questions of the study is whether the safe harbors are working as intended, and the answer is largely yes,” said EFF Legal Director Corynne McSherry. “The safe harbors were supposed to give rightsholders streamlined tools to police infringement, and give service providers clear rules so they could avoid liability for the potentially infringing acts of their users. Without those safe harbors, the Internet as we know it simply wouldn’t exist, and our ability to create, innovate, and share ideas would suffer.”

As EFF also notes in its comments, however, the notice-and-takedown process is often abused. A recent report found that the notice-and-takedown system is riddled with errors, misuse, and overreach, leaving much legal and legitimate content offline. EFF’s comments describe numerous examples of bad takedowns, including many that seemed based on automated content filters employed by the major online content sharing services. In Friday’s comments, EFF outlined parameters endorsed by many public interest groups to rein in filtering technologies and protect users from unfounded blocks and takedowns.

A must read whether you are interested in pursuing traditional relief or have more immediate consequences for rightsholders in mind.

Takedowns cry out for the application of data mining to identify the people who pursue takedowns, the use of takedowns, who benefits, to say nothing of the bots that are presently prowling the web looking for new victims.

I for one don’t imagine that rightsholders’ bots are better written than most government software (you did hear about State’s latest vulnerability?).

Sharpening your data skills on takedown data would benefit you and the public, which is being sorely abused at the moment.
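
If you want a concrete place to start, here is a minimal sketch that tallies takedown notices per sender and each sender’s most-targeted domains from a CSV export. The file name and column names are assumptions for illustration, not any real dataset’s schema; the Lumen database is one public source of notice data you could adapt something like this to.

```python
# Sketch: aggregate takedown notices by sender to see who files the most
# notices and which domains they target. The CSV layout here is hypothetical.
import csv
from collections import Counter, defaultdict

def summarize_notices(path):
    by_sender = Counter()
    targets = defaultdict(Counter)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            sender = (row.get("sender_name") or "unknown").strip()
            domain = (row.get("target_domain") or "unknown").strip()
            by_sender[sender] += 1
            targets[sender][domain] += 1
    return by_sender, targets

if __name__ == "__main__":
    senders, targets = summarize_notices("takedown_notices.csv")
    for sender, count in senders.most_common(10):
        top_domain, hits = targets[sender].most_common(1)[0]
        print(f"{sender}: {count} notices, most-targeted domain {top_domain} ({hits})")
```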

Takedown Bots – Make It Personal

Tuesday, March 29th, 2016

Carl Malamud tweeted on 29 March 2016:

Hate takedown bots, both human and coded. If you’re going to accuse somebody of theft, you should make it personal.

in retweeting:

Mitch Stoltz (@mitchstoltz):

How takedown-bots are censoring the web. https://www.washingtonpost.com/news/the-intersect/wp/2016/03/29/how-were-unwittingly-letting-robots-censor-the-web/

Carl has the right of it.

Users should make the use of takedown notices very personal.

After all, illegitimate takedown notices are thefts from the public domain and/or fair use.

Caitlin Dewey‘s How we’re unwittingly letting robots censor the Web is a great non-technical piece on the fuller report, Notice and Takedown in Everyday Practice.

Jennifer M. Urban, University of California, Berkeley – School of Law, Brianna L. Schofield, University of California, Berkeley – School of Law, and Joe Karaganis, Columbia University – The American Assembly, penned this abstract:

It has been nearly twenty years since section 512 of the Digital Millennium Copyright Act established the so-called notice and takedown process. Despite its importance to copyright holders, online service providers, and Internet speakers, very little empirical research has been done on how effective section 512 is for addressing copyright infringement, spurring online service provider development, or providing due process for notice targets.

This report includes three studies that draw back the curtain on notice and takedown:

1. using detailed surveys and interviews with more than three dozen respondents, the first study gathers information on how online service providers and rightsholders experience and practice notice and takedown on a day-to-day basis;

2. the second study examines a random sample from over 100 million notices generated during a six-month period to see who is sending notices, why, and whether they are valid takedown requests; and

3. the third study looks specifically at a subset of those notices that were sent to Google Image Search.

The findings suggest that whether notice and takedown “works” is highly dependent on who is using it and how it is practiced, though all respondents agreed that the Section 512 safe harbors remain fundamental to the online ecosystem. Perhaps surprisingly in light of large-scale online infringement, a large portion of OSPs still receive relatively few notices and process them by hand. For some major players, however, the scale of online infringement has led to automated, “bot”-based systems that leave little room for human review or discretion, and in a few cases notice and takedown has been abandoned in favor of techniques such as content filtering. The second and third studies revealed surprisingly high percentages of notices of questionable validity, with mistakes made by both “bots” and humans.

The findings strongly suggest that the notice and takedown system is important, under strain, and that there is no “one size fits all” approach to improving it. Based on the findings, we suggest a variety of reforms to law and practice.

At 160 pages it isn’t a quick or light read.

The gist of both Caitlin’s post and the fuller report is that automated systems are increasingly being used to create and enforce take down requests.

Despite the margin of reported error, Caitlin notes:

Despite the margin of error, most major players seem to be trending away from human review. The next frontier in the online copyright wars is automated filtering: Many rights-holders have pressed for tools that, like YouTube’s Content ID, could automatically identify protected content and prevent it from ever publishing. They’ve also pushed for “staydown” measures that would keep content from being reposted once it’s been removed, a major complaint with the current system.

There is one source Caitlin uses:

…agreed to speak to The Post on condition of anonymity because he has received death threats over his work, said that while his company stresses accuracy and fairness, it’s impossible for seven employees to vet each of the 90,000 links their search spider finds each day. Instead, the algorithm classifies each link as questionable, probable or definite infringement, and humans only review the questionable ones before sending packets of takedown requests to social networks, search engines, file-hosting sites and other online platforms.

Copyright enforcers should discover that their thefts from the public domain and infringements of fair use put them on a par with car burglars or shoplifters.

What copyright enforcers lack is an incentive to err on the side of not issuing questionable takedown notices.

If the consequences of illegitimate takedown notices are high enough, they will spend the funds necessary to enforce only “legitimate” rights.

If you are interested in righteousness over effectiveness, by all means pursue reform of “notice and takedown” in the copyright-holder-owned US Congress.

On the other hand, someone, more than a single someone, is responsible for honoring “notice and takedown” requests. Those someones also own members of Congress and can effectively seek changes that victims of illegitimate takedown requests cannot.

Imagine a leak from Yahoo! that outs those responsible for honoring “notice and takedown” requests.

Or the members of “Google’s Trusted Copyright Removal Program.” Besides “Glass.”

Or the takedown requests for YouTube.

Theft from the public cannot be sustained in the bright light of transparency.

Patent Sickness Spreads [Open Source Projects on Prior Art?]

Tuesday, March 8th, 2016

James Cook reports a new occurrence of patent sickness in Facebook has an idea for software that detects cool new slang before it goes mainstream.

The most helpful part of James’ post is the graphic outline of the “process” patented by Facebook:

[Graphic: outline of the process claimed in the Facebook patent]

I sure do hope James has not patented that presentation because it makes the Facebook patent, err, clear.

Quick show of hands on originality?

While researching this post, I ran across Open Source as Prior Art at the Linux Foundation. Are there other public projects that research and post prior art with regard to particular patents?

An armory of weapons for opposing ill-advised patents.

The Facebook patent is: 9,280,534 Hauser, et al. March 8, 2016, Generating a social glossary:

Its abstract:

Particular embodiments determine that a textual term is not associated with a known meaning. The textual term may be related to one or more users of the social-networking system. A determination is made as to whether the textual term should be added to a glossary. If so, then the textual term is added to the glossary. Information related to one or more textual terms in the glossary is provided to enhance auto-correction, provide predictive text input suggestions, or augment social graph data. Particular embodiments discover new textual terms by mining information, wherein the information was received from one or more users of the social-networking system, was generated for one or more users of the social-networking system, is marked as being associated with one or more users of the social-networking system, or includes an identifier for each of one or more users of the social-networking system. (emphasis in original)
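
For a sense of how thin the claimed invention is, here is a minimal sketch of the core step: detect a textual term with no known meaning and add it to a glossary once it appears often enough. The vocabulary, sample posts, and threshold are invented; this is not Facebook’s code.

```python
# Sketch: flag terms missing from a known vocabulary and "add them to a
# glossary" once they appear often enough. All data here is made up.
import re
from collections import Counter

KNOWN_WORDS = {"the", "a", "that", "was", "want", "best", "friend", "photo", "party", "cool"}
MIN_OCCURRENCES = 3  # hypothetical cutoff for treating a term as new slang

def update_glossary(posts, glossary):
    """Count terms with no known meaning; add frequent ones to the glossary."""
    counts = Counter()
    for post in posts:
        for term in re.findall(r"[a-z']+", post.lower()):
            if term not in KNOWN_WORDS and term not in glossary:
                counts[term] += 1
    for term, n in counts.items():
        if n >= MIN_OCCURRENCES:
            glossary[term] = {"occurrences": n}
    return glossary

posts = ["that party was lowkey cool", "lowkey want a photo", "lowkey the best friend"]
print(update_glossary(posts, {}))  # {'lowkey': {'occurrences': 3}}
```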

U.S. Patents Requirements: Novel/Non-Obvious or Patent Fee?

Monday, February 22nd, 2016

IBM brags about its ranking in patents granted, IBM First in Patents for 23rd Consecutive Year, and is particularly proud of patent 9087304, saying:

We’ve all been served up search results we weren’t sure about, whether they were for “the best tacos in town” or “how to tell if your dog has eaten chocolate.” With IBM Patent no. 9087304, you no longer have to second-guess the answers you’re given. This new tech helps cognitive machines find the best potential answers to your questions by thinking critically about the trustworthiness and accuracy of each source. Simply put, these machines can use their own judgment to separate the right information from wrong. (From: http://ibmblr.tumblr.com/post/139624929596/weve-all-been-served-up-search-results-we-werent)

Did you notice that the 1st for 23 years post did not have a single link for any of the patents mentioned?

You would think IBM would be proud enough to link to its new patents and especially 9087304, that “…separate[s] right information from wrong.”

But if you follow the link for 9087304, you get an impression of one reason IBM didn’t include the link.

The abstract for 9087304 reads:

Method, computer program product, and system to perform an operation for a deep question answering system. The operation begins by computing a concept score for a first concept in a first case received by the deep question answering system, the concept score being based on a machine learning concept model for the first concept. The operation then excludes the first concept from consideration when analyzing a candidate answer and an item of supporting evidence to generate a response to the first case upon determining that the concept score does not exceed a predefined concept minimum weight threshold. The operation then increases a weight applied to the first concept when analyzing the candidate answer and the item of supporting evidence to generate the response to the first case when the concept score exceeds a predefined maximum weight threshold.

I will spare you further recitations from the patent.
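
Stripped of the jargon, the claimed operation reduces to two threshold comparisons on a concept score. A rough sketch, with invented scores and thresholds, just to make the point:

```python
# Sketch: the claimed logic amounts to gating a concept score against a
# minimum and a maximum threshold. All values here are invented.
MIN_WEIGHT_THRESHOLD = 0.2
MAX_WEIGHT_THRESHOLD = 0.8

def apply_concept(concept_score, base_weight=1.0, boost=1.5):
    """Exclude a concept below the minimum; weight it more heavily above the maximum."""
    if concept_score <= MIN_WEIGHT_THRESHOLD:
        return None                   # concept excluded from answer scoring
    if concept_score > MAX_WEIGHT_THRESHOLD:
        return base_weight * boost    # increased weight for the concept
    return base_weight                # otherwise, the normal weight

for score in (0.1, 0.5, 0.9):
    print(score, apply_concept(score))   # None, 1.0, 1.5
```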

Show of hands, do U.S. Patents always require:

  1. novel/non-obvious ideas
  2. patent fee
  3. #2 but not #1

?

Judge rankings by # of patents granted accordingly.

How Much Can paragraph -> subparagraph mean? Lots under TPP!

Thursday, February 18th, 2016

Sneaky Change to the TPP Drastically Extends Criminal Penalties by Jeremy Malcolm.

From the post:


What does this surreptitious change from “paragraph” to “subparagraph” mean? Well, in its original form the provision exempted a country from making available any of the criminal procedures and penalties listed above, except in circumstances where there was an impact on the copyright holder’s ability to exploit their work in the market.

In its revised form, the only criminal provision that a country is exempted from applying in those circumstances is the one to which the footnote is attached—namely, the ex officio action provision. Which means, under this amendment, all of the other criminal procedures and penalties must be available even if the infringement has absolutely no impact on the right holder’s ability to exploit their work in the market. The only enforcement provision that countries have the flexibility to withhold in such cases is the authority of state officials to take legal action into their own hands.

Sneaky, huh?

The United States Trade Representative (USTR) isn’t representing your interests or mine in the drafting of the TPP.

If you had any doubt in that regard, Jeremy’s post on this change and others should remove all doubt in that regard.

BMG Seeks to Violate Privacy Rights – Cox Refuses to Aid and Abet

Monday, February 15th, 2016

Cox Refuses to Spy on Subscribers to Catch Pirates by Ernesto Van der Sar.

From the post:

Last December a Virginia federal jury ruled that Internet provider Cox Communications was responsible for the copyright infringements of its subscribers.

The ISP was found guilty of willful contributory copyright infringement and must pay music publisher BMG Rights Management $25 million in damages.

The verdict was a massive victory for the music company and a disaster for Cox, but the case is not closed yet.

A few weeks ago BMG asked the court to issue a permanent injunction against Cox Communications, requiring the Internet provider to terminate the accounts of pirating subscribers and share their details with the copyright holder.

In addition BMG wants the Internet provider to take further action to prevent infringements on its network. While the company remained vague on the specifics, it mentioned the option of using invasive deep packet inspection technology.

Last Friday, Cox filed a reply pointing out why BMG’s demands go too far, rejecting the suggestion of broad spying and account termination without due process.

“To the extent the injunction requires either termination or surveillance, it imposes undue hardships on Cox, both because the order is vague and because it imposes disproportionate, intrusive, and punitive measures against households and businesses with no due process,” Cox writes (pdf).

Read the rest of Ernesto’s post for sure but here’s a quick summary:

Cox.com is spending money to protect your privacy.

I don’t live in a Cox service area but if you do, sign up with Cox and say their opposition to BMG is driving your new subscription. Positive support always rings louder than protesters with signs and litter.

BMG.com is spending money to violate your privacy.

BMG is a subsidiary of Bertelsmann, which claims 112,037 employees.

I wonder how many of those employees have signed off on the overreaching and abusive positions of BMG?

Perhaps members of the public oppressed by BMG and/or Bertelsmann should seek them out to reason with them.

Bearing in mind that “rights” depend upon rules you choose to govern your discussions/actions.

Google Paywall Loophole Going Bye-Bye [Fair Use Driving Pay-Per-View Traffic]

Wednesday, February 3rd, 2016

The Wall Street Journal tests closing the Google paywall loophole by Lucia Moses.

From the post:

The Wall Street Journal has long had a strict paywall — unless you simply copy and paste the headline into Google, a favored route for those not wanting to pony up $200 a year. Some users have noticed in recent days that the trick isn’t working.

A Journal spokesperson said the publisher was running a test to see if doing so would entice would-be subscribers to pay up. The rep wouldn’t elaborate on how long and extensive the experiment was and if permanently closing the loophole was a possible outcome.

“We are experimenting with a number of different trial mechanics at the moment to provide a better subscription taster for potential new customers,” the rep said. “We are a subscription site and we are always looking at better ways to optimize The Wall Street Journal experience for our members.”

The Wall Street Journal can deprive itself of the benefits of “fair use” if it wants to, but is that a sensible position?

Fair Use Benefits the Wall Street Journal

Rather than a total ban on copying, what if the amount of an article that can be copied were set by algorithm, such that, at a minimum, the first two or three paragraphs of any story can be copied, whether you arrive from Google or directly on the WSJ site?

Think about it. Wall Street Journal readers aren’t paying to skim the lead paragraphs in the WSJ. They are paying to see the full story and analysis in particular subject areas.

Bloggers, such as myself, cannot drive content seekers to the WSJ because the first sentence or two isn’t enough for readers to develop an interest in the WSJ report.

If I could quote the first 2 or 3 paragraphs, add in some commentary and perhaps other links, then a visitor to the WSJ is visiting to see the full content the Wall Street Journal has to offer.

The story lead is acting, as it should, to drive traffic to the Wall Street Journal, possibly from readers who won’t otherwise think of the Wall Street Journal. Some of my readers on non-American/European continents for example.
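
A minimal sketch of such an excerpt policy, serving the first few paragraphs to everyone and the full story to subscribers, is below. The paragraph cutoff and the subscriber check are assumptions for illustration, not anything the WSJ actually runs.

```python
# Sketch: serve a fair-use excerpt (first N paragraphs) to non-subscribers,
# regardless of referrer, and the full article to subscribers.
FREE_PARAGRAPHS = 3  # arbitrary cutoff for the fair-use excerpt

def render_article(paragraphs, is_subscriber):
    if is_subscriber:
        return "\n\n".join(paragraphs)
    excerpt = paragraphs[:FREE_PARAGRAPHS]          # slice copies, original untouched
    excerpt.append("[Subscribe to read the full story.]")
    return "\n\n".join(excerpt)

article = ["Lead paragraph.", "Second paragraph.", "Third paragraph.", "Full analysis..."]
print(render_article(article, is_subscriber=False))
```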

Bloggers Driving Readers to Wall Street Journal Pay-Per-View Content

Developing algorithmic fair use as I describe would enlist an army of bloggers in spreading notice of pay-per-view content of the Wall Street Journal, at no expense to the Wall Street Journal. As a matter of fact, bloggers would be alerting readers to pay-per-view WSJ content at their own expense.

It may just be me but if someone were going to drive viewers to pay-per-view content on my site, at their own expense, with fair use of content, I would be insane to prevent that. But, I’m not the one grasping at dimes while $100 bills are flying overhead.

Close the Loophole, Open Up Fair Use

Full disclosure, I don’t have any evidence for fair use driving traffic to the Wall Street Journal because that evidence doesn’t exist. The Wall Street Journal would have to enable fair use and track appearance of fair use content and the traffic originating from it. Along with conversions from that additional traffic.

Straightforward data analytics, but it won’t happen by itself. When the WSJ succeeds with such a model, you can be sure that other paywall publishers will be quick to follow suit.

Caveat: Yes, there will be people who will only ever consume the free use content. And your question? If they aren’t ever going to be paying customers and the same fair use is delivering paying customers, will you lose the latter in order to spite the former?

Isn’t that like cutting off your nose to spite your face?

Historical PS:

I once worked for a publisher that felt a "moral obligation," their words, not mine, to prevent anyone from claiming a missing journal issue to which they might not be entitled. Yeah. Journal issues that were about as popular as the Watchtower is among non-Jehovah's Witnesses. Cost to the publisher: about $3.00 per issue. Cost to verify entitlement: a full-time position at the publisher.

I suspect claims ran to fewer than 200 per year. My suggestion was to answer any request with "Thanks, here's your missing copy." End of transaction. Track claims only to prevent abuse. Moral outrage followed.

Is morality the basis for your pay-per-view access policy? I thought pay-per-view was a way to make money.

Pass this post along to the WSJ if you know anyone there. Free suggestion. Perhaps they will be interested in other, non-free suggestions.

Can You Help With an Important But Non-Visual Story? – The Blue People

Thursday, January 14th, 2016

Accelerate Your Newsgathering and Verification reported on a post in which 3 out of 5 newsgathering tools were for images. But as I mentioned there, there are important but non-visual stories that need improved tools for newsgathering and verification.

The copyright struggle between the Blue People and Carl Malamud is an important, but thus far, non-visual story.

Here’s the story in a nutshell:

Laws, court decisions, agency rulings, etc., that govern our daily lives are found in complex document stores. Those stores have complex citation systems to enable anyone to find a particular law, decision, or rule.

Those systems are like the Dewey Decimal system or the Library of Congress classification, except several orders of magnitude more complex. And the systems vary from state to state, etc.

It's important to get citations right. Well, let's let the BlueBook speak for itself:

The primary purpose of a citation is to facilitate finding and identifying the authority cited…. (A Uniform System of Citation, Tenth Edition, page iv.)

If you are going to quote a law or have access to it, you must have the correct citation.

In order to compel people to obey the law, they must have fair notice of it. And it stands to reason that if you can't find the law because you have no access to a citation guide, you are SOL as far as access to the law goes.

The courts come into the picture, being as lazy as, if not lazier than, programmers, by referring to the "BlueBook" as the standard for citations. Courts could have written out their citation practices, but as I said, courts are lazy.

Over time, the courts enshrined their references to the "BlueBook" in court rules, which grants the "BlueBook" an informal monopoly on legal citations and on access to the law.

As you have guessed by now, the Blue People, with their government-created, unregulated monopoly, charge for the privilege of knowing how to find the law.

The Blue People are quite fond of their monopoly and are loath to relinquish it, even though a compilation of how statutes, regulations, and court decisions are in fact cited is "sweat of the brow" work and not eligible for copyright protection.

A Possible Solution, Based on Capturing Public Facts

The answer to the Blue People's claims of copyright is to collect evidence of the citation practices in all fifty states and in federal practice, and to publish that evidence along with advisory comments on usage.

Fifty law students or law librarians could accomplish the task in parallel using modern search technologies and legal databases. Their findings would need to be collated, but once that was done, every state's practice plus federal practice, nuances included, would be easily accessible to anyone.
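
As a rough starting point, and with the reporter list and regular expression below being my own assumptions that would need refinement against real opinions, harvesting evidence of how courts actually cite cases could begin with something like this:

# Hypothetical sketch: tally the reporter-citation forms that actually appear
# in an opinion, as evidence of citation practice. The reporter list and regex
# are rough assumptions and would need refinement against real opinions.
import re
from collections import Counter

REPORTERS = r"(?:U\. ?S\.|S\. ?Ct\.|F\.2d|F\.3d|F\. Supp\. 2d)"
CITATION = re.compile(r"\b\d{1,4}\s+(" + REPORTERS + r")\s+\d{1,4}\b")

def tally_citation_forms(opinion_text: str) -> Counter:
    """Count how often each reporter form appears in the text of an opinion."""
    return Counter(m.group(1) for m in CITATION.finditer(opinion_text))

sample = ("See Holder v. Humanitarian Law Project, 561 U. S. 1 (2010); "
          "see also 130 S. Ct. 2705.")
print(tally_citation_forms(sample))  # e.g. Counter({'U. S.': 1, 'S. Ct.': 1})

Scaled up over the opinions in a legal database, tallies like these would be exactly the kind of public facts of citation practice described above.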

The courts, as practitioners of precedent, will continue to support their self-created BlueBook monopoly.

But most judges will have difficulty distinguishing Holder, Attorney General, et al. v. Humanitarian Law Project et al., 561 U. S. 1 (2010) (following the BlueBook) from Holder, Attorney General, et al. v. Humanitarian Law Project et al., 561 U. S. 1 (2010) (following the U.S. Supreme Court and/or some record of how cases are cited by the U.S. Supreme Court).

If you are in the legal profession or aspire to be, don’t forget Jonathan Swift’s observation in Gulliver’s Travels:

It is a maxim among these lawyers that whatever has been done before, may legally be done again: and therefore they take special care to record all the decisions formerly made against common justice, and the general reason of mankind. These, under the name of precedents, they produce as authorities to justify the most iniquitous opinions; and the judges never fail of directing accordingly.

The inability of courts to distinguish between “BlueBook” and “non-BlueBook” citations will over time render their observance of precedent a nullity.

Not as satisfying as riding them and the Blue People down with war horns blowing but just as effective.

The Need For Visuals

If you have read this far, you obviously don't need visuals to keep your interest in a story, particularly a story about access to law and similarly exciting topics. It is an important topic, just not one that really gets your blood pumping.

How would you create visuals to promote public access to the laws that govern our day-to-day lives?

I'm no artist, but one thought would be to show people trying to consult law books that are chained shut by their citations. Or perhaps one or two of the identifiable Blue People as Jacob Marley-type figures, with bound law books and heavy chains about them?

The “…could have shared…might have shared…” lines would work well with access to legal materials.

Ping me with suggested images. Thanks!

Law as Pay-to-Play – ASTM International vs. Public.Resource.org, Inc.

Tuesday, January 12th, 2016

Carl Malamud has been hitting Twitter hard today as he posts links to new materials in ASTM International vs. Public.Resource.org, Inc. (case docket).

The crux of the case is whether a legal authority, like the United States, can pass a law that requires citizens to buy materials from private organizations, in order to know what the law says.

That is, a law will cite a standard, say one from ASTM, and you are bound by the terms of that law, which aren't clear unless you have a copy of the standard from ASTM. ASTM will be more than happy to sell you a copy.

It’s interesting that ASTM, which has reasonable membership fees of $75 a year, would be the lead plaintiff in this case.

There are technical committees associated with ANSI that have membership fees of $1,200 or more per year. And that is the lowest membership category.

I deeply enjoyed Carl’s tweet that described the ANSI amicus brief as “the sky is falling.”

No doubt, from ANSI's perspective, if Public.Resource.org, Inc. prevails, which it should under any sensible notice-of-the-law reasoning, the sky will be falling.

ANSI and its kin profit by creating a closed club of well-heeled vendors who can pay for early access and participate in development of standards.

You have heard the term "white privilege"? In the briefs for ASTM and its friends, you will realize how deeply entrenched "corporate privilege" is in the United States. The ANSI brief is basically "this is how we do it and it works for us, go away." No sense of the other at all.

There is a running implication that standards development organizations (SDOs) have to sell copies of standards to support their standards activity. At least on a quick skim, I haven't seen any documentation on that point. In fact, the W3C, which produces a large number of standards, seems to do OK giving its standards away for free.

I can't help but wonder how the presiding judge will react should a data leak from one of the plaintiffs prove that the "sale of standards" is entirely specious from a financial perspective. That is, that membership, the "pay-to-play," is really the deciding factor.

That doesn't strengthen or weaken the public-notice-of-the-law argument, but I do think it is a good indication of the character of the plaintiffs and the lengths they are willing to go to in order to preserve corporate privilege.

In case you are still guessing, I’m on the side of Public.Resource.org.