Another Word For It Patrick Durusau on Topic Maps and Semantic Diversity

November 7, 2014

British intelligence spies on lawyer-client communications, government admits

Filed under: Cybersecurity,Government,Privacy,Security — Patrick Durusau @ 3:11 pm

British intelligence spies on lawyer-client communications, government admits by David Meyer.

From the post:

After the Snowden leaks, British lawyers expressed fears that the government’s mass surveillance efforts could undermine the confidentiality of their conversations with clients, particularly when those clients were engaged in legal battles with the state. Those fears were well-founded.

On Thursday the legal charity Reprieve, which provides assistance to people accused of terrorism, U.S. death row prisoners and so on, said it had succeeded in getting the U.K. government to admit that spy agencies tell their staff they may target and use lawyer-client communications “just like any other item of intelligence.” This is despite the fact that both English common law and the European Court of Human Rights protect legal professional privilege as a fundamental principle of justice.

See David’s post for the full details.

The dividends from 9/11 continue. One substantial terrorist attack and the United States, the United Kingdom and a number of other countries are in headlong flight from their constitutions and their traditions of individual liberty from government intrusion.

Given the lack of terrorist attacks in the United States following 9/11, either the United States isn’t on maps used by terrorists or they can’t afford a plane ticket to the US. I don’t consider the underwear bomber so much a terrorist as a sad follower who wanted to be a terrorist. If that’s the best they have, we are in no real danger.

What the terrorism debate needs is a public airing of credible risks and strategies for addressing those risks. The secret abandonment of centuries of legal tradition because government functionaries lack the imagination to combat common criminals is inexcusable.

Citizens are in far more danger from their governments than any known terrorist organization. Perhaps that was the goal of 9/11. If so, it was the most successful attack in human history.

September 17, 2014

ODNI and the U.S. DOJ Commemorate 9/11

Filed under: Privacy,Security — Patrick Durusau @ 9:27 am

Statement by the ODNI and the U.S. DOJ on the Declassification of Documents Related to the Protect America Act Litigation September 11, 2014

What better way to mark the anniversary of 9/11 than with a fuller account of another attack on the United States of America and its citizens. This attack came not from a small band of criminals but from a betrayal of the United States by those sworn to protect the rights of its citizens.

From the post:

On January 15, 2009, the U.S. Foreign Intelligence Surveillance Court of Review (FISC-R) published an unclassified version of its opinion in In Re: Directives Pursuant to Section 105B of the Foreign Intelligence Surveillance Act, 551 F.3d 1004 (Foreign Intel. Surv. Ct. Rev. 2008). The classified version of the opinion was issued on August 22, 2008, following a challenge by Yahoo! Inc. (Yahoo!) to directives issued under the Protect America Act of 2007 (PAA). Today, following a renewed declassification review, the Executive Branch is publicly releasing various documents from this litigation, including legal briefs and additional sections of the 2008 FISC-R opinion, with appropriate redactions to protect national security information. These documents are available at the website of the Office of the Director of National Intelligence (ODNI), www.dni.gov; and ODNI’s public website dedicated to fostering greater public visibility into the intelligence activities of the U.S. Government, IContheRecord.tumblr.com. A summary of the underlying litigation follows.

In case you haven’t been following along, the crux of the case was Yahoo’s refusal on Fourth Amendment grounds to comply with a fishing expedition by the Director of National Intelligence and the Attorney General for information on one or more alleged foreign nationals. Motion to Compel Compliance with Directives of the Director of National Intelligence and Attorney General.

Not satisfied with violating their duties to uphold the Constitution, the DNI and AG decided to add strong-arming/extortion to their list of crimes. The government sought civil contempt fines that started at $250,000 per day and doubled each week thereafter that Yahoo! failed to comply with the court’s judgment. Government’s Motion for an Order of Civil Contempt.
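To get a sense of the scale of that pressure, here is a back-of-the-envelope sketch of the doubling schedule just described ($250,000 per day, doubling weekly); the precise accrual terms are in the government’s motion:

```python
# Back-of-the-envelope sketch of the contempt fine schedule described
# above: $250,000 per day, with the daily rate doubling each week of
# continued non-compliance.
def cumulative_fine(weeks: int, daily_start: int = 250_000) -> int:
    """Total fine after `weeks` full weeks of non-compliance."""
    total = 0
    daily = daily_start
    for _ in range(weeks):
        total += daily * 7  # seven days at the current daily rate
        daily *= 2          # rate doubles for the following week
    return total

for w in (1, 2, 4, 8):
    print(f"after week {w}: ${cumulative_fine(w):,}")
```

By this schedule the total passes $26 million after four weeks and $400 million after eight, which makes the “strong-arming” characterization easy to understand.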

Take care to note that all of this occurred in absolute secrecy. It would not do to have other corporations or the American public aware that rogue elements in the government were deciding which rights citizens of the United States enjoy and which ones they don’t.

You may also want to read Highlights from the Newly Declassified FISCR Documents by Marc Zwillinger and Jacob Sommer. They are the lawyers who represented Yahoo in the challenge covered by the released documents.

We all owe them a debt of gratitude for their hard work but we also have to acknowledge that Yahoo, Zwillinger and Sommer were complicit in enabling the Foreign Intelligence Surveillance Court (FISC) and the Foreign Intelligence Court of Review (FISCR) to continue their secret work.

Yes, Yahoo, Zwillinger and Sommer would have faced life-changing consequences had they gone public with what they did know, but everyone has a choice when faced with oppressive government action. You can comply, as the parties did in this case, and further the popular fiction that mining user metadata is an effective (rather than convenient) tool against terrorism.

Or you can decide to “blow the whistle” on wasteful and illegal activities by the government in question.

Had Yahoo, Zwillinger or Sommer known of any data in their possession that was direct evidence of a terrorist attack or plot, they would have called the Office of the Director of National Intelligence, or at least their local FBI office. Yes? Wouldn’t any sane person do the same?

Ah, but you see, that’s the point. There wasn’t any such data. Not then. Not now. Read the affidavits, at least the parts that aren’t blacked out, and you get the distinct impression that the government is not only fishing, but it is hopeful fishing. “There might be something, somewhere, that somehow might be useful to somebody, but we don’t know” is a fair summary of the government’s position in the Yahoo case.

A better way to commemorate 9/11 next year would be with numerous brave souls taking the moral responsibility to denounce those who have betrayed their constitutional duties in the cause of fighting terrorism. I prefer occasional terrorism over the destruction of the Constitution of the United States.

You?

I started the trail that led to this post from a tweet by Ben Gilbert.

September 4, 2014

Celebrity Nudes: Blaming and Shaming

Filed under: Privacy,Security — Patrick Durusau @ 10:58 am

Violet Blue, in Wake up: The celebrity nudes hack is everyone’s problem, follows her ten steps for victims to protect themselves with:

Telling victims that they “shouldn’t have done it” or “what did you expect” is pointless. Instead of blaming and shaming, how about some information people can really use to help them make the decisions that are right for them, and equipping them with tools to mitigate, minimize and even possibly avoid damage if something goes wrong?

Which is deeply ironic because both Violet Blue and a number of the comments blame/shame Apple for the security breach.

Blaming and shaming IT companies for security breaches is about as non-productive as any blaming and shaming can be.

As you probably know already, security breaches are not viewed as promotional opportunities, at least by the companies suffering the security breach.

Missing from most discussions of the hacked iCloud accounts are questions like:

  • How to improve iCloud security?
  • What improved security will cost?
  • Who will pay the cost (including inconvenience) of improved iCloud security?
  • …(and other issues)

Violet’s ten steps to help people protect themselves are OK, but if highly trained and security-conscious administrators shared passwords with Edward Snowden, a violation of basic password security, lots of luck getting anyone to follow Violet’s ten rules.

Blaming and shaming IT companies for security breaches may play well to crowds, but it doesn’t get us any closer to solving security issues either from a technical (coding/system/authentication) or social (cost/inconvenience allocation) perspective.

PS: Perhaps Apple should have a warning on uploads to iCloud:

Digital data, such as iPhone photos, are at risk of being stolen and misused by others. Uploading/sharing/emailing digital data increases that risk exponentially. YHBW

July 25, 2014

Pussy Stalking [Geo-Location as merge point]

Filed under: Data Mining,Merging,Privacy — Patrick Durusau @ 12:45 pm

Cat stalker knows where your kitty lives (and it’s your fault) by Lisa Vaas.

From the post:

Ever posted a picture of your cat online?

Unless your privacy settings avoid making APIs publicly available on sites like Flickr, Twitpic, Instagram or the like, there’s a cat stalker who knows where your liddl’ puddin’ lives, and he’s totally pwned your pussy by geolocating it.

That’s right, fat-faced grey one from Al Khobar in Saudi Arabia, Owen Mundy knows you live on Tabari Street.

[Image: cat stalker]

Mundy, a data analyst, artist, and Associate Professor in the Department of Art at Florida State University, has been working on the data visualisation project, which is called I Know Where Your Cat Lives.
….

See Lisa’s post for the details about the “I Know Where Your Cat Lives” project.

The same data leakage is found in other types of photographs as well, such as photographs by military personnel.

An enterprising collector could use geolocation as a merge point to collect all the photos made at a particular location. Or using geolocation ask “who?” for some location X.

Or perhaps a city map using geolocated images to ask “who?” Everyone may not know your name but with a large enough base of users, someone will.
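A minimal sketch of geolocation as a merge point, assuming the photos carry standard EXIF GPS tags (latitude and longitude stored as degree/minute/second values plus a hemisphere reference); the photo records below are invented for illustration:

```python
from collections import defaultdict

def dms_to_decimal(dms, ref):
    """Convert an EXIF-style (degrees, minutes, seconds) triple plus a
    hemisphere reference ('N'/'S'/'E'/'W') to decimal degrees."""
    d, m, s = dms
    dec = d + m / 60 + s / 3600
    return -dec if ref in ("S", "W") else dec

def merge_key(lat_dms, lat_ref, lon_dms, lon_ref, precision=3):
    """Round to roughly 100 m so photos taken near one another share a key."""
    return (round(dms_to_decimal(lat_dms, lat_ref), precision),
            round(dms_to_decimal(lon_dms, lon_ref), precision))

# Hypothetical records: (photo_id, lat_dms, lat_ref, lon_dms, lon_ref)
photos = [
    ("cat_001.jpg", (26, 17, 2.4), "N", (50, 12, 18.0), "E"),
    ("cat_002.jpg", (26, 17, 2.6), "N", (50, 12, 18.2), "E"),
    ("base_003.jpg", (30, 26, 0.0), "N", (84, 16, 48.0), "W"),
]

by_location = defaultdict(list)
for pid, lat, lat_ref, lon, lon_ref in photos:
    by_location[merge_key(lat, lat_ref, lon, lon_ref)].append(pid)

# Photos taken at (nearly) the same spot now share one merge key,
# answering "who or what was photographed at location X?"
for key, ids in by_location.items():
    print(key, ids)
```

In practice the degree/minute/second triples would come from a photo’s EXIF block via an EXIF-reading library rather than being typed in, but the merge step is the same.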

PS: There is at least one app for facial recognition, NameTag. I don’t have a cellphone so you will have to comment on how well it works. I still like the idea of a “who?” site. Perhaps because I prefer human intell over data vacuuming.

June 25, 2014

On Taxis and Rainbows

Filed under: Data,Privacy — Patrick Durusau @ 4:06 pm

On Taxis and Rainbows: Lessons from NYC’s improperly anonymized taxi logs by Vijay Pandurangan.

From the post:

Recently, thanks to a Freedom of Information request, Chris Whong received and made public a complete dump of historical trip and fare logs from NYC taxis. It’s pretty incredible: there are over 20GB of uncompressed data comprising more than 173 million individual trips. Each trip record includes the pickup and dropoff location and time, anonymized hack licence number and medallion number (i.e. the taxi’s unique id number, 3F38, in my photo above), and other metadata.

These data are a veritable trove for people who love cities, transit, and data visualization. But there’s a big problem: the personally identifiable information (the driver’s licence number and taxi number) hasn’t been anonymized properly — what’s worse, it’s trivial to undo, and with other publicly available data, one can even figure out which person drove each trip. In the rest of this post, I’ll describe the structure of the data, what the person/people who released the data did wrong, how easy it is to deanonymize, and the lessons other agencies should learn from this. (And yes, I’ll also explain how rainbows fit in).

I mention this because you may be interested in the data in large chunks or small chunks.

The other reason to mention this data set is the concern over “proper” anonymization of the data. As if failing to do that resulted in a loss of privacy for the drivers.
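Vijay’s post explains why the anonymization was trivial to undo: the identifiers were hashed with unsalted MD5, and the space of valid medallion and licence numbers is so small that every candidate can be hashed ahead of time and matched instantly (the “rainbows” of the title). A minimal sketch of the attack (the four-character format below is a simplified stand-in, not the real NYC medallion format):

```python
import hashlib
from itertools import product
from string import ascii_uppercase, digits

def md5_hex(s: str) -> str:
    return hashlib.md5(s.encode()).hexdigest()

# Simplified stand-in format: digit, letter, digit, digit (e.g. "5A38").
# That is only 10 * 26 * 10 * 10 = 26,000 possibilities; the real
# medallion and licence formats are similarly tiny, which is the point.
def all_medallions():
    for d1, a, d2, d3 in product(digits, ascii_uppercase, digits, digits):
        yield f"{d1}{a}{d2}{d3}"

# Precompute a complete hash -> plaintext table. Against an unsalted
# hash over a small space, "anonymization" is just a lookup away.
lookup = {md5_hex(m): m for m in all_medallions()}

anonymized = md5_hex("5A38")   # what the released data would contain
print(lookup[anonymized])      # recovers "5A38" instantly
```

A keyed hash, or better, random per-driver tokens, would have blocked this kind of lookup.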

I see no loss of privacy for the drivers.

I say that because the New York City Taxi and Limousine Commission already had the data. The question was: Will members of the public have access to the same data? Whatever privacy a taxi driver had was breached when the data went to the NYC Taxi and Limousine Commission.

That’s an important distinction. “Privacy” will be a regular stick the government trots out to defend its possessing data and not sharing it with you.

The government has no real interest in your privacy. Witness the rogue intelligence agencies in Washington if you have any doubts on that issue. The government wants to conceal your information, which it gained by fair and/or foul methods, from both you and the rest of us.

Why? I don’t know with any certainty. But based on my observations in both the “real world” and academia, most of it stems from “I know something you don’t,” and that makes them feel important.

I can’t imagine any sadder basis for feeling important. The NSA could print out a million pages of its most secret files and stack them outside my office. I doubt I would be curious enough to turn over the first page.

The history of speculation, petty office rivalries, snide remarks about foreign government officials, etc. are of no interest to me. I already assumed they were spying on everyone so having “proof” of that is hardly a big whoop.

But we should not be deterred by calls for privacy as we force government to disgorge data it has collected, including that of the NSA. Perhaps even licensing chunks of the NSA data for use in spy novels. That offers some potential for return on the investment in the NSA.

June 5, 2014

New SSL Issues

Filed under: Cybersecurity,NSA,Privacy,Security — Patrick Durusau @ 2:35 pm

OpenSSL Security Advisory [05 Jun 2014]

Seven new SSL bugs have been documented. See the advisory for details.

Given how insecure the Net is at present, I have to wonder how effective Reset The Net will be at stopping mass surveillance.

I agree with ending mass surveillance but mostly because storing all that data is contractor waste.

I first saw this in a tweet by Nick Sullivan.

May 14, 2014

Feathers, Gossip and the European Union Court of Justice (ECJ)

Filed under: EU,Privacy,Search Engines — Patrick Durusau @ 2:52 pm

It is a common comment that the United States Supreme Court has difficulty with technology issues. Not terribly surprising since digital technology evolves several orders of magnitude faster than legal codes and customs.

But even if judicial digital illiteracy isn’t surprising, judicial theological illiteracy should be.

I am referring, of course, to the recent opinion by the European Court of Justice that there is a right to be “forgotten” in the records of the search giant Google.

In the informal press release about its decision, the ECJ states:

Finally, in response to the question whether the directive enables the data subject to request that links to web pages be removed from such a list of results on the grounds that he wishes the information appearing on those pages relating to him personally to be ‘forgotten’ after a certain time, the Court holds that, if it is found, following a request by the data subject, that the inclusion of those links in the list is, at this point in time, incompatible with the directive, the links and information in the list of results must be erased. The Court observes in this regard that even initially lawful processing of accurate data may, in the course of time, become incompatible with the directive where, having regard to all the circumstances of the case, the data appear to be inadequate, irrelevant or no longer relevant, or excessive in relation to the purposes for which they were processed and in the light of the time that has elapsed. The Court adds that, when appraising such a request made by the data subject in order to oppose the processing carried out by the operator of a search engine, it should in particular be examined whether the data subject has a right that the information in question relating to him personally should, at this point in time, no longer be linked to his name by a list of results that is displayed following a search made on the basis of his name. If that is the case, the links to web pages containing that information must be removed from that list of results, unless there are particular reasons, such as the role played by the data subject in public life, justifying a preponderant interest of the public in having access to the information when such a search is made. (The press release version, The official judgement).

Which doesn’t sound unreasonable, particularly if you are a theological illiterate.

One contemporary retelling of a story about St. Philip Neri goes as follows:

The story is often told of the most unusual penance St. Philip Neri assigned to a woman for her sin of spreading gossip. The sixteenth-century saint instructed her to take a feather pillow to the top of the church bell tower, rip it open, and let the wind blow all the feathers away. This probably was not the kind of penance this woman, or any of us, would have been used to!

But the penance didn’t end there. Philip Neri gave her a second and more difficult task. He told her to come down from the bell tower and collect all the feathers that had been scattered throughout the town. The poor lady, of course, could not do it-and that was the point Philip Neri was trying to make in order to underscore the destructive nature of gossip. When we detract from others in our speech, our malicious words are scattered abroad and cannot be gathered back. They continue to dishonor and divide many days, months, and years after we speak them as they linger in people’s minds and pass from one tale-bearer to the next. (From The Feathers of Gossip: How our Words can Build Up or Tear Down by Edward P. Sri)*

The problem with “forgetting” is the same one as the gossip penitent. Information is copied and replicated by sites for their own purposes. Nothing Google can do will impact those copies. Even if Google removes all of its references from a particular source, the information could be re-indexed in the future from new sources.

This decision is a “feel good” one for privacy advocates. But the ECJ should have recognized the gossip folktale parallel and decided that effective relief is impossible. Ordering an impossible solution diminishes the stature of the court and the seriousness with which its decisions are regarded.

Not to mention the burden this will place on Google and other search result providers, with no guarantee that the efforts will be successful.

Sometimes the best solution is to simply do nothing at all.

* There isn’t a canonical form for this folktale, which has been told and re-told by many cultures.

March 23, 2014

New Book on Data and Power

Filed under: Data,Government,NSA,Privacy,Security — Patrick Durusau @ 6:23 pm

New Book on Data and Power by Bruce Schneier.

From the post:

I’m writing a new book, with the tentative title of Data and Power.

While it’s obvious that the proliferation of data affects power, it’s less clear how it does so. Corporations are collecting vast dossiers on our activities on- and off-line — initially to personalize marketing efforts, but increasingly to control their customer relationships. Governments are using surveillance, censorship, and propaganda — both to protect us from harm and to protect their own power. Distributed groups — socially motivated hackers, political dissidents, criminals, communities of interest — are using the Internet to both organize and effect change. And we as individuals are becoming both more powerful and less powerful. We can’t evade surveillance, but we can post videos of police atrocities online, bypassing censors and informing the world. How long we’ll still have those capabilities is unclear.

Understanding these trends involves understanding data. Data is generated by all computing processes. Most of it used to be thrown away, but declines in the prices of both storage and processing mean that more and more of it is now saved and used. Who saves the data, and how they use it, is a matter of extreme consequence, and will continue to be for the coming decades.

Data and Power examines these trends and more. The book looks at the proliferation and accessibility of data, and how it has enabled constant surveillance of our entire society. It examines how governments and corporations use that surveillance data, as well as how they control data for censorship and propaganda. The book then explores how data has empowered individuals and less-traditional power blocs, and how the interplay among all of these types of power will evolve in the future. It discusses technical controls on power, and the limitations of those controls. And finally, the book describes solutions to balance power in the future — both general principles for society as a whole, and specific near-term changes in technology, business, laws, and social norms.
….

Bruce says a table of contents should appear in “a couple of months” and he is going to be asking “for volunteers to read and comment on a draft version.”

I assume from the description that Bruce is going to try to connect a fairly large number of dots.

Such as who benefits from the Code of Federal Regulations (CFRs) not having an index? The elimination of easier access to the CFRs is a power move. Someone with a great deal of power wants to eliminate the chance of someone gaining power from following information in the CFRs.

I am not a conspiracy theorist but there are only two classes of people in any society, people with more power than you and people with less. Every sentient person wants to have more and no one will voluntarily take less. Among chickens they call it the “pecking order.”

In human society, the “pecking order” is enforced by uncoordinated and largely unconscious following of cultural norms. No conspiracy, just the way we are. But there are cases, the CFR indexes being one of them, where someone is clearly trying to disadvantage others. Who and for what reasons remains unknown.

March 5, 2014

Welcoming Bankers/Lawyers/CEOs to the Goldfish Bowl

Filed under: Privacy,Security — Patrick Durusau @ 8:07 pm

A vast hidden surveillance network runs across America, powered by the repo industry by Shawn Musgrave.

From the post:

Few notice the “spotter car” from Manny Sousa’s repo company as it scours Massachusetts parking lots, looking for vehicles whose owners have defaulted on their loans. Sousa’s unmarked car is part of a technological revolution that goes well beyond the repossession business, transforming any industry that wants to check on the whereabouts of ordinary people.

An automated reader attached to the spotter car takes a picture of every license plate it passes and sends it to a company in Texas that already has more than 1.8 billion plate scans from vehicles across the country.

These scans mean big money for Sousa — typically $200 to $400 every time the spotter finds a vehicle that’s stolen or in default — so he runs his spotter around the clock, typically adding 8,000 plate scans to the database in Texas each day.

“Honestly, we’ve found random apartment complexes and shopping plazas that are sweet spots” where the company can impound multiple vehicles, explains Sousa, the president of New England Associates Inc. in Bridgewater.

But the most significant impact of Sousa’s business is far bigger than locating cars whose owners have defaulted on loans: It is the growing database of snapshots showing where Americans were at specific times, information that everyone from private detectives to insurers are willing to pay for.

Shawn does a great job detailing how pervasive auto-surveillance is in the United States. Bad enough for repossession but your car’s location could be used as evidence of your location as well.

I suppose as compensation for lenders and repossession companies taking photos of license plates, ordinary people could follow lender and repossession employees around and do the same thing.

While thinking about that possibility, it occurred to me that the general public could do them one better.

When you see a banker, lawyer, CEO at a party, school event, church service, restaurant, etc., use your cellphone to snap their picture. Tag everyone you recognize in the photo.

If enough people take enough photographs, there will be a geo-location and time record of their whereabouts for more and more of every day.

Thinking we need photos of elected officials, their immediate staffs and say the top 10% of your locality in terms of economic status.

It won’t take long before those images, perhaps even your images, will become quite important.

Maybe this could be the start of the intell market described in Snow Crash.

Some people may buy your intell to use it, others may buy it to suppress it.

I first saw this in a tweet by Tim O’Reilly.

February 11, 2014

Is 11 Feb 2014 The Day We Fight Back?

Filed under: Cryptography,Cybersecurity,NSA,Privacy,Security — Patrick Durusau @ 11:31 am

Is 11 Feb 2014 The Day We Fight Back? by Mark Stockley.

From the post:

Appalled with government surveillance without oversight? Sick of having your privacy invaded? Numb from stories about the NSA? If you are, you’ll have had many more bad days than good since June 2013.

But today, just perhaps, could be one of the better ones.

Mark covers the general theme of protests quite well and then admits, ok, so people are protesting, now what?

Lacking a target like SOPA, there is no specific action to ask for or for anyone to take.

Or as Mark points out:

Who do we lobby to fix that situation [government surveillance] and how will we ever know if we have succeeded?

I put it to you that the government(s) being petitioned for privacy protection are the same ones that spied on you. Is there irony in that situation?

Is it a reflection on your gullibility that despite years of known lies, deceptions and rights violations, you are willing to trust the people responsible for the ongoing lies, deceptions and rights violations?

If you aren’t going to trust the government, if you aren’t going to protest, what does that leave?

Fighting back effectively.

Mark points out a number of efforts to secure the technical infrastructure of the Internet. Learn more about those, support them and even participate in them.

Among other efforts, consider the OASIS PKCS 11 TC:

The OASIS PKCS 11 Technical Committee develops enhancements to improve the PKCS #11 standard for ease of use in code libraries, open source applications, wrappers, and enterprise/COTS products: implementation guidelines, usage tutorials, test scenarios and test suites, interoperability testing, coordination of functional testing, development of conformance profiles, and providing reference implementations.

The updated standard provides additional support for mobile and cloud computing use cases: for distributed/federated applications involving key management functions (key generation, distribution, translation, escrow, re-keying); session-based models; virtual devices and virtual keystores; evolving wireless/sensor applications using near field communication (NFC), RFID, Bluetooth, and Wi-Fi.

TC members are also designing new mechanisms for API instrumentation, suitable for use in prototyping, profiling, and testing in resource-constrained application environments. These updates enable support for easy integration of PKCS #11 with other cryptographic key management system (CKMS) standards, including a broader range of cryptographic algorithms and CKMS cryptographic service models. (from the TC homepage)

Whatever security you have from government intrusion is going to come from you and others like you who create it.

Want to fight back today? Join one of the efforts that Mark lists or the OASIS PKCS 11 TC. Today!

January 14, 2014

Home Invasion by Google

Filed under: Data Integration,Privacy,Transparency — Patrick Durusau @ 2:58 pm

When Google closes the Nest deal, privacy issues for the internet of things will hit the big time by Stacey Higginbotham.

From the post:

Google rocked the smart home market Monday with its intention to purchase connected home thermostat maker Nest for $3.2 billion, which will force a much-needed conversation about data privacy and security for the internet of things.

It’s a conversation that has seemingly stalled as advocates for the connected home expound upon the benefits in convenience, energy efficiency and even the health of people who are collecting and connecting their data and devices together through a variety of gadgets and services. On the other side are hackers and security researchers who warn how easy some of the devices are to exploit — gaining control of data or even video streams about what’s going on in the home.

So far the government — in the form of the Federal Trade Commission — has been reluctant to make rules and is still gathering information. A security researcher told the FTC at a Nov. 19 event that companies should be fined for data breaches, which would encourage companies to design data protection into their products from the beginning. Needless to say, industry representatives were concerned that such an approach would “stifle innovation.” Even at CES an FTC commissioner expressed a similar sentiment — namely that the industry was too young for rules.

Stacey writes a bit further down:

Google’s race to gather data isn’t evil, but it could be a problem

My assumption is that Google intends to use the data it is racing to gather. Google may not know or foresee all the potential uses for the data it collects (sales to the NSA?) but it has been said that “data is the new oil.” But see Big Data Is Not the New Oil by Jer Thorp.

Think of Google as a successful data wildcatter. In the oil patch, wildcatting produced heirs wealthy enough to attempt to corner the world silver market.

Don’t be misled by Jer’s title, he means to decry the c-suite use of a phrase read on a newsstand cover. Later he writes:

Still, there are some ways in which the metaphor might be useful.

Perhaps the “data as oil” idea can foster some much-needed criticality. Our experience with oil has been fraught; fortunes made have been balanced with dwindling resources, bloody mercenary conflicts, and a terrifying climate crisis. If we are indeed making the first steps into economic terrain that will be as transformative (and possibly as risky) as that of the petroleum industry, foresight will be key. We have already seen “data spills” happen (when large amounts of personal data are inadvertently leaked). Will it be much longer until we see dangerous data drilling practices? Or until we start to see long term effects from “data pollution”?

An accurate account of our experience with oil, as far as it goes.

Unlike Jer, I see data continuing to follow the same path as oil, coal, timber, gold, silver, gemstones, etc.

I say continuing because scribes were the original data brokers. And enjoyed a privileged role in society. Printing reduced the power of scribes but new data brokers took their place. Libraries and universities and those they trained had more “data” than others. Specific examples of scientia potentia est (“knowledge is power”) are found in: The Information Master: Jean-Baptiste Colbert‘s Secret State Intelligence System (Louis XIV) and IBM and the Holocaust. (Not to forget the NSA.)

Information, or “data” if you prefer, has always been used to advance some interests and used against others. The electronic storage of data has reduced the cost of using data that was known to exist but was too expensive or inaccessible for use.

Consider marital history. For the most part, with enough manual effort and travel, a person’s marital history has been available for the last couple of centuries. Records are kept of marriages, divorces, etc. But accessing that information wasn’t a few strokes on a keyboard and perhaps an access fee. Same data, different cost of access.

Jer’s proposals, and others I have read, are all premised on people forgoing power, advantage, profit or other benefits from obtaining, analyzing and acting upon data.

I don’t know of any examples in history where that has happened.

Do you?

December 18, 2013

NSA & Connecting the Dots

Filed under: Cybersecurity,NSA,Privacy — Patrick Durusau @ 4:51 pm

A release of a review panel study of Surveillance U.S.A. (SUSA, a.k.a. U.S. intelligence activities) has been rumored on the Net for most of the day.

While we wait for a copy of the alleged study, consider this report by the Guardian:

On Wednesday, NSA director Keith Alexander, the army general who will retire in the spring after leading the agency for eight years, strongly defended the bulk collection of phone data as necessary to detect future domestic terrorist attacks. “There is no other way we know of to connect the dots,” Alexander told the Senate judiciary committee.

Mass telephone data collection because:

There is no other way we know of to connect the dots

If the General wasn’t just playing to the press, that is one key to why U.S. intelligence services are functioning so poorly.

[Image: streetlamp]

The light is better for connecting telephone dots together.

Connecting other dots, non-telephone dots, the ones that might effectively prevent terrorism, that might be hard.

Or in this case, none of the General’s contract buddies have a clue about connecting non-telephone dots.

Arguments to keep massive telephone surveillance:

  • Telephone dots are easy to connect (even if ineffectual).
  • Usual suspects profit from connecting telephone dots.
  • Usual suspects don’t know how to connect non-telephone dots.

From the General’s perspective, that’s a home run argument.

To me, that’s a three strikes and you are out argument.

There are lots of ways to connect non-telephone dots, effectively and in a timely manner.

It would not be as easy as telephone data but then it would be more effective as well.

You would have to know what sort of non-telephone information the NSA has in order to fashion a connect the non-telephone dot proposal.

Easy information (telephone call records) doesn’t equal useful information (dot-connecting information).

If your cause, organization, agency, department, government, government in waiting, is interested in non-telephone dot connecting advice, you know how to reach me.

PS: BTW, I work on a first come, first served basis.

December 16, 2013

Judge Wacks the NSA

Filed under: NSA,Privacy — Patrick Durusau @ 5:03 pm

Judge calls for phone data to be destroyed, says NSA program too broad by Jeff John Roberts.

From the post:

In a major rebuke to the National Security Agency’s mass collection of telephone data, a federal judge ruled that the agency’s surveillance program likely violates the Constitution and also granted two Verizon subscribers’ request for an order to destroy so-called meta-data.

On Monday in Washington, D.C., U.S. District Judge Richard Leon issued a ruling that “bars the Government from collecting … any telephony data” associated with the Verizon account of two citizens who filed the lawsuit, and “requires the Government to destroy any such metadata in its possession that was collected through the bulk collection program.”

….

The judge also rejected the argument that the existence of a secret NSA court, known as the FISA court, precluded him from reviewing the surveillance program for constitutional questions.

“While Congress has great latitude to create statutory scheme like FISA, it may not hang a cloak of secrecy over the Constitution,” he wrote as part of the 68 page ruling.

See the decision at: Klayman NSA Decision and more at: Politico.

Good news but note the judge only ordered the destruction of records for two subscribers. And even that is stayed on appeal. Like they would really destroy the data anyway. How would you know?

Take this as a temporary victory.

Celebrate, yes, but regroup tomorrow to continue the fight.

The Real Privacy Problem

Filed under: BigData,Privacy — Patrick Durusau @ 4:50 pm

The Real Privacy Problem by Evgeny Morozov.

A deeply provocative essay that has me re-considering my personal position on privacy.

Not about my personal privacy.

A more general concern that the loss of privacy will lead to less and less transparency and accountability of corporations and governments.

Consider this passage from the essay:

If you think Simitis was describing a future that never came to pass, consider a recent paper on the transparency of automated prediction systems by Tal Zarsky, one of the world’s leading experts on the politics and ethics of data mining. He notes that “data mining might point to individuals and events, indicating elevated risk, without telling us why they were selected.” As it happens, the degree of interpretability is one of the most consequential policy decisions to be made in designing data-mining systems. Zarsky sees vast implications for democracy here:

A non-interpretable process might follow from a data-mining analysis which is not explainable in human language. Here, the software makes its selection decisions based upon multiple variables (even thousands) … It would be difficult for the government to provide a detailed response when asked why an individual was singled out to receive differentiated treatment by an automated recommendation system. The most the government could say is that this is what the algorithm found based on previous cases.

This is the future we are sleepwalking into. Everything seems to work, and things might even be getting better—it’s just that we don’t know exactly why or how.

Doesn’t that sound like the circumstances we find with the NSA telephone surveillance? No one denies that they broke the law, lied to Congress about it, etc. but they claim to have protected the U.S. public.

Really? And where is that information? Oh, some of it was shown to a small group of selected Senators and they thought some unspecified part of it looked ok, maybe.

I don’t know about you but that doesn’t sound like accountability or transparency to me.

Moreover, the debate doesn’t even start in the right place. Violation of our telephone privacy is already a crime.

The NSA leadership and staff should be in the criminal dock when the questioning starts, not a hearing room on Capitol Hill.

Moreover, “good faith” is not a defense to criminal conduct in the law. It really doesn’t matter that you thought your dog was telling you to protect us from terrorists by engaging in widespread criminal activity. Even if you thought your dog was speaking for the Deity.

If there is no accountability and/or transparency on the part of government/corporations, there is a driving desire to make citizens completely transparent and accountable to both government and corporations:

Habits, activities, and preferences are compiled, registered, and retrieved to facilitate better adjustment, not to improve the individual’s capacity to act and to decide. Whatever the original incentive for computerization may have been, processing increasingly appears as the ideal means to adapt an individual to a predetermined, standardized behavior that aims at the highest possible degree of compliance with the model patient, consumer, taxpayer, employee, or citizen.

What Simitis is describing here is the construction of what I call “invisible barbed wire” around our intellectual and social lives. Big data, with its many interconnected databases that feed on information and algorithms of dubious provenance, imposes severe constraints on how we mature politically and socially. The German philosopher Jürgen Habermas was right to warn—in 1963—that “an exclusively technical civilization … is threatened … by the splitting of human beings into two classes—the social engineers and the inmates of closed social institutions.”

The more data both the government and corporations collect, the less accountability and transparency they have and the more accountability and transparency they want to impose on the average citizen.

A very good reason why putting users in control of their data is a non-answer to the privacy question. Enabling users to “sell” their data just gives them the illusion of a choice when their choices are in fact dwindling with each bit of data that is collected.

All hope is not lost, see Morozov’s essay for some imaginative thinking on how to deepen and broaden the debate over privacy.

Some of the questions I would urge people to raise are:

  • Should websites be allowed to collect tracking data at all?
  • Should domestic phone traffic be tracked other than for billing and then discarded (hourly)?
  • Should credit card companies be allowed to keep purchase histories more than 30 days old?

In terms of slogans, consider this one: Data = Less Freedom. (D=LF)

December 14, 2013

Information Data Exchanges

Filed under: Privacy,Security — Patrick Durusau @ 4:47 pm

For Second Year in a Row, Markey Investigation Reveals More Than One Million Requests By Law Enforcement for Americans’ Mobile Phone Data by Sen. Edward Markey.

From the post:

As part of his ongoing investigation into wireless surveillance of Americans by law enforcement, Senator Edward J. Markey (D-Mass.) today released responses from eight major wireless carriers that reveals expanded use of wireless surveillance of Americans, including more than one million requests for the personal mobile phone data of Americans in 2012 by law enforcement. This total may well represent tens or hundreds of thousands more actual individuals due to the law enforcement practice of requesting so-called “cell phone tower dumps” in which carriers provide all the phone numbers of mobile phone users that connect with a tower during a specific period of time. Senator Markey began his investigation last year, revealing 1.3 million requests in 2011 for wireless data by federal, state, and local law enforcement. In this year’s request for information, Senator Markey expanded his inquiry to include information about emergency requests for information, data retention policies, what legal standard –whether a warrant or a lower standard — is used for each type of information request, and the costs for fulfilling requests. The responses received by Senator Markey reveal surveillance startling in both volume and scope.

If you think the telco’s are donating your data, think again.

Sen. Markey reports that in 2012:

  • AT&T received $10 million
  • T-Mobile received $11 million
  • Verizon less than $5 million

Does that make you wonder how much Google, Microsoft and others got paid for their assistance?

If the top technology companies are going to profit from a police state, why shouldn’t the average citizen?

If you find evidence of stock or wire fraud, there should be a series of information data exchanges where both the government and the parties in question can bid for your information.

It would create an incentive system for common folks to start looking for and collecting information on criminal wrongdoing.

Not to mention that it would create competition to ensure the holders of such information get a fair price.

Before you protest too much, remember the financial industry and others are selling your data right now, today.

Turn about seems like fair play to me.

December 10, 2013

Reverse Entity Recognition? (Scrubbing)

Filed under: Entities,Entity Resolution,Privacy — Patrick Durusau @ 12:51 pm

Improving privacy with language technologies by Rob Munro.

From the post:

One downside of the kind of technologies that we build at Idibon is that they can be used to compromise people’s privacy and, by extension, their safety. Any technology can be used for positive and negative purposes and as engineers we have a responsibility to ensure that what we create is for a better world.

For language technologies, the most negative application, by far, is eavesdropping: discovering information about people by monitoring their online communications and using that information in ways that harm the individuals. This can be something as direct and targeted as exposing the identities of at-risk individuals in a war-zone or it can be the broad expansion of government surveillance. The engineers at many technology companies announced their opposition to the latter with a loud, unified call today to reform government surveillance.

One way that privacy can be compromised at scale is the use of technology known as “named entity recognition”, which identifies the names of people, places, organizations, and other types of real-world entities in text. Given millions of sentences of text, named entity recognition can extract the names and addresses of everybody in the data in just a few seconds. But the same technology that can be used to uncover personally identifying information (PII) can also be used to remove the personally identifying information from the text. This is known as anonymizing or simply “scrubbing”.

Rob agrees that entity recognition can invade your personal privacy, but points out it can also protect your privacy.

You may think your “handle” on one or more networks provides privacy, but it would take surprisingly little data to connect that handle to your real identity.

Entity recognition software can scrub data to remove “tells” that may identify you from it.

How much scrubbing is necessary depends on the data and the consequences of discovery.

Entity recognition is usually thought of as recognizing names and places, but content analysis can just as easily recognize a particular author.

That would require more sophisticated “scrubbing” than entity recognition can support.
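As a rough illustration of what scrubbing does, here is a toy Python sketch that swaps regular expressions in for real statistical entity recognition. The patterns and labels are mine, invented for illustration; they are nowhere near a production PII detector:

```python
import re

# Toy "scrubber": regexes stand in for the patterns a real named
# entity recognizer would learn statistically. Illustrative only.
PATTERNS = {
    "PHONE": re.compile(r"\b\d{3}-\d{3}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "SSN":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scrub(text):
    """Replace each matched entity with a bracketed type label."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text
```

The same match-and-replace loop that redacts PII is, of course, exactly the loop that extracts it, which is Rob’s point.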

October 2, 2013

No Free Speech for Tech Firms?

Filed under: Law,NSA,Privacy,Security — Patrick Durusau @ 4:16 pm

I stumbled across Tech firms’ release of PRISM data will harm security — new U.S. and FBI court filings by Jeff John Roberts today.

From the post:

The Department of Justice, in long-awaited court filings that have just been released, urged America’s secret spy court to reject a plea by five major tech companies to disclose data about how often the government asks for user information under a controversial surveillance program aimed at foreign suspects.

The filings, which appeared on Wednesday, claimed that the tech companies – Google, Microsoft, Facebook, LinkedIn and Yahoo — do not have a First Amendment right to disclose how many Foreign Intelligence Surveillance Act requests they receive.

“Adversaries may alter their behavior by switching to service that the Government is not intercepting,” said the filings, which are heavily blacked out and cite Edward Snowden, a former NSA contractor. Snowden has caused an ongoing stir by leaking documents about a U.S. government program known as PRISM that vacuums up meta-data from the technology firms.

I thought we had settled the First Amendment for corporations back in Citizens United v. FEC.

Justice Kennedy, writing for the majority said:

The censorship we now confront is vast in its reach. The Government has “muffle[d] the voices that best represent the most significant segments of the economy.” Mc Connell, supra, at 257–258 (opinion of Scalia, J.). And “the electorate [has been] deprived of information, knowledge and opinion vital to its function.” CIO, 335 U. S., at 144 (Rutledge, J., concurring in result). By suppressing the speech of manifold corporations, both for-profit and nonprofit, the Government prevents their voices and viewpoints from reaching the public and advising voters on which persons or entities are hostile to their interests. Factions will necessarily form in our Republic, but the remedy of “destroying the liberty” of some factions is “worse than the disease.” The Federalist No. 10, p. 130 (B. Wright ed. 1961) (J. Madison). Factions should be checked by permitting them all to speak, see ibid., and by entrusting the people to judge what is true and what is false.

The purpose and effect of this law is to prevent corporations, including small and nonprofit corporations, from presenting both facts and opinions to the public. This makes Austin ’s antidistortion rationale all the more an aberration. “[T]he
First Amendment protects the right of corporations to petition legislative and administrative bodies.” Bellotti, 435 U. S., at 792, n. 31 (citing California Motor Transport Co. v. Trucking Unlimited, 404 U. S. 508, 510–511 (1972); Eastern Railroad Presidents Conference v. Noerr Motor Freight, Inc., 365 U. S. 127, 137–138 (1961)). Corporate executives and employees counsel Members of Congress and Presidential administrations on many issues, as a matter of routine and often in private. An amici brief filed on behalf of Montana and 25 other States notes that lobbying and corporate communications with elected officials occur on a regular basis. Brief for State of Montana et al. as Amici Curiae 19. When that phenomenon is coupled with §441b, the result is that smaller or nonprofit corporations cannot raise a voice to object when other corporations, including those with vast wealth, are cooperating with the Government. That cooperation may sometimes be voluntary, or it may be at the demand of a Government official who uses his or her authority, influence, and power to threaten corporations to support the Government’s policies. Those kinds of interactions are often unknown and unseen. The speech that §441b forbids, though, is public, and all can judge its content and purpose. References to massive corporate treasuries should not mask the real operation of this law. Rhetoric ought not obscure reality.

I admit that Citizens United v. FEC was about corporations buying elections but Justice Department censorship in this case is even worse.

Censorship in this case strikes at trust in products and services from Google, Microsoft, Facebook, LinkedIn, Yahoo and Dropbox.

And it prevents consumers from making their own choices about who or what to trust.

Google, Microsoft, Facebook, LinkedIn, Yahoo and Dropbox should publish all the details of FISA requests.

Trust your customers/citizens to make the right choice.

PS: If Fortune 50 companies don’t have free speech, what do you think you have?

June 10, 2013

Why Theoretical Computer Scientists Aren’t Worried About Privacy

Filed under: Cryptography,NSA,Privacy,Security — Patrick Durusau @ 1:33 pm

Why Theoretical Computer Scientists Aren’t Worried About Privacy by Jeremy Kun.

From the post:

There has been a lot of news recently on government surveillance of its citizens. The biggest two that have pervaded my news feeds are the protests in Turkey, which in particular have resulted in particular oppression of social media users, and the recent light on the US National Security Agency’s widespread “backdoor” in industry databases at Google, Verizon, Facebook, and others. It appears that the facts are in flux, as some companies have denied their involvement in this program, but regardless of the truth the eye of the public has landed firmly on questions of privacy.

Barack Obama weighed in on the controversy as well, being quoted as saying,

You can’t have 100% security and 100% privacy, and also zero inconvenience.

I don’t know what balance the US government hopes to strike, but what I do know is that privacy and convenience are technologically possible, and we need not relinquish security to attain it.

Before I elaborate, let me get my personal beliefs out of the way. I consider the threat of terrorism low compared to the hundreds of other ways I can die. I should know, as I personally have been within an ε fraction of my life for all ε > 0 (when I was seven I was hit by a bus, proclaimed dead, and revived). So I take traffic security much more seriously than terrorism, and the usual statistics will back me up in claiming one would be irrational to do otherwise. On the other hand, I also believe that I only need so much privacy. So I don’t mind making much of my personal information public, and I opt in to every one of Google’s tracking services in the hopes that my user experience can be improved. Indeed it has, as services like Google Now will, e.g., track my favorite bands for me based on my Google Play listening and purchasing habits, and alert me when there are concerts in my area. If only it could go one step further and alert me of trending topics in theoretical computer science! I have much more utility for timely knowledge of these sorts of things than I do for the privacy of my Facebook posts. Of course, ideologically I’m against violating privacy as a matter of policy, but this is a different matter. One can personally loathe a specific genre of music and still recognize its value and one’s right to enjoy it.

But putting my personal beliefs aside, I want to make it clear that there is no technological barrier to maintaining privacy and utility. This may sound shocking, but it rings true to the theoretical computer scientist. Researchers in cryptography have experienced this feeling many times, that their wildest cryptographic dreams are not only possible but feasible! Public-key encryption and digital signatures, secret sharing on a public channel, zero-knowledge verification, and many other protocols have been realized quite soon after being imagined. There are still some engineering barriers to implementing these technologies efficiently in large-scale systems, but with demand and a few years of focused work there is nothing stopping them from being used by the public. I want to use this short post to describe two of the more recent ideas that have pervaded the crypto community and provide references for further reading.

Jeremy injects a note of technical competence into the debate over privacy and security in the wake of NSA disclosures.

Not that our clueless representatives in government, greedy bidders or turf building agencies will pick up on this line of discussion.
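Of the protocols Jeremy lists, secret sharing is perhaps the easiest to demystify. Here is a minimal 2-of-2 sketch in Python, where XOR over bytes stands in for the field arithmetic of the real k-of-n schemes (e.g. Shamir’s); it is a toy, not a vetted implementation:

```python
import secrets

# Toy 2-of-2 secret sharing: split a message into two random-looking
# shares. Either share alone is statistically indistinguishable from
# noise; XOR of both recovers the message.
def split(message: bytes):
    pad = secrets.token_bytes(len(message))            # one share is pure randomness
    other = bytes(m ^ p for m, p in zip(message, pad)) # the other is message XOR pad
    return pad, other

def combine(share_a: bytes, share_b: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share_a, share_b))
```

That a scheme this simple gives information-theoretic secrecy is the kind of result Jeremy means when he says the barriers are engineering, not theory.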

The purpose of the NSA program is what the Republicans call a “transfer of wealth.” In this case from the government to select private contractors.

How much is being transferred isn’t known. If we knew the amount of the transfer and that the program it funds is almost wholly ineffectual, we might object to our representatives.

Some constitutional law scholars (Obama) have forgotten that informed participation by voters in public debate is a keystone of the U.S. Constitution.

June 6, 2013

Who’s Afraid of the NSA?

Filed under: Cypher,Government,Privacy,Security — Patrick Durusau @ 1:17 pm

To Catch a Cyber-Thief

From the post:

When local police came calling with child porn allegations last January, former Saint John city councillor Donnie Snook fled his house clutching a laptop. It was clear that the computer contained damning data. Six months later, police have finally gathered enough evidence to land him in jail for a long time to come.

With a case seemingly so cut and dry, why the lag time? Couldn’t the police do a simple search for the incriminating info and level charges ASAP? Easier said than done. With computing devices storing terabytes of personal data, it can take months before enough evidence can be cobbled together from reams of documents, emails, chat logs and text messages.

That’s all about to change thanks to a new technique developed by researchers at Concordia University, who have slashed the data-crunching time. What once took months now takes minutes.

Gaby Dagher and Benjamin Fung, researchers with the Concordia Institute for Information Systems Engineering, will soon publish their findings in Data & Knowledge Engineering. Law enforcement officers are already putting this research to work through Concordia’s partnership with Canada’s National Cyber-Forensics and Training Alliance, in which law enforcement organizations, private companies, and academic institutions work together to share information to stop emerging cyber threats and mitigate existing ones.

Thanks to Dagher and Fung, crime investigators can now extract hidden knowledge from a large volume of text. The researchers’ new methods automatically identify the criminal topics discussed in the textual conversation, show which participants are most active with respect to the identified criminal topics, and then provide a visualization of the social networks among the participants.

Dagher, who is a PhD candidate supervised by Fung, explains “the huge increase in cybercrimes over the past decade boosted demand for special forensic tools that let investigators look for evidence on a suspect’s computer by analyzing stored text. Our new technique allows an investigator to cluster documents by producing overlapping groups, each corresponding to a specific subject defined by the investigator.”
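The overlapping clusters Dagher describes can be sketched in a few lines, assuming the investigator-defined subjects are simply keyword sets. The real system uses statistical methods over the text; the subject names and documents below are invented for illustration:

```python
# Toy overlapping clustering: each document joins every cluster whose
# investigator-defined keywords it mentions, so a document can appear
# in more than one group -- the "overlapping" property Dagher describes.
def cluster(documents, subjects):
    clusters = {name: [] for name in subjects}
    for i, doc in enumerate(documents):
        words = set(doc.lower().split())
        for name, keywords in subjects.items():
            if words & keywords:           # any shared keyword puts the doc in the group
                clusters[name].append(i)
    return clusters
```

A document mentioning both money and a meeting lands in both the “finance” and “meetings” groups, which is exactly the behavior an investigator wants from overlapping subjects.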

Have you heard about clustering documents? Searching large volumes of text? Producing visualizations of social networks?

The threat of government snooping on its citizens should be evaluated in light of the government’s demonstrated competence.

The FBI wants special backdoors (like it has for telecommunications) just to monitor IP traffic. (Going Bright… [Hack Shopping Mall?])

It would help the FBI if they had our secret PGP keys.

There’s a thought: maybe we should all generate new PGP keys and send the resulting secret keys to the FBI.

They may not ever intercept any traffic encrypted with those keys but they can get funding from Congress to maintain an archive of them and to run them against all IP traffic.

The NSA probably has better chops when it comes to data collection but identity mining?

Identity mining is something completely different.

(See: The NSA Verizon Collection Coming on DVD)

The NSA Verizon Collection Coming on DVD

Filed under: Cybersecurity,Government,Privacy,Security — Patrick Durusau @ 1:15 pm

Don’t you wish! 😉

Sadly, U.S. citizens have to rely on the foreign press (NSA collecting phone records of millions of Verizon customers daily) for even minimal transparency about our own government.

According to the post:

Under the terms of the blanket order, the numbers of both parties on a call are handed over, as is location data, call duration, unique identifiers, and the time and duration of all calls. The contents of the conversation itself are not covered.

The order expires July 19, 2013. One response is to get a Verizon account and set up a war-games dialer to make 1-800 calls between now and July 19, 2013.

The other response is to think about the subject identity management issues with the Verizon data.

Bare Verizon Data

Let’s see, you get: “numbers of both parties on a call, location data, call duration, unique identifiers, and the time and duration of all calls.”

Not all that difficult to create networks of the calls based on the Verizon data, but that doesn’t get you the identity of the people making the calls.
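A sketch of how little it takes, assuming call records shaped like the fields the order lists (caller, callee, duration); all numbers below are invented:

```python
from collections import defaultdict

# Build a weighted call graph from (caller, callee, duration) records.
# Edge weight is total talk time between the two numbers.
def call_network(records):
    graph = defaultdict(lambda: defaultdict(int))
    for caller, callee, duration in records:
        graph[caller][callee] += duration
    return graph

records = [
    ("202-555-0101", "202-555-0199", 120),
    ("202-555-0101", "202-555-0199", 60),
    ("202-555-0150", "202-555-0101", 30),
]
net = call_network(records)
```

A dozen lines and you have a network; the hard part, as the rest of this post argues, is attaching people to the nodes.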

Identifying Individuals

What about matching the phone numbers to the major credit bureaus?

Where the FTC found:

A Federal Trade Commission study of the U.S. credit reporting industry found that five percent of consumers had errors on one of their three major credit reports that could lead to them paying more for products such as auto loans and insurance.

Overall, the congressionally mandated study on credit report accuracy found that one in five consumers had an error on at least one of their three credit reports.

In FTC Study, Five Percent of Consumers Had Errors on Their Credit Reports That Could Result in Less Favorable Terms for Loans

Bear in mind that the sources in credit reports have their own methods for identifying individuals, which are not exposed through the credit bureaus.

As I recall, credit reports don’t include political or social activity.

Identifying Social Networks

Assuming some rough set of names, it might be possible to match those names against Facebook and other social media sites. And then to map the relationships there back to the relationships in the original Verizon data.

The main problem being that every data set uses a different means to identify the same individuals and associations between individuals.

You and I may be friends on FaceBook, doing business together on LinkedIn and have cell phone conversations in the Verizon data, but the question will be mapping all of those together.

And remembering that all those systems are dynamic. Knowing my network of contacts six (6) weeks ago may or may not be useful now.

To be useful, the NSA will need to query along different identifications in different systems for the same person and have the results returned across all the systems.
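A sketch of that cross-system query problem, with every identifier invented for illustration: one person, three different system-local identifiers, and a query that has to fan out across whichever systems are available and merge the results.

```python
# Each system identifies the same person differently; a useful query
# must translate one person into per-system identifiers, query each
# system, and return the merged results. All identifiers are made up.
identity_map = {
    "person-42": {
        "verizon":  "202-555-0101",
        "facebook": "jdoe.1977",
        "linkedin": "john-doe-b2a4",
    },
}

def query_all(person, systems):
    """Fan the query out to every available system that knows this person."""
    ids = identity_map[person]
    return {name: systems[name](ids[name]) for name in ids if name in systems}
```

The mapping table is the whole game: without it, each system’s results stay siloed under a different name for the same person.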

Otherwise, Verizon will show a healthy profit after July 19, 2013 in fees for delivering electronic copies of data it already collects.

A “secret” court order was necessary to make the sale.

The NSA will have a data set just because it exists. “Big data” supports funding goals.

Verizon users? No less privacy than they had when only Verizon had the data.

June 3, 2013

The Banality of ‘Don’t Be Evil’

Filed under: Government,Government Data,Privacy — Patrick Durusau @ 2:00 pm

The Banality of ‘Don’t Be Evil’ by Julian Assange.

From the post:

“THE New Digital Age” is a startlingly clear and provocative blueprint for technocratic imperialism, from two of its leading witch doctors, Eric Schmidt and Jared Cohen, who construct a new idiom for United States global power in the 21st century. This idiom reflects the ever closer union between the State Department and Silicon Valley, as personified by Mr. Schmidt, the executive chairman of Google, and Mr. Cohen, a former adviser to Condoleezza Rice and Hillary Clinton who is now director of Google Ideas.

The authors met in occupied Baghdad in 2009, when the book was conceived. Strolling among the ruins, the two became excited that consumer technology was transforming a society flattened by United States military occupation. They decided the tech industry could be a powerful agent of American foreign policy.

The book proselytizes the role of technology in reshaping the world’s people and nations into likenesses of the world’s dominant superpower, whether they want to be reshaped or not. The prose is terse, the argument confident and the wisdom — banal. But this isn’t a book designed to be read. It is a major declaration designed to foster alliances.

“The New Digital Age” is, beyond anything else, an attempt by Google to position itself as America’s geopolitical visionary — the one company that can answer the question “Where should America go?” It is not surprising that a respectable cast of the world’s most famous warmongers has been trotted out to give its stamp of approval to this enticement to Western soft power. The acknowledgments give pride of place to Henry Kissinger, who along with Tony Blair and the former C.I.A. director Michael Hayden provided advance praise for the book.

In the book the authors happily take up the white geek’s burden. A liberal sprinkling of convenient, hypothetical dark-skinned worthies appear: Congolese fisherwomen, graphic designers in Botswana, anticorruption activists in San Salvador and illiterate Masai cattle herders in the Serengeti are all obediently summoned to demonstrate the progressive properties of Google phones jacked into the informational supply chain of the Western empire.

(…)

I am less concerned with privacy and more concerned with the impact of technological imperialism.

I see no good coming from the infliction of Western TV and movies on other cultures.

Or in making local farmers part of the global agriculture market.

Or infecting Iraq with sterile wheat seeds.

Compared to those results, privacy is a luxury of the bourgeois who worry about such issues.

I first saw this at Chris Blattman’s Links I liked.

April 10, 2013

Can Big Data From Cellphones Help Prevent Conflict? [Privacy?]

Filed under: BigData,Data Mining,Privacy — Patrick Durusau @ 10:54 am

Can Big Data From Cellphones Help Prevent Conflict? by Emmanuel Letouzé.

From the post:

Data from social media and Ushahidi-style crowdsourcing platforms have emerged as possible ways to leverage cellphones to prevent conflict. But in the world of Big Data, the amount of information generated from these is too small to use in advanced data-mining techniques and “machine-learning” techniques (where algorithms adjust themselves based on the data they receive).

But there is another way cellphones could be leveraged in conflict settings: through the various types of data passively generated every time a device is used. “Phones can know,” said Professor Alex “Sandy” Pentland, head of the Human Dynamics Laboratory and a prominent computational social scientist at MIT, in a Wall Street Journal article. He says data trails left behind by cellphone and credit card users—“digital breadcrumbs”—reflect actual behavior and can tell objective life stories, as opposed to what is found in social media data, where intents or feelings are obscured because they are “edited according to the standards of the day.”

The findings and implications of this, documented in several studies and press articles, are nothing short of mind-blowing. Take a few examples. It has been shown that it was possible to infer whether two people were talking about politics using cellphone data, with no knowledge of the actual content of their conversation. Changes in movement and communication patterns revealed in cellphone data were also found to be good predictors of getting the flu days before it was actually diagnosed, according to MIT research featured in the Wall Street Journal. Cellphone data were also used to reproduce census data, study human dynamics in slums, and for community-wide financial coping strategies in the aftermath of an earthquake or crisis.
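The "changes in movement and communication patterns" the excerpt mentions can be flagged with very simple statistics. As a minimal sketch (not the method used in the MIT research), here is a toy detector that marks days where a subscriber's activity deviates sharply from their own trailing baseline:

```python
from statistics import mean, stdev

def flag_pattern_change(daily_counts, window=7, threshold=2.0):
    """Flag days where activity deviates sharply from the trailing baseline.

    daily_counts: e.g. calls-per-day for one subscriber.
    A z-score beyond `threshold` marks a candidate behavior change --
    an illustrative stand-in for the "changes in communication patterns"
    the studies describe.
    """
    flagged = []
    for i in range(window, len(daily_counts)):
        baseline = daily_counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma == 0:
            continue
        z = (daily_counts[i] - mu) / sigma
        if abs(z) > threshold:
            flagged.append((i, round(z, 2)))
    return flagged

# A subscriber with a steady pattern, then a sudden drop in activity:
counts = [20, 22, 19, 21, 20, 23, 21, 20, 22, 5]
print(flag_pattern_change(counts))  # → [(9, -11.79)]
```

The point of the sketch is how little it takes: no content of any call is needed, only counts, which is exactly why "digital breadcrumbs" are so revealing.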

Very interesting post on the potential uses for cell phone data.

You can imagine what I think could be correlated with cellphone data using a topic map so I won’t bother to enumerate those possibilities.

I did want to comment on the concern about privacy, or "re-identification" as Emmanuel calls it in his post, for cellphone data.

Governments, who have declared they can execute any of us without notice or a hearing, are the guardians of that privacy.

That causes me to lack confidence in their guarantees.

Discussions of privacy should assume governments already have unfettered access to all data.

The useful questions become: How do we detect their misuse of such data? and How do we make them heartily sorry for that misuse?

For cell phone data, open access will give government officials more reason for pause than the ordinary citizen.

Less privacy for individuals but also less privacy for access, bribery, contract padding, influence peddling, and other normal functions of government.

In the U.S.A., we have given up our rights to public trial, probable cause, habeas corpus, protections against unreasonable search and seizure, to be free from touching by strangers, and several others.

What’s the loss of the right to privacy for cellphone data compared to catching government officials abusing their offices?

March 26, 2013

Our Internet Surveillance State [Intelligence Spam]

Filed under: Privacy,WWW — Patrick Durusau @ 3:21 pm

Our Internet Surveillance State by Bruce Schneier.

Nothing like a good rant to get your blood pumping during a cold snap! 😉

Bruce writes:

Maintaining privacy on the Internet is nearly impossible. If you forget even once to enable your protections, or click on the wrong link, or type the wrong thing, and you’ve permanently attached your name to whatever anonymous service you’re using. Monsegur slipped up once, and the FBI got him. If the director of the CIA can’t maintain his privacy on the Internet, we’ve got no hope.

In today’s world, governments and corporations are working together to keep things that way. Governments are happy to use the data corporations collect — occasionally demanding that they collect more and save it longer — to spy on us. And corporations are happy to buy data from governments. Together the powerful spy on the powerless, and they’re not going to give up their positions of power, despite what the people want.

And welcome to a world where all of this, and everything else that you do or is done on a computer, is saved, correlated, studied, passed around from company to company without your knowledge or consent; and where the government accesses it at will without a warrant.

Welcome to an Internet without privacy, and we’ve ended up here with hardly a fight.

I don’t disagree with anything Bruce writes but I do not counsel despair.

Nor would I suggest anyone stop using the "Internet, email, cell phones, web browser, social networking sites, search engines," in order to avoid spying.

But remember that one of the reasons U.S. intelligence services have fallen on hard times is the increased reliance on “easy” data to collect.

Clipping articles from newspapers, or now copying-and-pasting from emails and online zines, isn't the same as having culturally aware human resources on the ground.

“Easy” data collection is far cheaper, but also less effective.

My suggestion is that everyone go “bare” and load up all listeners with as much junk as humanly possible.

Intelligence “spam” as it were.

Routinely threaten to murder fictitious characters in books or conspire to kidnap them. Terror plots, threats against Alderaan, for example.

Apparently even absurd threats cannot be ignored (see 'One Definition of "Threat"').

A proliferation of fictional threats will leave them too little time to spy people going about their lawful activities.

BTW, not legal advice but I have heard that directly communicating any threat to any law enforcement agency is a crime. And not a good idea in any event.

Nor should you threaten any person or place or institution that isn’t entirely and provably fictional.

When someone who thinks mining social networking sites is a blow against terrorism overhears DC Comics characters being threatened, that should be enough.

October 23, 2012

[T]he [God]father of Google Glass?

Filed under: BigData,Marketing,Privacy — Patrick Durusau @ 9:32 am

The original title is 3 Big Data Insights from the Grandfather of Google Glass. The post describes MIT Media Lab Professor Alex ‘Sandy’ Pentland as the “Grandfather of Google Glass.”

Let’s review Pentland’s three points to see if my title is more appropriate:

1) Big Data is about people.

SP: Big Data is principally about people, it’s not about RFID tags and things like that. So that immediately raises questions about privacy and data ownership.

I mean, this looks like a nightmare scenario unless there’s something that means that people are more in charge of their data and it’s not something that can be used to spy on them. Fortunately as a consequence of this discussion group at the World Economic Forum, we now have the Consumer Privacy Bill of Rights which says you control data about you. It’s not the phone company, it’s not the ad company. And interestingly what that does is it means that the data is more available because it’s more legitimate. People feel safer about using it.

I feel so much better knowing about the “Consumer Privacy Bill of Rights.” Don’t you?

With secret courts, imprisonment without formal charges, and government-sanctioned murder and torture in the United States or at its behest, I'm sure my data won't be used against me.

You might want to read Leon Panetta Plays Chicken Little before you decide that the current administration, with its Consumer Privacy Bill of Rights has much concern for your privacy.

2) Cell phones are one of the biggest sources of Big Data. Smart phones are becoming universal remote controls.
….
Not so much in this country but in other parts of the world, your phone is the way you interface through the entire world. And so it’s also a window into what your choices are and what you do.

Having a single interface makes gathering intelligence a lot easier than hiring spies and collaborators.

Surveillance is cheaper in bulk quantities.

3) Big Data will be about moving past averages to understanding patterns at the individual level. Doing so will allow us to build a Periodic Table of human behavior.

SP: We’re moving past this sort of Enlightenment way of thinking in terms of markets and competition and big averages and asking, how can we make the information environment at the human level, at the individual level, work for everybody?

I see no signs of a lack of thinking in terms of markets and competition. Are Apple and Google competing? Are Microsoft and IBM competing? Are the various information gateways competing?

It is certainly the case that any of the aforementioned and others, would like to have everyone as a consumer.

Equality as a consumer for information service providers isn’t that interesting to me.

You?

The universal surveillance that Pentland foresees does offer opportunities for topic maps.

One is the testing of electronic identities tied to the universal interface, the cell phone.

For a fee, an electronic identity provider will build an electronic identity record tied to a cell phone with residential address, credit history, routine shopping entries, etc.

Topic maps can test how closely an identity matches other identities along a number of dimensions. (For seekers or hiders.)
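Testing how closely two identities match along several dimensions can be reduced to a weighted agreement score. A minimal sketch (the field names and weights are hypothetical, not from any topic map implementation):

```python
def identity_match(a, b, weights):
    """Score how closely two identity records agree, dimension by dimension.

    a, b: dicts of identity attributes (field names here are hypothetical).
    weights: relative importance of each dimension.
    Returns a score in [0, 1]; exact agreement on every weighted
    dimension yields 1.0.
    """
    total = sum(weights.values())
    score = 0.0
    for field, weight in weights.items():
        if a.get(field) is not None and a.get(field) == b.get(field):
            score += weight
    return score / total

claimed = {"address": "12 Elm St", "carrier": "AcmeTel", "zip": "30303"}
observed = {"address": "12 Elm St", "carrier": "OtherTel", "zip": "30303"}
weights = {"address": 3, "carrier": 1, "zip": 1}

print(identity_match(claimed, observed, weights))  # → 0.8
```

A seeker would look for scores close to 1.0; a hider would want every purchased identity to score high against its cover story and low against the real one.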

The quoted post is by Conor Myhrvold and David Feinleib.

I first saw this at KDNuggets.

September 25, 2012

The Man Behind the Curtain

Filed under: Intelligence,Privacy — Patrick Durusau @ 2:33 pm

The Man Behind the Curtain

From the post:

Without any lead-in whatsoever, we just ask that you watch the video above.

And we ask that you hang on for a few moments—this goes far beyond the hocus pocus you’re thinking the clip contains.

You really need to see this video.

Then answer:

Should watchers watch themselves?

Should people watch the watchers?

March 17, 2012

Lifebrowser: Data mining gets (really) personal at Microsoft

Filed under: Data Mining,Microsoft,Privacy — Patrick Durusau @ 8:20 pm

Lifebrowser: Data mining gets (really) personal at Microsoft

Nancy Owano writes:

Microsoft Research is doing research on software that could bring you your own personal data mining center with a touch of Proust for returns. In a recent video, Microsoft scientist Eric Horvitz demonstrated the Lifebrowser, which is prototype software that helps put your digital life in meaningful shape. The software uses machine learning to help a user place life events, which may span months or years, to be expanded or contracted selectively, in better context.

Navigating the large stores of personal information on a user’s computer, the program goes through the piles of personal data, including photos, emails and calendar dates. A search feature can pull up landmark events on a certain topic. Filtering the data, the software calls up memory landmarks and provides a timeline interface. Lifebrowser’s timeline shows items that the user can associate with “landmark” events with the use of artificial intelligence algorithms.

A calendar crawler, working with Microsoft Outlook extracts various properties from calendar events, such as location, organizer, and relationships between participants. The system then applies Bayesian machine learning and reasoning to derive atypical features from events that make them memorable. Images help human memory, and an image crawler analyzes a photo library. By associating an email with a relevant calendar date with a relevant document and photos, significance is gleaned from personal life events. With a timeline in place, a user can zoom in on details of the timeline around landmarks with a “volume control” or search across the full body of information.
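The article says Lifebrowser applies Bayesian machine learning to find the atypical events that make good "memory landmarks." As a rough stand-in for that idea (not Microsoft's actual algorithm), one can score each event by the self-information of its features, so rare locations and organizers rise to the top:

```python
from collections import Counter
from math import log2

def landmark_scores(events):
    """Rank calendar events by how atypical their features are.

    Lifebrowser reportedly uses Bayesian machine learning to find atypical,
    memorable events; this stand-in scores each event by the
    self-information (-log2 p) of its feature values, so rare locations
    and organizers score highest. Field names are illustrative.
    """
    # Count how often each (field, value) pair occurs across all events.
    freq = Counter((f, v) for e in events for f, v in e.items())
    n = len(events)

    def surprise(e):
        return sum(-log2(freq[(f, v)] / n) for f, v in e.items())

    return sorted(events, key=surprise, reverse=True)

events = [
    {"location": "office", "organizer": "boss"},
    {"location": "office", "organizer": "boss"},
    {"location": "office", "organizer": "boss"},
    {"location": "Paris", "organizer": "conference chair"},  # the landmark
]
print(landmark_scores(events)[0])
```

Three routine office meetings score near zero; the one-off Paris trip dominates, which is the intuition behind surfacing "landmark" events on a timeline.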

Sounds like the start towards a “personal” topic map authoring application.

One important detail: With MS Lifebrowser the user is gathering information on themselves.

Not the same as having Google or Facebook gathering information on you. Is it?

October 29, 2011

Printer Dots, Pervasive Tracking and the Transparent Society

Filed under: Pervasive Tracking,Privacy,Transparency — Patrick Durusau @ 7:32 pm

Printer Dots, Pervasive Tracking and the Transparent Society

From the post:

So far in the fingerprinting series, we’ve seen how a variety of objects and physical devices [1, 2, 3, 4], often even supposedly identical ones, can be uniquely fingerprinted. This article is non-technical; it is an opinion on some philosophical questions about tracking and surveillance.

Here’s a fascinating example of tracking that’s all around you but that you’re probably unaware of:

Color laser printers and photocopiers print small yellow dots on every page for tracking purposes.

My source for this is the EFF’s Seth Schoen, who has made his presentation on the subject available.

If you are tracking the provenance of data in your topic map, does that mean that you are also tracking the users who submitted it?

And is that tracking “transparent” to the users who are being tracked or only “transparent” to the aggregators of that submitted content?

Or is that tracking only “transparent” to the sysops who are working one level above the aggregators?

And at any level, what techniques would you suggest for tracking data, whether transparent or not?

For that matter, what techniques would you suggest for detecting tracking?
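For the printer dots specifically, detection is easier than decoding: the dots are yellow precisely because yellow on white paper is nearly invisible to the eye but trivial to isolate programmatically. A toy sketch (it flags yellow-ish pixels in a scanned page's RGB grid; decoding the actual encoded serial number and timestamp is a separate, printer-specific task):

```python
def find_yellow_dots(pixels, min_rgb=200, max_blue=120):
    """Locate faint yellow dots in an RGB pixel grid from a scanned page.

    Color laser printers embed tracking dots in yellow, which is nearly
    invisible on white paper. This is only a toy detector: it flags pixels
    that are strong in red and green but weak in blue. Decoding the dot
    pattern itself (printer serial, timestamp) is printer-specific.
    """
    hits = []
    for y, row in enumerate(pixels):
        for x, (r, g, b) in enumerate(row):
            if r >= min_rgb and g >= min_rgb and b <= max_blue:
                hits.append((x, y))
    return hits

white = (255, 255, 255)
yellow = (250, 240, 40)
page = [
    [white, white, white],
    [white, yellow, white],
    [yellow, white, white],
]
print(find_yellow_dots(page))  # → [(1, 1), (0, 2)]
```

The same trick is why the EFF could document the dots at all: invert the blue channel of a scan and the "invisible" grid jumps out.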

Ironic that “transparent” government may require that some (all?) of its citizens and their transactions be “transparent” as well. How else to track the web of influence of lobbyists of various sorts if their transactions are not “transparent?” Which then means their employers will need to be “transparent.” Along with the objects of their largesse and attention. And the web of actors just keeps spreading out.
