Another Word For It Patrick Durusau on Topic Maps and Semantic Diversity

December 28, 2015

Voter Record Privacy? WTF?

Filed under: Cybersecurity,Government,Government Data,Privacy — Patrick Durusau @ 10:23 pm

Leaky database tramples privacy of 191 million American voters by Dell Cameron.

From the post:

The voter information of more than 191 million Americans—including full names, dates of birth, home addresses, and more—was exposed online for anyone who knew the right IP address.

The misconfigured database, which was reportedly shut down at around 7pm ET Monday night, was discovered by security researcher Chris Vickery. Less than two weeks ago, Vickery also exposed a flaw in MacKeeper’s database, similarly exposing 13 million customer records.

What amazes me about this “leak” is that the outrage focuses on the 191+ million records being online.

??

What about the six or seven organizations who denied being the owners of the IP address in question?

I take it none of them denied having possession of the same or essentially the same data, just that they didn’t “leak” it.

Quick question: Was voter privacy breached when these six or seven organizations got the same data or when it went online?

I would say when the Gang of Six or Seven got the same data.

You don’t have any meaningful voter privacy, aside from your actual ballot, and with your credit record (also for sale), your voting behavior can be nailed down too.

You don’t have privacy but the Gang of Six or Seven do.

Attempting to protect lost privacy is pointless.

Making corporate overlords lose their privacy as well has promise.

PS: Torrents of corporate overlord data? Much more interesting than voter data.

December 27, 2015

China Pulls Alongside US in Race to No Privacy

Filed under: Censorship,Government,Privacy — Patrick Durusau @ 8:07 pm

China passes law requiring tech firms to hand over encryption keys by Mark Wilson.

From the post:

Apple may have said that it opposes the idea of weakening encryption and providing governments with backdoors into products, but things are rather different in China. The Chinese parliament has just passed a law that requires technology companies to comply with government requests for information, including handing over encryption keys.

Mark doesn’t provide a link to the text of the new law and I don’t read Chinese in any event. I will look for an English translation to pass on to you.

Reading from Mark’s summary, I assume “handing over encryption keys” puts China alongside the United States as far as breaking into iPhones.

Apple doesn’t have the encryption keys for later models of iPhones and therefore possesses nothing to be surrendered.

Now that China is even with the United States, who will take the lead in diminishing privacy is a toss-up. Not to be forgotten is France, with its ongoing “state of emergency.” Will that become a permanent state of emergency in 2016?

December 18, 2015

Cybersecurity Act of 2015 – Text

Filed under: Cybersecurity,Government,Law,Law - Sources,Privacy — Patrick Durusau @ 8:41 pm

Coverage of the “omnibus” bill and the Cybersecurity Act of 2015 has been everywhere on the Web but nary a pointer to the text passed by Congress.

Wouldn’t you rather read the text for yourself than have it summarized?

At this point, the only text I can point you to is in the Congressional Record for December 17, 2015.

The Cybersecurity Act of 2015 is in subsection N, which begins in the right-hand column of page H9631 and continues to the top of the right-hand column of page H9645.

Please ask media outlets, bloggers and others to include pointers to court decisions, legislation, etc. with their stories.

It’s a small thing but a big step towards an interconnected web of information, as opposed to the current disconnected web.

December 16, 2015

Privacy Alert! – CISA By Friday (18 December 2015) Time to Raise Hell!

Filed under: Cybersecurity,Government,Privacy,Security — Patrick Durusau @ 9:19 pm

Lawmakers Have Snuck CISA Into a Bill That Is Guaranteed to Become a Law by Jason Koebler.

From the post:

To anyone who has protested the sweeping, vague, and privacy-killing iterations of the Cybersecurity Information Sharing and Protection Act or the Cybersecurity Information Sharing Act over the last several years, sorry, lawmakers have heard you, and they have ignored you.

That sounds bleak, but lawmakers have stripped the very bad CISA bill of almost all of its privacy protections and have inserted the full text of it into a bill that is essentially guaranteed to be passed and will certainly not be vetoed by President Obama.

CISA allows private companies to pass your personal information and online goings-on to the federal government and local law enforcement if it suspects a “cybersecurity threat,” a term so broadly defined that it can apply to “anomalous patterns of communication” and can be used to gather information about just about any crime, cyber or not.

At 2 AM Wednesday morning, Speaker of the House Paul Ryan unveiled a 2000-page budget bill that will fund the federal government well into next year. The omnibus spending bill, as it’s usually referred to, is the result of countless hours of backroom dealings and negotiations between Republicans and Democrats.

Without the budget bill (or a short-term emergency measure), the government shuts down, as it did in 2013 for 16 days when lawmakers couldn’t reach a budget deal. It contains dozens of measures that make the country run, and once it’s released and agreed to, it’s basically a guarantee to pass. Voting against it or vetoing it is politically costly, which is kind of the point: Republicans get some things they want, Democrats get some things they want, no one is totally happy but they live with it anyway. This is how countless pieces of bad legislation get passed in America—as riders on extremely important pieces of legislation that are politically difficult to vote against.

See Jason’s post for the full story, but you get the gist of it: your privacy rights will be largely terminated this coming Friday.

I don’t accept Jason’s fatalism, however.

There still remains time for members of Congress to strip the rider from the budget bill, but only if everyone raises hell with their representatives and senators between now and Friday.

We need to overload every switchboard in the 202 area code with legitimate, personal calls to representatives and senators. Fill up every voice mail box, every online message storage, etc.

Those of you with personal phone numbers, put them to good use. Call, now!

This may not make any difference, but members of Congress can’t say they weren’t warned before taking this fateful step.

When Congress signals it doesn’t care about our privacy, then we damned sure don’t have to care about theirs.

We Know How You Feel [A Future Where Computers Remain Imbeciles]

We Know How You Feel by Raffi Khatchadourian.

From the post:

Three years ago, archivists at A.T. & T. stumbled upon a rare fragment of computer history: a short film that Jim Henson produced for Ma Bell, in 1963. Henson had been hired to make the film for a conference that the company was convening to showcase its strengths in machine-to-machine communication. Told to devise a faux robot that believed it functioned better than a person, he came up with a cocky, boxy, jittery, bleeping Muppet on wheels. “This is computer H14,” it proclaims as the film begins. “Data program readout: number fourteen ninety-two per cent H2SOSO.” (Robots of that era always seemed obligated to initiate speech with senseless jargon.) “Begin subject: Man and the Machine,” it continues. “The machine possesses supreme intelligence, a faultless memory, and a beautiful soul.” A blast of exhaust from one of its ports vaporizes a passing bird. “Correction,” it says. “The machine does not have a soul. It has no bothersome emotions. While mere mortals wallow in a sea of emotionalism, the machine is busy digesting vast oceans of information in a single all-encompassing gulp.” H14 then takes such a gulp, which proves overwhelming. Ticking and whirring, it begs for a human mechanic; seconds later, it explodes.

The film, titled “Robot,” captures the aspirations that computer scientists held half a century ago (to build boxes of flawless logic), as well as the social anxieties that people felt about those aspirations (that such machines, by design or by accident, posed a threat). Henson’s film offered something else, too: a critique—echoed on television and in novels but dismissed by computer engineers—that, no matter a system’s capacity for errorless calculation, it will remain inflexible and fundamentally unintelligent until the people who design it consider emotions less bothersome. H14, like all computers in the real world, was an imbecile.

Today, machines seem to get better every day at digesting vast gulps of information—and they remain as emotionally inert as ever. But since the nineteen-nineties a small number of researchers have been working to give computers the capacity to read our feelings and react, in ways that have come to seem startlingly human. Experts on the voice have trained computers to identify deep patterns in vocal pitch, rhythm, and intensity; their software can scan a conversation between a woman and a child and determine if the woman is a mother, whether she is looking the child in the eye, whether she is angry or frustrated or joyful. Other machines can measure sentiment by assessing the arrangement of our words, or by reading our gestures. Still others can do so from facial expressions.

Our faces are organs of emotional communication; by some estimates, we transmit more data with our expressions than with what we say, and a few pioneers dedicated to decoding this information have made tremendous progress. Perhaps the most successful is an Egyptian scientist living near Boston, Rana el Kaliouby. Her company, Affectiva, formed in 2009, has been ranked by the business press as one of the country’s fastest-growing startups, and Kaliouby, thirty-six, has been called a “rock star.” There is good money in emotionally responsive machines, it turns out. For Kaliouby, this is no surprise: soon, she is certain, they will be ubiquitous.

This is a very compelling look at efforts that have, in practice, made computers more responsive to users’ emotions, with the goal of influencing users based on the emotions detected.

Sound creepy already?

The article is fairly long but a great insight into progress already being made and that will be made in the not too distant future.

However, “emotionally responsive machines” remain the same imbeciles as they were in the story of H14. That is to say they can only “recognize” emotions much as they can “recognize” color. To be sure, such a machine “learns,” but its reaction upon recognition remains a matter of programming and/or training.
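To make the “recognition is just classification” point concrete, here is a toy sketch. The features, numbers, and labels are all invented for illustration and bear no relation to Affectiva’s actual models:

```python
# Toy illustration: "recognizing" an emotion is just classification over
# measured features, exactly like recognizing a color from RGB values.
# The feature names and centroid values below are invented.

import math

# Hypothetical acoustic features: (mean pitch in Hz, intensity in dB)
CENTROIDS = {
    "angry":   (220.0, 75.0),
    "joyful":  (260.0, 68.0),
    "neutral": (180.0, 60.0),
}

def classify_emotion(pitch_hz, intensity_db):
    """Return the label of the nearest centroid -- pure pattern matching."""
    def dist(label):
        p, i = CENTROIDS[label]
        return math.hypot(pitch_hz - p, intensity_db - i)
    return min(CENTROIDS, key=dist)

print(classify_emotion(225.0, 74.0))  # -> angry
```

Whatever the machine does with that label afterward is, as noted, a matter of programming, not understanding.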

The next wave of startups will create programmable emotional images of speakers, pushing the privacy arms race another step down the road. If I were investing in startups, I would concentrate on those that defeat emotionally responsive computers.

If you don’t want to wait for a high tech way to defeat emotionally responsive computers, may I suggest a fairly low tech solution:

Wear a mask!

One of my favorites:

[Image: Egyptian Guy Fawkes mask]

(From https://commons.wikimedia.org/wiki/Category:Masks_of_Guy_Fawkes. There are several unusual images there.)

Or choose any number of other masks at your nearest variety store.

A hard mask that conceals your eyes and movement of your face will defeat any “emotionally responsive computer.”

If you are concerned about your voice giving you away, a search for “voice changer” returns over 4 million “hits” for software to alter your vocal characteristics, much of it free.
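At its crudest, a voice changer just resamples the waveform so the same samples play back at a different pitch. A toy sketch (real tools use phase vocoders to preserve duration; this naive version shortens or stretches the audio):

```python
# Crude pitch shifting by linear-interpolation resampling. A factor > 1
# raises pitch (and shortens the clip); a factor < 1 lowers it.

import math

def pitch_shift(samples, factor):
    """Resample by linear interpolation; factor > 1 raises pitch."""
    out = []
    pos = 0.0
    while pos < len(samples) - 1:
        i = int(pos)
        frac = pos - i
        out.append(samples[i] * (1 - frac) + samples[i + 1] * frac)
        pos += factor
    return out

# A 100-sample sine wave shifted up by a factor of 2 comes back ~50 samples.
wave = [math.sin(2 * math.pi * 5 * n / 100) for n in range(100)]
shifted = pitch_shift(wave, 2.0)
print(len(shifted))  # -> 50
```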

Defeating “emotionally responsive computers” remains like playing checkers against an imbecile. If you lose, it’s your own damned fault.

PS: If you have a Max Headroom type TV and don’t want to wear a mask all the time, consider this solution for its camera:

[Image: cutting tool]

Any startups yet based on defeating the Internet of Things (IoT)? Predicting 2016/17 will be the year for those to take off.

December 14, 2015

Fixing Bugs In Production

Filed under: Humor,Privacy,Programming,Security — Patrick Durusau @ 8:48 pm

MΛHDI posted this to twitter and it is too good not to share:

Amusing now but what happens when the illusion of “static data” disappears and economic activity data is streamed from every transaction point?

Your code and analysis will need to specify the time boundaries of the data that underlie them. Depending on the level of detail, results may quickly become outdated as new data streams in.
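A minimal sketch of what specifying those time boundaries might look like in code, so a result carries its own window and can be invalidated as new data arrives (the transaction records and field names are invented):

```python
# Pin an analysis to explicit time boundaries so the result can be labeled,
# and later superseded, as new transactions stream in. Records are invented.

from datetime import datetime, timedelta

transactions = [
    {"time": datetime(2015, 12, 14, 9, 0), "amount": 40.0},
    {"time": datetime(2015, 12, 14, 9, 30), "amount": 25.0},
    {"time": datetime(2015, 12, 14, 10, 15), "amount": 60.0},
]

def windowed_total(records, start, width):
    """Total of amounts within [start, start + width); the window is part of the result."""
    end = start + width
    total = sum(r["amount"] for r in records if start <= r["time"] < end)
    return {"window": (start, end), "total": total}

result = windowed_total(transactions, datetime(2015, 12, 14, 9, 0), timedelta(hours=1))
print(result["total"])  # -> 65.0
```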

To do the level of surveillance that law enforcement longs for in cases like the San Bernardino attack, you would need real-time sales transaction data for the last 5 years, plus bank records and “see something, say something” reports on 322+ million citizens of the United States.

Now imagine fixing bugs in that production code, when arrest and detention, if not more severe consequences await.

December 9, 2015

How Effective Is Phone Data Mining?

Filed under: Government,Privacy — Patrick Durusau @ 11:04 am

If you missed Drug Agents Use Vast Phone Trove, Eclipsing N.S.A.’s by Scott Shane and Colin Moynihan when it first appeared in 2013, take a look at it now.

From the post:

For at least six years, law enforcement officials working on a counternarcotics program have had routine access, using subpoenas, to an enormous AT&T database that contains the records of decades of Americans’ phone calls — parallel to but covering a far longer time than the National Security Agency’s hotly disputed collection of phone call logs.

The Hemisphere Project, a partnership between federal and local drug officials and AT&T that has not previously been reported, involves an extremely close association between the government and the telecommunications giant.

The government pays AT&T to place its employees in drug-fighting units around the country. Those employees sit alongside Drug Enforcement Administration agents and local detectives and supply them with the phone data from as far back as 1987.

The project comes to light at a time of vigorous public debate over the proper limits on government surveillance and on the relationship between government agencies and communications companies. It offers the most significant look to date at the use of such large-scale data for law enforcement, rather than for national security.

The leaked presentation slides that inform this article claim some success stories but don’t offer an accounting for the effort expended and its successes.

Beyond the privacy implications, the potential for governmental overreaching, etc., there remains the question of how much benefit is being gained for the cost of the program.

Rather than an airy policy debate, numbers on expenditures and results could empower a far more pragmatic debate on this program.

I don’t doubt the success stories but random chance dictates that some drug dealers will be captured every year, whatever law enforcement methods are in place.

More data on phone data mining by the Drug Enforcement Administration could illustrate how effective or ineffective such mining is in the enforcement of drug laws. Given the widespread availability of drugs, I am anticipating a low score on that test.
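The base-rate arithmetic behind that expectation is easy to sketch. Every number below is invented for illustration, but the shape of the result holds for any dragnet aimed at a rare target population:

```python
# Back-of-the-envelope base-rate arithmetic (all numbers invented): even a
# seemingly accurate dragnet flags mostly innocent people when real targets
# are a tiny fraction of everyone being monitored.

population = 322_000_000      # roughly everyone under surveillance
targets = 10_000              # actual offenders (invented)
sensitivity = 0.99            # chance a real target is flagged
false_positive_rate = 0.001   # chance an innocent person is flagged

true_hits = targets * sensitivity
false_hits = (population - targets) * false_positive_rate
precision = true_hits / (true_hits + false_hits)

print(f"flagged: {true_hits + false_hits:,.0f}")
print(f"chance a flagged person is a real target: {precision:.1%}")  # ~3%
```

Success stories are guaranteed by the sheer volume of flags; they say nothing about whether the program beats random chance for the money spent.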

Should that prove to be the case, it will be additional empirical evidence to challenge the same methods being used, ineffectively, in the prosecution of the “war” on terrorism.

Proving that such methods are ineffectual in addition to being violations of privacy rights could be what tips the balance in favor of ending all such surveillance techniques.

December 7, 2015

Untraceable communication — guaranteed

Filed under: Cybersecurity,Privacy,Security — Patrick Durusau @ 8:50 pm

Untraceable communication — guaranteed by Larry Hardesty.

From the post:

Anonymity networks, which sit on top of the public Internet, are designed to conceal people’s Web-browsing habits from prying eyes. The most popular of these, Tor, has been around for more than a decade and is used by millions of people every day.

Recent research, however, has shown that adversaries can infer a great deal about the sources of supposedly anonymous communications by monitoring data traffic though just a few well-chosen nodes in an anonymity network. At the Association for Computing Machinery Symposium on Operating Systems Principles in October, a team of MIT researchers presented a new, untraceable text-messaging system designed to thwart even the most powerful of adversaries.

The system provides a strong mathematical guarantee of user anonymity, while, according to experimental results, permitting the exchange of text messages once a minute or so.

“Tor operates under the assumption that there’s not a global adversary that’s paying attention to every single link in the world,” says Nickolai Zeldovich, an associate professor of computer science and engineering, whose group developed the new system. “Maybe these days this is not as good of an assumption. Tor also assumes that no single bad guy controls a large number of nodes in their system. We’re also now thinking, maybe there are people who can compromise half of your servers.”

Because the system confuses adversaries by drowning telltale traffic patterns in spurious information, or “noise,” its creators have dubbed it “Vuvuzela,” after the noisemakers favored by soccer fans at the 2010 World Cup in South Africa.

Pay particular attention to the generation of dummy messages as “noise.”
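The core of the dummy-traffic idea can be sketched in a few lines. This is an illustration of the principle, not the paper’s actual protocol: every client emits exactly one fixed-size message per round, padding with a random dummy when it has nothing to say, so an observer counting messages learns nothing from traffic volume.

```python
# Sketch of Vuvuzela-style cover traffic (not the paper's actual protocol):
# one fixed-size message per client per round, real or dummy.

import os

MSG_SIZE = 256  # fixed message size in bytes

def next_outgoing(real_message=None):
    """Return the bytes to send this round: the padded real message or a dummy."""
    if real_message is not None:
        padded = real_message.encode()[:MSG_SIZE]
        return padded + os.urandom(MSG_SIZE - len(padded))
    return os.urandom(MSG_SIZE)  # dummy; indistinguishable once encrypted

rounds = [next_outgoing("hello"), next_outgoing(), next_outgoing()]
print({len(m) for m in rounds})  # -> {256}
```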

In topic map terms, I would say that the identity of the association between a sender and a particular message, or between a receiver and a particular message, has been obscured.

That is the reverse of the usual application of topic map principles, which is a strong indication that the means used to identify those associations are themselves establishing associations and their identities. Perhaps not in traditional TMDM terms, but they are associations with identities nonetheless.

For some unknown reason, the original post did not have a link to the article, Vuvuzela: Scalable Private Messaging Resistant to Traffic Analysis by Jelle van den Hooff, David Lazar, Matei Zaharia, and Nickolai Zeldovich.

The non-technical post concludes:

“The mechanism that [the MIT researchers] use for hiding communication patterns is a very insightful and interesting application of differential privacy,” says Michael Walfish, an associate professor of computer science at New York University. “Differential privacy is a very deep and sophisticated theory. The observation that you could use differential privacy to solve their problem, and the way they use it, is the coolest thing about the work. The result is a system that is not ready for deployment tomorrow but still, within this category of Tor-inspired academic systems, has the best results so far. It has major limitations, but it’s exciting, and it opens the door to something potentially derived from it in the not-too-distant future.”

It isn’t clear how such a system would defeat an adversary that has access to all the relevant nodes, where “relevant nodes” means a manageable subset of all the possible nodes in the world. It’s unlikely that any adversary, aside from the NSA, CIA and other known money pits, would attempt to monitor all network traffic.

But monitoring all network traffic is both counter-productive and unnecessary. In general, one does not set out from the Washington Monument in the search of spies based in the United States. Or at least people who hope to catch spies don’t. I can’t speak for the NSA or CIA.

While you could search for messages between people unknown to you, that sounds like a very low-grade ore mining project. You could find a diamond in the rough, but it’s unlikely.

The robustness of this proposal should assume that both the sender and receiver have been identified and their network traffic is being monitored.

I think what I am groping towards is the notion that “noise” comes too late in this proposal. If either party is known or suspected, it may be time-consuming to complete the loop on the messages, but adding noise at the servers is more of an annoyance than serious security.

At least when the adversary can effectively monitor the relevant nodes. Assuming that the adversary can’t perform such monitoring seems like a risky proposition.

Thoughts?

December 6, 2015

Does Your Hello Barbie Have An STD? (IIoT)

Filed under: Cybersecurity,Privacy,Security — Patrick Durusau @ 3:36 pm

[STD = Security Transmitted Disease]

Internet-connected Hello Barbie doll gets bitten by nasty POODLE crypto bug by Dan Goodin.

From the post:

A recent review of the Internet-connected Hello Barbie doll from toymaker Mattel uncovered several red flags. Not only did the toy use a weak authentication mechanism that made it possible for attackers to monitor communications the doll sent to servers, but those servers were also vulnerable to POODLE, an attack disclosed 14 months ago that breaks HTTPS encryption.

The vulnerabilities, laid out in a report published Friday by security firm Bluebox Labs, are the latest black eye for so-called “Internet of Things” devices. The term is applied to appliances and other everyday devices that are connected to the Internet, supposedly to give them a wider range of capabilities. The Hello Barbie doll is able to hold real-time conversations by uploading the words a child says to a server. Instant processing on the server then allows the doll to provide an appropriate response.

Bluebox researchers uncovered a variety of weaknesses in the iOS and Android app developed by Mattel partner ToyTalk. The apps are used to connect the doll to nearby Wi-Fi networks. The researchers also reported vulnerabilities in the remote server used to communicate with the doll.

Insecure baby monitors, hacked dolls are only the leading edges of the Insecure Internet of Things (IIoT).

Dan’s post has the details of the Security-Transmitted-Disease (STD) that can infect Hello Barbie servers and hence the dolls themselves.
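The POODLE half of the problem, at least, has had a well-known fix since the attack was disclosed: refuse SSL 3.0 outright. In Python’s standard `ssl` module, for example, the secure configuration is one line (and the modern defaults already include it):

```python
# POODLE only works if a connection can be downgraded to SSL 3.0. A client
# or server that refuses SSLv3 outright is immune.

import ssl

context = ssl.create_default_context()  # modern defaults already exclude SSLv2/SSLv3
context.options |= ssl.OP_NO_SSLv3     # make the exclusion explicit anyway

print(bool(context.options & ssl.OP_NO_SSLv3))  # -> True
```

ToyTalk’s servers were vulnerable 14 months after that fix was common knowledge.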

When dolls, toys and other devices develop video capabilities, amateur porn will explode on the Insecure Internet of Things (IIoT), with or without the consent of the participants.

If you want a secure Internet of Things, avoid the sieve-stacking strategy of current software fixes, which layers broken security software on top of broken software:

[Image: the present IT stack with security layered on top]

Software security starts from the bottom of your software stack and goes upward.

For all the wailing of software developers about the inability to have perfect software, realize that SQL injection attacks were the #1 attack in 2013. That is more than fifteen years after the attack was first documented.

Don’t buy into the “we can build perfect software” scam. No one is expecting perfect software, just software that doesn’t have 5+ year old flaws in it.
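The fix for that fifteen-year-old flaw has been known just as long: parameterized queries. A minimal demonstration with `sqlite3` from the Python standard library:

```python
# SQL injection and its standard fix, side by side.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"  # classic injection payload

# Vulnerable: string concatenation lets the payload rewrite the query.
vulnerable = conn.execute(
    "SELECT role FROM users WHERE name = '" + user_input + "'").fetchall()

# Safe: the driver treats the payload as an inert literal value.
safe = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)).fetchall()

print(vulnerable)  # -> [('admin',)]  the payload matched every row
print(safe)        # -> []            no user is literally named the payload
```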

Is that too much to expect?

Heavy civil penalties for 5+ year old bugs in software might help the software industry remember to avoid such bugs.

December 5, 2015

JAERO: Classic Aero SatCom ACARS signals [Alert: Rendition Trackers]

Filed under: Government,Privacy,Security — Patrick Durusau @ 9:28 pm

JAERO: A program to demodulate and decode Classic Aero SatCom ACARS signals by Jonathan Olds.

From the webpage:

JAERO is a program that demodulates and decodes Classic Aero ACARS (Aircraft Communications Addressing and Reporting System) messages sent from satellites to Aeroplanes (SatCom ACARS) commonly used when Aeroplanes are beyond VHF range. Demodulation is performed using the soundcard. Such signals are typically around 1.5Ghz and can be received with a simple low gain antenna that can be home brewed in a few hours in conjunction with a cheap RTL-SDR dongle.

In the advent of MH370, Classic Aero has become a well-known name. A quick search on the net using “Classic Aero MH370” will produce thousands of results. The Classic Aero signals sent from satellites to the Aeroplanes are what JAERO demodulates and decodes.

I am sure rendition trackers have these and even more sophisticated passive tracking capabilities, but I pass this on as a possible starting place for would-be civilian surveillance specialists.

Governments are obsessed with surveillance, so much so that civilians need to return the favor, building passive and distributed systems of surveillance that surpass anything governments can obtain.

It is an interesting hobby to intercept such signals, which are falling in your yard right now, and an even more interesting one if you capture those signals and share them with other hobbyists. Perhaps even with some data science types who can munge the data to bring out interesting insights, such as on rendition flights.

A couple of tips: Disguise your external receiving equipment to look like a disgruntled satellite TV subscriber (is there any other kind?). Probably a good idea to not discuss monitoring aircraft or other government activities at the local barber shop.

I’m not a conspiracy theorist about government activities but if they didn’t intend someone harm, then why do they keep secrets? If you think about it, none of the many decried data leaks over the last 50 years have resulted in a single American being harmed.

Some of them were embarrassed and probably should have gone to jail (Oliver North) but for all the conjured harms of data leaks, not one has made a whit of difference.

Makes me wonder if secrecy is a means to conceal incompetence and venal criminal wrongdoing.

You?

Become a Mini-CIA Today!

Filed under: Government,Privacy,Security — Patrick Durusau @ 3:54 pm

New software watches for license plates, turning you into Little Brother by Cyrus Farivar.

From the post:

We now live in a world where if you have an IP-enabled security camera, you can download some free, open-source software from GitHub and boom—you have a fully functional automated license plate reader (ALPR, or LPR).

How very cool!

I know some privacy advocates may be troubled by this development but think of the old adage:

When guns are outlawed, only outlaws will have guns.

Applying that to surveillance:

When surveillance is outlawed, only outlaws will have surveillance.

Yes?

With the OpenALPR software, neighborhoods that see a lot of police violence can track and alert residents to repeat offenders who are entering the area. Or drug dealers, pimps or other scourges of modern urban areas.
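Once software such as OpenALPR turns camera frames into plate strings, the neighborhood-alert layer is little more than a watchlist lookup. A sketch, with invented plate numbers and labels:

```python
# Match ALPR output against a neighborhood watchlist. The plate strings
# would come from software such as OpenALPR; everything here is invented.

WATCHLIST = {
    "ABC1234": "repeat offender",
    "XYZ9876": "cruiser with malfunctioning cameras",
}

def check_plates(recognized_plates):
    """Return (plate, reason) for every recognized plate on the watchlist."""
    return [(p, WATCHLIST[p]) for p in recognized_plates if p in WATCHLIST]

reads = ["QRS5555", "ABC1234", "LMN0001"]
print(check_plates(reads))  # -> [('ABC1234', 'repeat offender')]
```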

And it would be a counter to suddenly malfunctioning dashboard and body cameras worn by the police.

As I have mentioned before, there are far more citizens than government-based agents. If we start surveillance on them, they will have no place to hide and nowhere left to run.

Being IP enabled, you could set up a central monitoring station, possibly sharing information with citizens interested in real time “traffic” information.

PS: If you keep the video or scanning results, be sure it is streamed to a heavily encrypted drive.

December 4, 2015

Leaving Pakistan in the Stone Ages

Filed under: Cybersecurity,Government,Privacy,Security — Patrick Durusau @ 10:47 am

BlackBerry gets bounced from Pakistan after saying no to backdoors by John Zorabedian.

From the post:

BlackBerry is saying “no” to government backdoor access to communications on its services and devices, in actions that speak louder than words.

Earlier this week, BlackBerry announced it is shutting down its operations in Pakistan and will leave the country by 30 December, after refusing to provide Pakistan’s government with backdoor access to its customers’ communications.

Marty Beard, BlackBerry’s chief operating officer, wrote on the company’s blog that the Pakistan Telecommunications Authority told mobile phone operators in July that BlackBerry would no longer be allowed to operate in the country for “security reasons.”

Beard said that Pakistan wanted unfettered access to all emails, BBM messages and other Blackberry Enterprise Service (BES) traffic, but the company refused on principle:

Finally, a viable alternative to bombing countries back into the Stone Ages. Just leave them technologically in the Stone Ages. See how their citizens, businesses, crime lords, etc. feel about that!

If Pakistan was demanding backdoors from BlackBerry, you have to wonder what demands have been made on other communication service providers. Yes?

One hopes that such service providers, including those that control the pipes in and out of Pakistan will take stances like that of BlackBerry.

Would be sad to see Pakistan suddenly go dark on the Web but choices do have consequences. Isolated from the modern communications networks used by other countries, government officials will have lots of time for their paranoid fantasies.

Best prepare for a sudden exit of capital and bright minds from Pakistan once it goes dark. Not altogether sure communications should be restored even if the government changes. Let it stay dark as a lesson about governmental overreaching for the rest of the world.

PS: If you want a less extreme lesson first, cut Pakistan off of the Internet for a week, just as a warning to its government.

December 2, 2015

Signal Desktop beta! [End-to-end encryption] e3

Filed under: Cybersecurity,Government,Privacy,Security — Patrick Durusau @ 2:36 pm

Signal Desktop beta!

From the post:

Today we’re making the Signal Desktop beta available. Signal Desktop brings the trusted private messaging experience of Signal to the desktop, with a simplicity that allows you to seamlessly continue conversations back and forth between your mobile device and your desktop computer.

Private messaging, now with all ten fingers

As always, everything is end-to-end encrypted and painstakingly engineered in order to keep your communication safe – allowing you to send high-quality private group, text, picture, and video messages for free.

(graphic omitted)

Multiple devices, single identifier

Signal Desktop is a Chrome app which links with your phone, so all incoming and outgoing messages are displayed consistently on all your devices. Your contacts don’t have to guess where to message you, and when you switch devices the conversation you started will already be there.

(graphic omitted)

Android devices only, for now

For the initial Signal desktop beta, only linking to Android devices is supported. Follow us on twitter for updates on when the iOS app supports Signal Desktop.

View source

All of our code is free, open source, and available on GitHub. This allows experts to verify our protocols and our implementations.

Like everything we do at Open Whisper Systems, dedicated development is supported by community donations and grants. Signal Desktop contains no advertisements, and it doesn’t cost anything to use.

Terrorists don’t use encrypted messaging, but that is not a reason for you to avoid end-to-end encryption.

From the How do I help? page:

  • Spread the word – Open WhisperSystems is a collaborative open source project and does not have a dedicated PR department. We rely on our users to help explain the benefits of using our software. Friends don’t let friends send plaintext!
  • Help your fellow users – Join our mailing list and assist new and existing users, or help shape the future of the project by participating in the discussions that are held there. We also appreciate assistance with marking issues as duplicates in our GitHub repos, answering questions that are raised in the form of issues, helping to reproduce bugs while providing additional details, or performing other forms of triage.
  • Contribute code – If you have Android or iOS development experience, please consider helping us tackle some of the open issues in our repositories. Pull requests for new features and functionality are also welcome.
  • Contribute money – BitHub is our experiment in collaborative funding for open source projects. Donating to our BitHub pool provides an additional incentive for developers to contribute their work and time. You can also donate to Open WhisperSystems via the Freedom of the Press Foundation.

The Open WhisperSystems project is where it is today because of your help and support. Thank you!

To market end-to-end encryption, would e3 be a good logo?

Ex. If e3 is banned, only criminals will have e3.

November 28, 2015

Better Than An Erector Set! — The Deep Sweep (2015)

Filed under: Government,Privacy — Patrick Durusau @ 4:53 pm

The Deep Sweep (2015) High-altitude Signal Research

(graphic omitted)

From the introduction:

The Deep Sweep is an aerospace probe scanning the otherwise out-of-reach signal space between land and stratosphere, with special interest placed in UAV/drone to satellite communication.

Taking the form of a high-altitude weather balloon, tiny embedded computer and RF equipment, The Deep Sweep project is being developed to function as a low-cost, aerial signal-intelligence (SIGINT) platform. Intended for assembly and deployment by public, it enables surveying and studying the vast and often secretive world of signal in our skies.

Two launches have been performed so far, from sites in Germany, landing in Poland and Belarus respectively.

We intend to make many more, in Europe and beyond.

What a cool homebrew project!

Warning: There are legitimate concerns for air safety when performing this type of research. Governments that engage in questionable practices with UAV/drone hardware are unlikely to welcome detection of their nefarious activities.

I liked the notion of bugs (surveillance devices) that “bite” upon discovery in Marooned in Realtime. Depending upon your appetite for risk, you may want to consider such measures in a hostile environment.

The biggest risk of the narrated approach is that you have to physically recover the probe. All sorts of things could go sideways depending on your operating environment.

Still, a good read and quite instructive on what has been done.

Future improvements could include capturing data, injecting data, and taking control of UAV/drone vehicles that are not yours, just to name a few.

Up to you to create what comes next.

November 4, 2015

The Disappearance of Privacy in the UK

Filed under: Government,Privacy,Security — Patrick Durusau @ 2:00 pm

Investigatory Powers Bill: what’s in it, and what does it mean? by Matt Burgess.

From the post:

Internet service providers will have to store the details of every website people visited for 12 months if the new draft Investigatory Powers Bill is passed, the government has confirmed.

The measure was announced by Home Secretary Theresa May in the House of Commons and is included in a raft of new powers intended to reform the way MI5, MI6, GCHQ, and others use surveillance powers.

May said that “communication records up to 12 months” will have to be stored by internet and communications service providers.

This means the individual webpage — “just the front page of the websites,” in May’s words — will be kept. She distinguished between domains visited and “content” — including individual pages, searches and other information — which will not be stored.

In a lengthy statement to parliament, May reiterated that the powers were intended to allow security services to protect the public, and particularly children, against threats including terrorism, organised crime and sexual predators.

At least from the standpoint of protecting the public and children from organized crime and sexual predators, full monitoring of government offices would do more good than surveillance of the general public.

As far as terrorism, people in the UK, those old enough to remember less pleasant times in Northern Ireland, know that the modern “terrorism” is a fiction, wrapped in a lie and hidden behind national security interests.

The interests of the security agencies and their contractors are the only ones being served by concerns over “terrorism.”

The Investigatory Powers Bill, all 299 pages, is online.

Curious, is anyone working on a comparison of the Investigatory Powers Bill and the Patriot Act?

The full text of the Patriot Act (Public Law version).

I have read snippets of the Patriot Act but not in its entirety. It’s a difficult read because it amends some existing statutes, inserts entirely new content and creates new statutes as well.

A comparison of these two offenses against the citizens of the UK and the US, respectively, might prove to be useful in opposing them.

With the caveat that whatever new outrages against citizens are contained in the UK bill will be doubled down by the US against its own.

I first saw this in a tweet by Simon Brunning.

October 28, 2015

Twitter – Tying Your Twitter Account to SMS-Enabled Phone

Filed under: Cybersecurity,Government,NSA,Privacy,Security — Patrick Durusau @ 3:30 pm

I tried to create a new Twitter account today but much to my surprise I could not use a phone number already in use by another Twitter account.

Moreover, the phone number has to be of an SMS-enabled phone.

I understand the need for security but you do realize that the SMS-enabled phone requirement ties your Twitter account to a particular phone. Yes?

Now, who was it that was tracking all phone traffic?

Oh, I remember, Justice Department plotting to resume NSA bulk phone records collection, it was the NSA!

The number of government mis-steps and outrages in just a few months is enough to drive earlier ones from immediate memory. It’s sad to have a government that is this deeply incompetent and dishonest.

The SMS-enabled phone requirement of Twitter makes binding your Twitter posts to a specific phone easy.

Although it will be portrayed as requiring sophisticated analysis tools in order to justify the NSA’s budget.

Suggestion: Twitter should display the SMS code on a page returned to the browser requesting an account.

Unless of course, Twitter has already joined itself at the hip to the NSA.

October 26, 2015

perv_magnet (10 years of troll abuse published by violinist)

Filed under: Privacy — Patrick Durusau @ 2:24 pm

10 years of troll abuse published by violinist by Lisa Vaas.

From the post:


Matsumiya, who’s based in Los Angeles, describes herself as a violinist and a “perv magnet.” Reading the messages published on Instagram under her perv_magnet account, it’s obvious that she’s not exaggerating.

She’s using Instagram to demonstrate the violence, aggression and volume of messages she’s received and captured via screenshot over 10 years, in an effort to show how relentlessly women are abused online.

I won’t quote any of the abusive messages but you can see the entire collection at: perv_magnet.

Lisa does a good job of covering the issue and possible responses and then says:

The rest of us non-celebrities must bear in mind that taking on trolls can have extremely dangerous consequences.

I’m thinking here of swatting.

Unless you are cultivating marijuana in the basement with grow lights or running a meth lab, I’m not sure how dangerous “swatting” is to an ordinary citizen.

To be sure, don’t make sudden moves when heavily armed police burst into your home but how many of us are likely to be cleaning an AK-47 or RPG when that happens?

The real danger in dealing with trolls is going it alone. One on one, since trolls don’t appear to have any life beyond their closets and keyboards, a troll can spend more time on you than you can possibly spare to reply to them.

But what if there were a network of troll fighters? Could the average troll deal with five people responding? What about 50? For really troublesome trolls, how about 500? Or more? In combination with hackers who push back against them in the real world.

No one wants government regulation of the web so it falls to its users to reduce any cause for government intervention.

Simple ignorance on my part but are there any anti-troll networks currently in operation? I would like to volunteer some cycles to their efforts.

PS: I am untroubled by the “freedom of speech” argument of trolls. In the struggle between abusers and the abused, I chose sides a long time ago. Helps get down to concrete cases without a lot of theoretical machinery.

October 18, 2015

Drone Registration Coming! Call the NRA!

Filed under: Government,Privacy — Patrick Durusau @ 7:09 pm

US government will reportedly require all drone purchases to be registered by Chris Welch.

From the post:

The US government plans to make it a mandatory requirement that all drone purchases, including those made by consumers, be formally registered. NBC News reports that the Department of Transportation will announce the new plan on Monday, with hopes to have this drone registry implemented by the holidays, when drones will likely prove a popular gift. The Obama administration and DoT have yet to announce any such press conference for Monday.

Chris promises more details so follow @chriswelch.

Registration of drones isn’t going to help regulate drones, unless of course the drones have identifying marks and/or broadcast their registration. Yes?

In other words, registration of drones is a means of further government surveillance on where and when you fly your drone.

If you want an unregistered drone, buy one before regulations requiring registration go into effect.

So long as you are obeying all aviation laws, the government has no right to know where and when you fly your drone.

Hopefully the NRA will realize that preserving gun ownership where the government tracks:

  • All your phone calls.
  • All your emails.
  • All your web traffic.
  • All your cell phones.
  • All your credit cards.
  • All your purchases.
  • All your use of drones.

isn’t all that meaningful by itself.

Tracking the government and its servants is a first step towards ending the current surveillance state.

October 6, 2015

Promises Of Stronger Protections From A Habitual Liar?

Filed under: Cybersecurity,Government,Privacy,Security — Patrick Durusau @ 7:41 pm

Europe’s highest court strikes down Safe Harbor data sharing between EU, US by Sebastian Anthony.

From the post:

Europe’s top court, the Court of Justice of the European Union (CJEU), has struck down the 15-year-old Safe Harbour agreement that allowed the free flow of information between the US and EU. The most significant repercussion of this ruling is that American companies, such as Facebook, Google, and Twitter, may not be allowed to send user data from Europe back to the US.

The full text of the decision is available online.

A repeated theme in discussion of this decision is the need for stronger promises by the U.S. to protect European privacy rights.

I’ll be the first to admit that I don’t follow some segments of the news very closely but surely most people have heard of Edward Snowden. Yes?

I won’t recite the history of his disclosures here but suffice it to say that his revelations establish beyond any doubt that the United States government has systematically disobeyed its own laws and the laws of other countries in surveillance and other areas. If that weren’t bad enough, the U.S. government has repeatedly lied to the people it governs and to other countries.

Let’s assume that the United States government agrees to very strong provisions for guarding the privacy of EU citizens. On what basis would you trust such a promise? A government willing to break its own laws and to lie to its own people certainly will have no qualms about lying to other countries.

In litigation that challenges any future agreement on the transfer of user data from Europe to the United States, the Court of Justice of the European Union (CJEU) should take judicial notice that the United States is in fact a habitual liar and its word counts for nothing in its proceedings.

I don’t know how long it will take the United States to regain credibility in international courts but it has fully and well earned the designation “habitual liar” in present proceedings.

September 18, 2015

Tor relay turned back on after unanimous library vote

Filed under: Censorship,Privacy — Patrick Durusau @ 9:44 am

Tor relay turned back on after unanimous library vote by Lisa Vaas.

From the post:

Live free or die.

That, possibly the most well-known of US state mottos, is declared on vehicle license plates throughout the verdant, mountainous, cantankerous state of New Hampshire.

True to that in-your-face independence, on Tuesday evening, in the New Hampshire town of Lebanon, the Lebanon Libraries board unanimously seized freedom and privacy by flipping the bird to the Department of Homeland Security (DHS) and its grudge against the Tor network.

Dozens of community members had come to the meeting to chime in on whether the Kilton Public Library should go ahead with a project to set up a Tor relay: a project that was shelved after a DHS agent reached out to warn New Hampshire police – or, as some classify it, spread FUD – that Tor shields criminals.

Boston librarian Alison Macrina, the mastermind behind the Library Freedom Project (LFP) and its plan to install exit nodes in libraries in collaboration with the Tor Project, said in an article on Slate (co-authored with digital rights activist April Glaser) that the unanimous vote to reinstate the library’s Tor relay was greeted with enthusiasm:

When library director Sean Fleming declared that the relay would go back online, a huge round of applause rang out. The citizens of Lebanon fought to protect privacy and intellectual freedom from the Department of Homeland Security’s intimidation tactics - and they won.

One bright spot of news in the flood of paranoid reports concerning terrorism and government demands for greater surveillance of everyone.

If you aren’t running Tor you should be.

Privacy is everyone’s concern.

September 16, 2015

Elliptic Curve Cryptography: a gentle introduction

Filed under: Cryptography,Privacy — Patrick Durusau @ 9:06 pm

Elliptic Curve Cryptography: a gentle introduction by Andrea Corbellini.

From the post:

Those of you who know what public-key cryptography is may have already heard of ECC, ECDH or ECDSA. The first is an acronym for Elliptic Curve Cryptography, the others are names for algorithms based on it.

Today, we can find elliptic curves cryptosystems in TLS, PGP and SSH, which are just three of the main technologies on which the modern web and IT world are based. Not to mention Bitcoin and other cryptocurrencies.

Before ECC became popular, almost all public-key algorithms were based on RSA, DSA, and DH, alternative cryptosystems based on modular arithmetic. RSA and friends are still very important today, and often are used alongside ECC. However, while the magic behind RSA and friends can be easily explained, is widely understood, and rough implementations can be written quite easily, the foundations of ECC are still a mystery to most.

With a series of blog posts I’m going to give you a gentle introduction to the world of elliptic curve cryptography. My aim is not to provide a complete and detailed guide to ECC (the web is full of information on the subject), but to provide a simple overview of what ECC is and why it is considered secure, without losing time on long mathematical proofs or boring implementation details. I will also give helpful examples together with visual interactive tools and scripts to play with.

Specifically, here are the topics I’ll touch:

  1. Elliptic curves over real numbers and the group law (covered in this blog post)
  2. Elliptic curves over finite fields and the discrete logarithm problem
  3. Key pair generation and two ECC algorithms: ECDH and ECDSA
  4. Algorithms for breaking ECC security, and a comparison with RSA

In order to understand what’s written here, you’ll need to know some basic stuff of set theory, geometry and modular arithmetic, and have familiarity with symmetric and asymmetric cryptography. Lastly, you need to have a clear idea of what an “easy” problem is, what a “hard” problem is, and their roles in cryptography.

Ready? Let’s start!

Whether you can make it through this series of posts or not, it remains a great URL to have show up in a public terminal’s web browsing history.

Even if you aren’t planning on “going dark,” you can do your part to create noise that will cover those who do.

Take the opportunity to visit this site and other cryptography resources. Like the frozen North, they may not be around for your grandchildren to see.
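If you want to play along before diving into the series, the group law it opens with fits in a few lines of Python. The curve y² = x³ + 2x + 2 over GF(17) below is a common classroom example, not a curve anyone should use in practice, and the point G = (5, 1) generates a group of order 19. (The modular inverse via `pow(x, -1, P)` needs Python 3.8+.)

```python
# Toy elliptic-curve group law over a small prime field: the curve
# y^2 = x^3 + 2x + 2 (mod 17), a classroom example only.
P = 17          # field prime (tiny, for illustration)
A, B = 2, 2     # curve coefficients

def on_curve(pt):
    if pt is None:          # None stands for the point at infinity (identity)
        return True
    x, y = pt
    return (y * y - (x ** 3 + A * x + B)) % P == 0

def add(p, q):
    """Group law: chord-and-tangent point addition."""
    if p is None: return q
    if q is None: return p
    x1, y1 = p
    x2, y2 = q
    if x1 == x2 and (y1 + y2) % P == 0:
        return None         # p + (-p) = identity
    if p == q:              # tangent slope for doubling
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
    else:                   # chord slope
        m = (y2 - y1) * pow(x2 - x1, -1, P) % P
    x3 = (m * m - x1 - x2) % P
    y3 = (m * (x1 - x3) - y1) % P
    return (x3, y3)

def scalar_mult(k, pt):
    """Double-and-add: computes k * pt."""
    result = None
    while k:
        if k & 1:
            result = add(result, pt)
        pt = add(pt, pt)
        k >>= 1
    return result

G = (5, 1)                          # generates a group of order 19
assert on_curve(G)
assert add(G, G) == (6, 3)          # 2G
assert scalar_mult(19, G) is None   # 19G = identity, confirming the order
```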

Theoretical Encryption Horror Stories

Filed under: Cybersecurity,Privacy — Patrick Durusau @ 8:45 pm

FBI Keeps Telling Purely Theoretical Encryption Horror Stories by Jenna McLaughlin.

Jenna reports the best quote I have seen from FBI Director James Comey on the criminals “going dark:”


Previous examples provided by FBI Director James Comey in October to illustrate the dangers of “going dark” turned out to be almost laughable. Comey acknowledged at the time that he had “asked my folks just to canvas” for examples he could use, “but I don’t think I’ve found that one yet.” Then he immediately added: “I’m not looking.”

Jenna’s post should be read verbatim into every committee, sub-committee and other hearing conducted by Congress on encryption issues.

What is more disturbing than the FBI lacking evidence for its position on encryption, and neglecting even to look for any, is that FBI representatives still appear as witnesses in court and before Congress, and are still taken seriously by the news media.

What other group could admit that their “facts” were in truth fantasies and still be taken seriously by anyone?

The FBI should return to the pursuit of legitimate criminals (of which there appears to be no shortage) or be ignored and disbelieved by everyone starting with judges and ending with the news media.

September 3, 2015

Poor Fantasy Adulterers [Ashley Madison]

Filed under: Cybersecurity,Privacy,Security — Patrick Durusau @ 1:59 pm

Farhad Manjoo writes in Hacking Victims Deserve Empathy, Not Ridicule:


But the theft and disclosure of more than 30 million accounts from Ashley Madison, a site that advertises itself as a place for married people to discreetly set up extramarital affairs, is different. After the hacking, many victims have been plunged into the depths of despair. In addition to those contemplating suicide, dozens have told Mr. Hunt that they feared losing their jobs and families, and they expected to be humiliated among friends and co-workers.

But the victims of the Ashley Madison hacking deserve our sympathy and aid because, with slightly different luck, you or I could just as easily find ourselves in a similarly sorry situation. This breach stands as a monument to the blind trust many of us have placed in our computers — and how powerless we all are to evade the disasters that may befall us when the trust turns out to be misplaced.

Being seen by your spouse at a high-end restaurant when you are “working late,” or your spouse finding condoms (which you don’t use at home) in your jacket, or your boss seeing you exiting a co-worker’s hotel room in a state of undress, differs from a cyberhack outing in what way?

All of those cases would induce fear of losing family, job, and humiliation among friends and co-workers. Yes?

We know now that almost no women used the Ashley Madison site, so truth in advertising leads to: “Life’s short. Have a fantasy affair.”

The Ashley Madison data should be made publicly available to everyone.

None of the people verified as giving Ashley Madison credit card data and a profile should ever be given access to any IT system. Ever. (full stop)

Anyone giving information that could be used for blackmail purposes to an online adultery site is a security risk. Best to weed them out of your IT system post-haste.

Victims in a VISA, Mastercard or the OMB hack are different. They supplied information for legitimate purposes and the act of submission carries no potential for blackmail.

Ashley Madison customers supplied personal data, knowing their membership could be used for blackmail purposes.

Perhaps that is too subtle a distinction for the New York Times or the Ashley Madison data has an abundance of yet undisclosed email addresses.

August 16, 2015

AT&T’s Betrayal of Its Customers

Filed under: Privacy,Security — Patrick Durusau @ 6:56 pm

NSA Spying Relies on AT&T’s ‘Extreme Willingness to Help’ by Julia Angwin and Jeff Larson, ProPublica; Charlie Savage and James Risen, The New York Times; and Henrik Moltke and Laura Poitras, special to ProPublica.

From the post:

The National Security Agency’s ability to spy on vast quantities of Internet traffic passing through the United States has relied on its extraordinary, decades-long partnership with a single company: the telecom giant AT&T.

While it has been long known that American telecommunications companies worked closely with the spy agency, newly disclosed NSA documents show that the relationship with AT&T has been considered unique and especially productive. One document described it as “highly collaborative,” while another lauded the company’s “extreme willingness to help.”

Timelines, source documents, and analysis sketch a damning outline of AT&T’s betrayal of its customers over more than a decade.

If you are an AT&T customer, this article is a must read. If you know someone who is an AT&T customer, please forward this article to their attention. Post it to Facebook, Twitter, etc.

You may not be able to force changes in government spy programs but as customers, collectively we can impact the bottom line of their co-conspirators.

I saw a cartoon that is a fair take on government rhetoric in this area today:

(graphic omitted)

The word to pass on to vendors is: You can be my friend or a friend of the government. Choose carefully.

August 6, 2015

Are You In The Enron Dataset?

Filed under: Privacy — Patrick Durusau @ 4:11 pm

I am still laboring, along with Sam Hunting, to put the final touches on our Balisage presentation: Spreadsheets – 90+ million End User Programmers With No Comment Tracking or Version Control.

Before you ask, yes, yes it does use topic maps to address the semantic darkness that are spreadsheets. 😉

The reason for this post was that I ran across a spreadsheet today that listed both public and private phone numbers for a large number of oil & gas types. Too old now to be much of a bother but just an FYI that prior to checking big data sets, check for private phone numbers as well as SSNs.
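A minimal sketch of that kind of pre-release check: scan a dump for strings shaped like US SSNs or phone numbers before sharing it. The two regular expressions are illustrative assumptions only; real PII detection needs far more care than two patterns.

```python
# Flag SSN- and phone-shaped strings in text before publishing a dataset.
# Patterns are deliberately simple and illustrative, not exhaustive.
import re

SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
PHONE = re.compile(r"(?:\+1[ .-]?)?(?:\(\d{3}\)|\d{3})[ .-]?\d{3}[ .-]?\d{4}\b")

def flag_pii(text):
    """Return any SSN- or phone-shaped strings found in text."""
    return SSN.findall(text) + PHONE.findall(text)

row = "J. Smith, office (713) 555-0142, SSN 078-05-1120"
print(flag_pii(row))   # ['078-05-1120', '(713) 555-0142']
```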

BTW, I was rather amazed at the number of things “spreadsheets” are used for in practice. Auto-processing would create nearly as many problems as it would solve. I have seen documentation, unnamed content, letters, and other material. Does it never occur to anyone to use a word processor?

July 15, 2015

ProxyHam’s early demise… [+ an alternative]

Filed under: Cybersecurity,Privacy,Security — Patrick Durusau @ 2:48 pm

ProxyHam’s early demise gives way to new and improved privacy devices by Dan Goodin.

From the post:

Privacy advocates disappointed about the sudden and unexplained demise of the ProxyHam device for connecting to the Internet have reason to cheer up: there are two similarly low-cost boxes that do the same thing or even better.

The more impressive of the two is the ProxyGambit, a $235 device that allows people to access an Internet connection from anywhere in the world without revealing their true location or IP address. One-upping the ProxyHam, its radio link can offer a range of up to six miles, more than double the 2.5 miles of the ProxyHam. More significantly, it can use a reverse-tunneled GSM bridge that connects to the Internet and exits through a wireless network anywhere in the world, a capability that provides even greater range.

A bit pricey and 2.5 miles doesn’t sound like a lot to me.

With Charter Communications as my cable provider, my router’s apparent location is twenty (20) miles from where I actually am, which makes for odd results when sites try to show the store “nearest to” my physical location.

Of course, Charter knows the actual service address and I have no illusions about my cable provider throwing themselves on a grenade to save me. Or a national security letter.

With a little investigation you can put some distance between your apparent and physical locations for free in some instances, bearing in mind that anyone who knows where you physically are can find you.

Think of security as a continuum that runs from being broadcast live at a public event to lesser degrees of openness. The question always is how much privacy is useful to you at what cost?

Google Data Leak!

Filed under: Censorship,Privacy — Patrick Durusau @ 2:07 pm

Google accidentally reveals data on ‘right to be forgotten’ requests by Sylvia Tippman and Julia Powles.

From the post:

Less than 5% of nearly 220,000 individual requests made to Google to selectively remove links to online information concern criminals, politicians and high-profile public figures, the Guardian has learned, with more than 95% of requests coming from everyday members of the public.

The Guardian has discovered new data hidden in source code on Google’s own transparency report that indicates the scale and flavour of the types of requests being dealt with by Google – information it has always refused to make public. The data covers more than three-quarters of all requests to date.

Previously, more emphasis has been placed on selective information concerning the more sensational examples of so-called right to be forgotten requests released by Google and reported by some of the media, which have largely ignored the majority of requests made by citizens concerned with protecting their personal privacy.

It is a true data leak but not nearly as exciting as it sounds. If you follow the Explore the data link, you will find a link to “snapshots on WayBack Machine” that will provide access to the data now scrubbed from Google transparency reports. Starting about three months ago the data simply disappeared from the transparency reports.

Here is an example from the February 4th report as saved by the WayBack Machine:

"GB": {
  "name": "United Kingdom",
  "requests": {
    "all": {"rejected": 11308, "total": 26979, "pending": 989, "complied": 8527, "need_more_info": 4050},
    "issues": {
      "serious_crime": {"rejected": 483, "total": 694, "pending": 28, "complied": 93, "need_more_info": 90},
      "cp": {"rejected": 260, "total": 339, "pending": 11, "complied": 29, "need_more_info": 39},
      "political": {"rejected": 83, "total": 117, "pending": 4, "complied": 19, "need_more_info": 11},
      "private_personal_info": {"rejected": 10185, "total": 23217, "pending": 934, "complied": 8201, "need_more_info": 3857},
      "public_figure": {"rejected": 156, "total": 220, "pending": 12, "complied": 38, "need_more_info": 13}
    }
  },
  "urls": {
    "all": {"rejected": 55731, "total": 105337, "pending": 3677, "complied": 29148, "need_more_info": 15429},
    "issues": {
      "serious_crime": {"rejected": 2413, "total": 3249, "pending": 81, "complied": 298, "need_more_info": 455},
      "cp": {"rejected": 1160, "total": 1417, "pending": 22, "complied": 90, "need_more_info": 144},
      "political": {"rejected": 345, "total": 482, "pending": 17, "complied": 58, "need_more_info": 59},
      "private_personal_info": {"rejected": 49926, "total": 97413, "pending": 3442, "complied": 28118, "need_more_info": 14603},
      "public_figure": {"rejected": 1430, "total": 1834, "pending": 115, "complied": 190, "need_more_info": 95}
    }
  }
},
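A few lines of Python against the UK numbers in that snapshot make the pattern easy to verify: the “private_personal_info” category alone accounts for roughly 86% of UK requests.

```python
# Recomputing the share of "private personal info" requests from the
# UK figures quoted in the February 4th snapshot above.
uk = {
    "all": {"total": 26979, "complied": 8527, "rejected": 11308},
    "issues": {
        "serious_crime": {"total": 694},
        "cp": {"total": 339},
        "political": {"total": 117},
        "private_personal_info": {"total": 23217},
        "public_figure": {"total": 220},
    },
}
private = uk["issues"]["private_personal_info"]["total"]
share = 100 * private / uk["all"]["total"]
print(f"{share:.1f}% of UK requests concern private personal info")  # 86.1%
```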

The post concludes with:

Dr Paul Bernal, lecturer in technology and media law at the UEA School of Law, argues that the data reveals that the right to be forgotten seems to be a legitimate piece of law. “If most of the requests are private and personal ones, then it’s a good law for the individuals concerned. It seems there is a need for this – and people go for it for genuine reasons.”

On the contrary, consider this chart (from the Guardian explore the data page):

(graphic omitted)

The data shows that 96% of the requests are likely to have one searcher, the person making the request.

If the EU wants to indulge such individuals, it should create a traveling “Board of the Right to Be Forgotten,” populate it with judges, clerks, transcribers, translators, etc. that visits every country in the EU on some regular schedule and holds televised hearings for every applicant and publishes written decisions (in all EU languages) on which links should be delisted from Google.

That would fund the travel, housing, and entertainment industries in the EU, a perennial feature of EU funding, and relieve Google of the distraction of such cases. It would also establish a transparent record of the self-obsessed who request delisting of facts from a search engine, and of the facts deleted.

Decisions by a “Board of the Right to Be Forgotten” would also enable the monetization of requests to be forgotten, by easing the creation of search engines that only report facts “forgotten” by Google. Winners all the way around!

July 2, 2015

The Big Lie About the Islamic State of Iraq and Syria (ISIS) and Social Media

Filed under: Privacy,Security — Patrick Durusau @ 1:31 pm

Jim Comey, ISIS, and “Going Dark” by Benjamin Wittes.

From the post:

FBI Director James Comey said Thursday his agency does not yet have the capabilities to limit ISIS attempts to recruit Americans through social media.

It is becoming increasingly apparent that Americans are gravitating toward the militant organization by engaging with ISIS online, Comey said, but he told reporters that “we don’t have the capability we need” to keep the “troubled minds” at home.

“Our job is to find needles in a nationwide haystack, needles that are increasingly invisible to us because of end-to-end encryption,” Comey said. “This is the ‘going dark’ problem in high definition.”

Comey said ISIS is increasingly communicating with Americans via mobile apps that are difficult for the FBI to decrypt. He also explained that he had to balance the desire to intercept the communication with broader privacy concerns.

“It is a really, really hard problem, but the collision that’s going on between important privacy concerns and public safety is significant enough that we have to figure out a way to solve it,” Comey said.

Let’s unpack this.

As has been widely reported, the FBI has been busy recently dealing with ISIS threats. There have been a bunch of arrests, both because ISIS has gotten extremely good at inducing self-radicalization in disaffected souls worldwide using Twitter and because of the convergence of Ramadan and the run-up to the July 4 holiday.

Just as an empirical matter, phrases like “…ISIS has gotten extremely good at…” should be discarded as noise. You have heard of the three teenage girls from the UK who “attempted” to join the Islamic State of Iraq and Syria. Taking the “teenage” population of the UK to fall between 10 and 19 years of age, the UK teen population for 2014 was 7,667,000.

Three (3) teens from a population of 7,667,000 doesn’t sound like “…extremely good…” recruitment to me.
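Spelling out the arithmetic behind that judgment, using the post’s own figures:

```python
# The back-of-the-envelope recruitment rate implied above: 3 recruits
# out of a UK teen population of 7,667,000 (the 2014 figure quoted).
teens = 7_667_000
recruits = 3
rate = recruits / teens
print(f"{rate * 100:.6f}%")    # 0.000039% of UK teens
```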

Moreover, unless they have amended the US Constitution quite recently, as a US citizen I am free to read publications by any organization on the face of the Earth. There are some minor exceptions for child pornography, but political speech, which Islamic State of Iraq and Syria publications clearly fall under, enjoys the highest level of constitutional protection.

Unlike the Big Lie statements about Islamic State of Iraq and Syria and social media, there is empirical research on the impact of surveillance on First Amendment rights:

This article brings First Amendment theory into conversation with social science research. The studies surveyed here show that surveillance has certain effects that directly implicate the theories behind the First Amendment, beyond merely causing people to stop speaking when they know they are being watched. Specifically, this article finds that social science research supports the protection of reader and viewer privacy under many of the theories used to justify First Amendment protection.

If the First Amendment serves to foster a marketplace of ideas, surveillance thwarts this purpose by preventing the development of minority ideas. Research indicates that surveillance more strongly affects those who do not yet hold strong views than those who do.

If the First Amendment serves to encourage democratic self-governance, surveillance thwarts this purpose as well. Surveillance discourages individuals with unformed ideas from deviating from majority political views. And if the First Amendment is intended to allow the fullest development of the autonomous self, surveillance interferes with autonomy. Surveillance encourages individuals to follow what they think others expect of them and conform to perceived norms instead of engaging in unhampered self-development.

The quote is from the introduction to: The Conforming Effect: First Amendment Implications of Surveillance, Beyond Chilling Speech by Margot E. Kaminski and Shane Witnov. (Kaminski, Margot E. and Witnov, Shane, The Conforming Effect: First Amendment Implications of Surveillance, Beyond Chilling Speech (January 1, 2015). University of Richmond Law Review, Vol. 49, 2015; Ohio State Public Law Working Paper No. 288. Available at SSRN: http://ssrn.com/abstract=2550385)

The abstract from Kaminski and Witnov reads:

First Amendment jurisprudence is wary not only of direct bans on speech, but of the chilling effect. A growing number of scholars have suggested that chilling arises from more than just a threat of overbroad enforcement — surveillance has a chilling effect on both speech and intellectual inquiries. Surveillance of intellectual habits, these scholars suggest, implicates First Amendment values. However, courts and legislatures have been divided in their understanding of the extent to which surveillance chills speech and thus causes First Amendment harms.

This article brings First Amendment theory into conversation with social psychology to show that not only is there empirical support for the idea that surveillance chills speech, but surveillance has additional consequences that implicate multiple theories of the First Amendment. We call these consequences “the conforming effect.” Surveillance causes individuals to conform their behavior to perceived group norms, even when they are unaware that they are conforming. Under multiple theories of the First Amendment — the marketplace of ideas, democratic self-governance, autonomy theory, and cultural democracy — these studies suggest that surveillance’s effects on speech are broad. Courts and legislatures should keep these effects in mind.

Conformity to the standard US line on Islamic State of Iraq and Syria is the more likely goal of FBI Director James Comey than stopping “successful” Islamic State of Iraq and Syria recruitment over social media.

The article also looks at First Amendment cases, including one that is directly on point for ISIS social media:

The Supreme Court has stated that laws that deter the expression of minority viewpoints by airing the identities of their holders are also unconstitutional. In Lamont, the Court found unconstitutional a requirement that mail recipients write in to request communist literature. The state had an impermissible role in identifying this minority viewpoint and condemning it. The Court reasoned that “any addressee is likely to feel some inhibition in sending for literature which federal officials have condemned as ‘communist political propaganda.’” The individual’s inhibition stems from the state’s obvious condemnation, but also from a fear of social repercussions (by the state as an employer). The Court found that the requirement that a person identify herself as a communist “is almost certain to have a deterrent effect.” (I omitted the footnote numbers for ease of reading.)

For more on Lamont (8-0), see: Lamont vs. Postmaster General (Wikipedia). Lamont v. Postmaster General 381 U.S. 301 (1965) Text of the decision, Justia.

The date of the decision is important as well: 1965. In 1965, China and Russia had a total combined population of 842 million people. One assumes the vast majority of whom were communists.

Despite the presence of a potential 842 million communists in the world, the Lamont Court found that even chilling access to communist literature was not permitted under the United States Constitution.

Before someone challenges my claim that social media has not been successful for the Islamic State of Iraq and Syria, remember that the Islamic State of Iraq and Syria has recruited at best 30,000 fighters from outside of Syria, with no hard evidence on whether they were motivated by social media or not.

Even assuming the 30,000 were all recruited via social media, how does that compare to the 842 million known communists of 1965?
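As a rough sanity check on that comparison, using the figures already quoted in these posts (and treating the 30,000 as the upper bound it is):

```python
# Compare the upper-bound ISIS foreign-fighter estimate against the
# 1965 communist population that left the Lamont Court untroubled.
isis_recruits = 30_000           # upper-bound estimate from the post
communists_1965 = 842_000_000    # combined China + Russia population, 1965 (from the post)

ratio = communists_1965 / isis_recruits
print(f"1965 figure is roughly {ratio:,.0f}x larger")
```

The 1965 figure dwarfs the recruitment estimate by four orders of magnitude, which is the scale the argument turns on.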

Why is Comey so frightened of a few thousand people? Frightened enough to abridge the freedom of speech rights of every American, and who knows what he wants to do to non-Americans.

As best I understand the goals of the Islamic State of Iraq and Syria, the overriding one is to have a Muslim government that is not subservient to Western powers. I don’t find that remotely threatening. Whether the Islamic State of Iraq and Syria will be the one to establish such a government is unclear. Governing is far more tedious and difficult than taking territory.

A truly Muslim government would be a far cry from the favoritism and intrigue that has characterized Western relationships with Muslim governments for centuries.

Citizens of the United States are in more danger from the FBI than they ever will be from members of the Islamic State of Iraq and Syria. Keep that in mind when you hear FBI Director James Comey talk about surveillance. The target of that surveillance is you.

July 1, 2015

GCHQ has legal immunity to reverse-engineer…

Filed under: Cybersecurity,Privacy,Security — Patrick Durusau @ 7:56 pm

GCHQ has legal immunity to reverse-engineer Kaspersky antivirus, crypto by Glyn Moody.

From the post:

Newly-published documents from the Snowden trove show GCHQ asking for and obtaining special permission to infringe on the copyright of software programs that it wished to reverse-engineer for the purpose of compromising them. GCHQ wanted a warrant that would give it indemnity against legal action from the companies owning the software in the unlikely event that they ever found out.

The legal justification for this permission is dubious. As the new report in The Intercept explains: “GCHQ obtained its warrant under section 5 of the 1994 Intelligence Services Act [ISA], which covers interference with property and ‘wireless telegraphy’ by the Security Service (MI5), Secret Intelligence Service (MI6) and GCHQ.” Significantly, Section 5 of the ISA does not mention interference in abstractions like copyright, but in 2005 the intelligence services commissioner approved the activity anyway.

It is difficult to say whether the de-legitimization of laws and government by intelligence agencies is a deliberate strategy.

Whether intended or not, it has become clear that the privacy rights of citizens, the property rights of commercial entities, and even the marketability of commercial software and services have no meaning for the United States government.

Technology companies, enterprises of all types, citizens, and others need to unite to return government to its legitimate goals, one of which is respecting the rights of citizens, the property rights of enterprises, and the reputations of technology companies in the worldwide market.

Of what use is a global market if US vendors are so distrusted, due to government interference with their products, that their market share dwindles?

GCHQ has availed itself of legal fictions much as the United States did with the so-called torture memos. All involved should be aware that no regime reigns forever.
