Another Word For It Patrick Durusau on Topic Maps and Semantic Diversity

October 10, 2018

Passwords: Philology, Security, Authentication

Filed under: Cryptography,Humanities,Security — Patrick Durusau @ 4:29 pm

Passwords: Philology, Security, Authentication by Brian Lennon.

Disclaimer: I haven’t seen Passwords yet, but its description and reviews prompted me to mention it here.

That and finding an essay with the same title, verbatim, by the same author, published in Diacritics, Volume 43.1 (2015): 82-104. Try: Passwords: Philology, Security, Authentication. (Hosted on the Academia site, so you will need an account (free) to download the essay (also free).)

From the publisher:

Cryptology, the mathematical and technical science of ciphers and codes, and philology, the humanistic study of natural or human languages, are typically understood as separate domains of activity. But Brian Lennon contends that these two domains, both concerned with authentication of text, should be viewed as contiguous. He argues that computing’s humanistic applications are as historically important as its mathematical and technical ones. What is more, these humanistic uses, no less than cryptological ones, are marked and constrained by the priorities of security and military institutions devoted to fighting wars and decoding intelligence.

Lennon’s history encompasses the first documented techniques for the statistical analysis of text, early experiments in mechanized literary analysis, electromechanical and electronic code-breaking and machine translation, early literary data processing, the computational philology of late twentieth-century humanities computing, and early twenty-first-century digital humanities. Throughout, Passwords makes clear the continuity between cryptology and philology, showing how the same practices flourish in literary study and in conditions of war.

Lennon emphasizes the convergence of cryptology and philology in the modern digital password. Like philologists, hackers use computational methods to break open the secrets coded in text. One of their preferred tools is the dictionary, that preeminent product of the philologist’s scholarly labor, which supplies the raw material for computational processing of natural language. Thus does the historic overlap of cryptology and philology persist in an artifact of computing—passwords—that many of us use every day.
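The dictionary’s role in password cracking is easy to make concrete. A minimal sketch in Python (the wordlist and target hash are illustrative assumptions of mine, not drawn from the book):

```python
import hashlib

def dictionary_attack(target_hash: str, wordlist: list) -> str:
    """Hash each candidate word and compare against the target; return the match, if any."""
    for word in wordlist:
        if hashlib.sha256(word.encode()).hexdigest() == target_hash:
            return word
    return None

# The philologist's dictionary, repurposed: common words make weak passwords.
target = hashlib.sha256(b"dragon").hexdigest()
print(dictionary_attack(target, ["letmein", "password", "dragon"]))  # dragon
```

Real attacks add salts, slow hashes, and mangling rules, but the core loop is exactly this: the scholar’s wordlist as raw material.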

Reviews (from the website):

“Passwords is a fascinating book. What is especially impressive is the author’s deft and knowing engagements with both the long histories of computational text processing and the many discourses that make up literary philology. This is just the sort of work that the present mania for the digital demands, and yet books that actually live up to those demands are few and far between. Lennon is one of the few scholars who is even capable of managing that feat, and he does so here with style and erudition.”—David Golumbia, Virginia Commonwealth University

“A stunning intervention, Passwords rivets our attention to the long history of our present fascination with the digital humanities. Through a series of close, contextual readings, from ninth-century Arabic philology and medieval European debates on language to twentieth-century stylometry and machine translation, this book recalls us to a series of engagements with language about which ‘all of us—we scholars, we philologists,’ as Lennon puts it, ought to know more. Passwords is eloquent and timely, and it offers a form of deep, institutional-lexical study, which schools us in a refusal to subordinate scholarship in the humanities to the identitarian and stabilizing imperatives of the national-security state.”—Jeffrey Sacks, University of California, Riverside

Not surprisingly, I think a great deal was lost when the humanities, especially those areas focused on language, stopped interacting with computer science, sometime after the development of the first compilers, though I don’t know that history in detail. Suggested reading?

February 8, 2018

Introducing HacSpec (“specification language for cryptographic primitives”)

Filed under: Cryptography,Cybersecurity,Security — Patrick Durusau @ 2:58 pm

Introducing HacSpec by Franziskus Kiefer.

From the post:

HacSpec is a proposal for a new specification language for cryptographic primitives that is succinct, that is easy to read and implement, and that lends itself to formal verification. It aims to formalise the pseudocode used in cryptographic standards by proposing a formal syntax that can be checked for simple errors. HacSpec specifications are further executable to test against test vectors specified in a common syntax.

The main focus of HacSpec is to allow specifications to be compiled to formal languages such as cryptol, coq, F*, and easycrypt and thus make it easier to formally verify implementations. This allows a specification using HacSpec to be the basis not only for implementations but also for formal proofs of functional correctness, cryptographic security, and side-channel resistance.
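To give a flavor of what an executable specification checked against test vectors means in practice, here is the ChaCha20 quarter-round written as plain runnable Python (an illustration only, not actual HacSpec syntax), checked against the test vector from RFC 7539:

```python
MASK32 = 0xffffffff

def rotl32(x: int, n: int) -> int:
    """32-bit left rotation."""
    return ((x << n) | (x >> (32 - n))) & MASK32

def quarter_round(a: int, b: int, c: int, d: int):
    """The ChaCha20 quarter-round, following the RFC pseudocode."""
    a = (a + b) & MASK32; d = rotl32(d ^ a, 16)
    c = (c + d) & MASK32; b = rotl32(b ^ c, 12)
    a = (a + b) & MASK32; d = rotl32(d ^ a, 8)
    c = (c + d) & MASK32; b = rotl32(b ^ c, 7)
    return a, b, c, d

# Test vector from RFC 7539, section 2.1.1
assert quarter_round(0x11111111, 0x01020304, 0x9b8d6f43, 0x01234567) == \
    (0xea2a92f4, 0xcb1cf8ce, 0x4581472e, 0x5881c4bb)
```

A standard written this way can be run against its own test vectors, which is precisely the "checked for simple errors" property the proposal is after.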

The idea of having a language like HacSpec stems from discussions at the recent HACS workshop in Zurich. The High-Assurance-Cryptographic-Software workshop (HACS) is an invite-only workshop co-located with the Real World Crypto symposium.

Anyone interested in moving this project forward should subscribe to the mailing list or file issues and pull requests against the Github repository.

Cryptography projects should be monitored the way the NSA monitors NIST cryptography standards. If you see an error or weakness, you’re under no obligation to help. The NSA won’t.

Given security fails from software, users, etc., end-to-end encryption resembles transporting people from one homeless camp to another in an armored car.

Secure in transit but not secure at either end.

January 18, 2017

Quantum Computer Resistant Encryption

Filed under: Cryptography,Quantum,Security — Patrick Durusau @ 10:37 am

Irish Teen Introduces New Encryption System Resistant to Quantum Computers by Joseph Young.

From the post:


… a 16-year-old student was named as Ireland’s top young scientist and technologist of 2017, after demonstrating the application of qCrypt, which offers higher levels of protection, privacy and encryption in comparison to other innovative and widely-used cryptographic systems.

BT Young Scientist Judge John Dunnion, an associate professor at University College Dublin, praised Curran’s project that foresaw the impact quantum computing will have on current cryptographic and encryption methods.

“qCrypt is a novel distributed data storage system that provides greater protection for user data than is currently available. It addresses a number of shortfalls of current data encryption systems; in particular, the algorithm used in the system has been demonstrated to be resistant to attacks by quantum computers in the future,” said Dunnion.

While it may be too early to predict whether technologies like qCrypt can protect existing encryption methods and data protection systems from quantum computers, Curran and the judges of the competition saw promising potential in the technology.

Word is spreading rapidly.

qCrypt has a place-holder website, Post-Quantum Cryptography for the Masses.

A YouTube video.

Shane’s Github repository (no qCrypt, yet)

Not to mention Shane’s website.

qCrypt has the potential to provide safety from government surveillance for everyone, everywhere.

Looking forward to this!

November 23, 2016

Comic Book Security

Filed under: Cryptography,Cybersecurity,Encryption,Security — Patrick Durusau @ 3:21 pm

The Amazing Mysteries of the Gutter: Drawing Inferences Between Panels in Comic Book Narratives by Mohit Iyyer, et al.

Abstract:

Visual narrative is often a combination of explicit information and judicious omissions, relying on the viewer to supply missing details. In comics, most movements in time and space are hidden in the “gutters” between panels. To follow the story, readers logically connect panels together by inferring unseen actions through a process called “closure”. While computers can now describe the content of natural images, in this paper we examine whether they can understand the closure-driven narratives conveyed by stylized artwork and dialogue in comic book panels. We collect a dataset, COMICS, that consists of over 1.2 million panels (120 GB) paired with automatic textbox transcriptions. An in-depth analysis of COMICS demonstrates that neither text nor image alone can tell a comic book story, so a computer must understand both modalities to keep up with the plot. We introduce three cloze-style tasks that ask models to predict narrative and character-centric aspects of a panel given n preceding panels as context. Various deep neural architectures underperform human baselines on these tasks, suggesting that COMICS contains fundamental challenges for both vision and language.

From the introduction:

[image: sample comic book panels from the COMICS dataset]

Comics are fragmented scenes forged into full-fledged stories by the imagination of their readers. A comics creator can condense anything from a centuries-long intergalactic war to an ordinary family dinner into a single panel. But it is what the creator hides from their pages that makes comics truly interesting, the unspoken conversations and unseen actions that lurk in the spaces (or gutters) between adjacent panels. For example, the dialogue in Figure 1 suggests that between the second and third panels, Gilda commands her snakes to chase after a frightened Michael in some sort of strange cult initiation. Through a process called closure [40], which involves (1) understanding individual panels and (2) making connective inferences across panels, readers form coherent storylines from seemingly disparate panels such as these. In this paper, we study whether computers can do the same by collecting a dataset of comic books (COMICS) and designing several tasks that require closure to solve.

(emphasis in original)

Comic book security: A method for defeating worldwide data slurping and automated analysis.

The authors find that human results easily exceed automated analysis, which raises the possibility of using a mixture of text and images to evade widespread data sweeps.

Security based on a lack of human eyes to review content is chancy but depending upon your security needs, it may be sufficient.

For example, a cartoon in a local newspaper that designates a mission target and time, only needs to be secure from the time of its publication until the mission has finished. That it is discovered days, weeks or even months later, doesn’t impact the operational security of the mission.

The data set of cartoons is available at: http://github.com/miyyer/comics.

Guaranteed, algorithmic security is great, but hiding in gaps of computational ability may be just as effective.

Enjoy!

November 4, 2016

The GCHQ Puzzle Book

Filed under: Books,Cryptography,Law — Patrick Durusau @ 9:20 am

The GCHQ Puzzle Book

The Amazon description:

If 3=T, 4=S, 5=P, 6=H, 7=H … what is 8?

What is the next letter in the sequence: M, V, E, M, J, S, U, ?

Which of the following words is the odd one out: CHAT, COMMENT, ELF, MANGER, PAIN, POUR?

GCHQ is a top-secret intelligence and security agency which recruits some of the very brightest minds. Over the years, their codebreakers have helped keep our country safe, from the Bletchley Park breakthroughs of WWII to the modern-day threat of cyberattack. So it comes as no surprise that, even in their time off, the staff at GCHQ love a good puzzle. Whether they’re recruiting new staff or challenging each other to the toughest Christmas quizzes and treasure hunts imaginable, puzzles are at the heart of what GCHQ does. Now they’re opening up their archives of decades’ worth of codes, puzzles and challenges for everyone to try.
(emphasis in original)

Hard to say if successful completion of the GCHQ Puzzle Book or hacking into GCHQ would be the better way to introduce yourself to the GCHQ.

Depends on which department within GCHQ captures your interest. 😉

Be aware that some pedestrian agencies and their personnel view intrusion into government computers as a crime, punishable as such.

More sophisticated agencies/personnel realize that “…in Jersey, anything is legal so long as you don’t get caught” and/or if you have something of sufficient value to trade.

The “rule of law,” and “letter of the law” stuff is for groundlings. Don’t be a groundling.

September 30, 2016

ORWL – Downside of a Physically Secure Computer

Filed under: Cryptography,Cybersecurity,Security — Patrick Durusau @ 1:57 pm

Meet ORWL. The first open source, physically secure computer

[image: the ORWL secure computer]

If someone has physical access to your computer with secure documents present, it’s game over! ORWL is designed to solve this as the first open source physically secure computer. ORWL (pronounced or-well) is the combination of the physical security from the banking industry (used in ATMs and Point of Sale terminals) and a modern Intel-based personal computer. We’ve designed a stylish glass case which contains the latest processor from Intel – exactly the same processor as you would find in the latest ultrabooks and we added WiFi and Bluetooth wireless connectivity for your accessories. It also has two USB Type C connectors for any accessories you prefer to connect via cables. We then use the built-in Intel 515 HD Video which can output up to 4K video with audio.

The physical security enhancements we’ve added start with a second authentication factor (wireless keyfob) which is processed before the main processor is even powered up. This ensures we are able to check the system’s software for authenticity and security before we start to run it. We then monitor how far your keyfob is from your PC – when you leave the room, your PC will be locked automatically, requiring the keyfob to unlock it again. We’ve also ensured that all information on the system drive is encrypted via the hardware on which it runs. The encryption key for this information is managed by the secure microcontroller which also handles the pre-boot authentication and other security features of the system. And finally, we protect everything with a high security enclosure (inside the glass) that prevents working around our security by physically accessing hardware components.

Any attempt to get physical access to the internals of your PC will delete the cryptographic key, rendering all your data permanently inaccessible!

The ORWL is a good illustration that good security policies can lead to unforeseen difficulties.

Or as the blog post brags:

Any attempt to get physical access to the internals of your PC will delete the cryptographic key, rendering all your data permanently inaccessible!

All I need do to deprive you of your data (think ransomware), is to physically tamper with your ORWL.

Of interest to journalists who need the ability to deprive others of data on very short notice.

Perhaps a fragile version for journalists and a more abuse-resistant version for the average user.

Enjoy!

August 26, 2016

Germany and France declare War on Encryption to Fight Terrorism

Filed under: Cryptography,Encryption,Government,Privacy — Patrick Durusau @ 4:11 pm

Germany and France declare War on Encryption to Fight Terrorism by Mohit Kumar.

From the post:

Yet another war on Encryption!

France and Germany are asking the European Union for new laws that would require mobile messaging services to decrypt secure communications on demand and make them available to law enforcement agencies.

French and German interior ministers this week said their governments should be able to access content on encrypted services in order to fight terrorism, the Wall Street Journal reported.
(emphasis in original)

On demand decryption? For what? Rot-13 encryption?
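For readers who haven’t met it, ROT-13 is a Caesar shift of 13 that is its own inverse; “decryption on demand” is a one-liner:

```python
import codecs

msg = "Meet at the safe house"
ct = codecs.encode(msg, "rot_13")    # shift every letter 13 places
print(ct)                            # Zrrg ng gur fnsr ubhfr
print(codecs.decode(ct, "rot_13"))   # Meet at the safe house
```

That is roughly the strength of cipher a decrypt-on-demand mandate assumes everyone is using.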

The Franco-German text transmitted to the European Commission.

The proposal wants to extend current practices of Germany and France with regard to ISPs but doesn’t provide any details about those practices.

In case you have influence with the budget process at the EU, consider pointing out there is no, repeat no evidence that any restriction on encryption will result in better police work combating terrorism.

But then, what government has ever pushed for evidence-based policies?

August 21, 2016

EasyCrypt Reference Manual

Filed under: Cryptography,Cybersecurity — Patrick Durusau @ 4:56 pm

EasyCrypt Reference Manual (PDF)

For your reading convenience, I have emended the hyperlinks in the introduction to point to online versions of the citations and not to the paper’s bibliography.

From the introduction:

EasyCrypt [BDG+14, BGHZ11] is a framework for interactively finding, constructing, and machine-checking security proofs of cryptographic constructions and protocols using the code-based sequence of games approach [BR04, BR06, Sho04]. In EasyCrypt, cryptographic games and algorithms are modeled as modules, which consist of procedures written in a simple user-extensible imperative language featuring while loops and random sampling operations. Adversaries are modeled by abstract modules—modules whose code is not known and can be quantified over. Modules may be parameterized by abstract modules.

EasyCrypt has four logics: a probabilistic, relational Hoare logic (pRHL), relating pairs of procedures; a probabilistic Hoare logic (pHL) allowing one to carry out proofs about the probability of a procedure’s execution resulting in a postcondition holding; an ordinary (possibilistic) Hoare logic (HL); and an ambient higher-order logic for proving general mathematical facts and connecting judgments in the other logics. Once lemmas are expressed, proofs are carried out using tactics, logical rules embodying general reasoning principles, and which transform the current lemma (or goal) into zero or more subgoals—sufficient conditions for the original lemma to hold. Simple ambient logic goals may be automatically proved using SMT solvers. Proofs may be structured as sequences of lemmas, and EasyCrypt’s theories may be used to group together related types, predicates, operators, modules, axioms and lemmas. Theory parameters that may be left abstract when proving its lemmas—types, operators and predicates—may be instantiated via a cloning process, allowing the development of generic proofs that can later be instantiated with concrete parameters.
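Schematically (my notation, simplified; consult the manual for EasyCrypt’s concrete syntax), the judgments of the three program logics have the following shapes:

```latex
\{P\}\; c\; \{Q\}                 % HL: if P holds before c, Q holds after
\{P\}\; c\; \{Q\}\; \diamond\; p  % pHL: probability of ending in Q, compared to bound p
                                  %      (\diamond is one of =, \le, \ge)
\{P\}\; c_1 \sim c_2\; \{Q\}      % pRHL: relates paired executions of two procedures
```

The ambient logic then ties such judgments together into ordinary mathematical proofs.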

Be aware the documentation carries this warning (1.6 About this Documentation):

This document is intended as a reference manual for the EasyCrypt tool, and not as a tutorial on how to build a cryptographic proof, or how to conduct interactive proofs. We provide some detailed examples in Chapter 7, but they may still seem obscure even with a good understanding of cryptographic theory. We recommend experimenting.

My first time seeing documentation advising “experimenting” to understand it. 😉

You?

Before you jump to Chapter 7, be aware that Chapter 4 (Structuring Specifications and Proofs), Chapter 5 (EasyCrypt Library), Chapter 6 (Advanced Features and Usage), and Chapter 7 (Examples) have yet to be written.

You have time to work through the first three chapters and to experiment with EasyCrypt before being called upon to evaluate Chapter 7.

Enjoy!

August 1, 2016

Law Enforcement Shouldn’t Be Omniscient

Filed under: Cryptography,Cybersecurity,Encryption — Patrick Durusau @ 3:37 pm

Andy Greenberg’s introduction to the genius behind Signal, Meet Moxie Marlinspike, The Anarchist Bringing Encryption To All Of Us, is a great read.

Just a sample to get you going:


For any cypherpunk with an FBI file, it’s already an interesting morning. At the very moment the Cryptographers’ Panel takes the stage, Apple and the FBI are at the height of a six-week battle, arguing in front of the House Judiciary Commit­tee over the FBI’s demand that Apple help it access an encrypted ­iPhone 5c owned by San Bernardino killer Syed Rizwan Farook. Before that hearing ends, Apple’s general counsel will argue that doing so would set a dangerous legal precedent, inviting foreign govern­ments to make similar demands, and that the crypto-cracking software could be co-opted by criminals or spies.

The standoff quickly becomes the topic of the RSA panel, and Marlinspike waits politely for his turn to speak. Then he makes a far simpler and more radical argument than any advanced by Apple: Perhaps law enforcement shouldn’t be omniscient. “They already have a tremendous amount of information,” he tells the packed ballroom. He points out that the FBI had accessed Farook’s call logs as well as an older phone backup. “What the FBI seems to be saying is that we need this because we might be missing something. Obliquely, they’re asking us to take steps toward a world where that isn’t possible. And I don’t know if that’s the world we want to live in.”

Marlinspike follows this remark with a statement that practically no one else in the privacy community is willing to make in public: that yes, people will use encryption to do illegal things. And that may just be the whole point. “I actually think that law enforcement should be difficult,” Marlinspike says, looking calmly out at the crowd. “And I think it should actually be possible to break the law.”

I don’t find Marlinspike’s:

I think it should actually be possible to break the law.

surprising or shocking.

Nearly everyone in law enforcement and government agrees with Marlinspike; it all depends on whose laws and for what purpose.

Murder is against the law in North Korea but several governments would applaud anyone who used encryption to arrange slipping a knife between the ribs of Kim Jong-un.

Those same governments and their citizens use encryption to carry on industrial espionage, spying on military research, trade or government negotiations, etc.

I’m happy with non-omniscient law enforcement.

How about you?

July 28, 2016

Entropy Explained, With Sheep

Filed under: Cryptography,Encryption,Information Theory,Shannon — Patrick Durusau @ 2:34 pm

Entropy Explained, With Sheep by Aatish Bhatia.

Entropy is relevant to information theory, encryption, Shannon, but I mention it here because of the cleverness of the explanation.

Aatish sets a very high bar for taking a difficult concept and creating a compelling explanation that does not involve hand-waving and/or leaps of faith on the part of the reader.
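Bhatia’s sheep explain thermodynamic entropy, but the Shannon cousin relevant to encryption is compact enough to sketch in a few lines of Python (bits per symbol, estimated from observed frequencies):

```python
from collections import Counter
from math import log2

def shannon_entropy(s: str) -> float:
    """Estimated bits per symbol, from the symbol frequencies in s."""
    n = len(s)
    return -sum((c / n) * log2(c / n) for c in Counter(s).values())

# A string of one repeated symbol carries no surprise: entropy is 0 bits.
print(shannon_entropy("abababab"))   # 1.0: one fair coin flip per symbol
```

The same formula, applied to password character distributions, is why dictionary words are so much weaker than their length suggests.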

Highly recommended as a model for explanation!

Enjoy!

June 29, 2016

How Secure Are Emoji Ciphers?

Filed under: Cryptography,Encryption,Privacy — Patrick Durusau @ 2:21 pm

You Can Now Turn Messages Into Secret Code Using Emoji by Joon Ian Wong.

From the post:

Emoji are developing into their own language, albeit a sometimes impenetrable one. But they are about to become truly impenetrable. A new app from the Mozilla Foundation lets you use them for encryption.

The free web app, called Codemoji, lets users write a message in plain-text, then select an emoji “key” to mask the letters in that message with a series of emoji. To decrypt a message, the correct key must be entered in the app, turning emoji back into the alphabet.

Caesar ciphers (think letter substitution) are said to be “easy” to solve with modern computers.

Which is true, but the security of an Emoji cipher depends on how long the information must remain secret.

For example, you discover a smart phone at 11:00 AM (your local) and it has the following message:

Detonate at 12:15 P.M. (your local)

but that message is written in Emoji using the angry face as the key:

[image: the message encoded as emoji]

That Emoji coded message is as secure as a message encoded with the best the NSA can provide.

Why?

If you knew what the message said, detonation time (assuming it is today) is only 75 minutes away. Explosions are public events; knowing in hindsight that you had captured the timing message but broke the code too late isn’t all that useful.

The “value” of that message being kept secret expires at the same time as the explosion.
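To see why such a cipher is weak yet occasionally sufficient, here is a toy emoji substitution cipher in Python (an illustration only, not Codemoji’s actual algorithm; here the key emoji merely seeds a shuffled substitution table):

```python
import random

ALPHABET = "abcdefghijklmnopqrstuvwxyz"
EMOJI = [chr(cp) for cp in range(0x1F600, 0x1F61A)]  # 26 emoji faces

def table(key_emoji: str) -> dict:
    """Derive a letter-to-emoji substitution table from the key emoji."""
    rng = random.Random(key_emoji)   # same key => same table
    shuffled = EMOJI[:]
    rng.shuffle(shuffled)
    return dict(zip(ALPHABET, shuffled))

def encrypt(msg: str, key_emoji: str) -> str:
    t = table(key_emoji)
    return "".join(t.get(ch, ch) for ch in msg.lower())

def decrypt(ct: str, key_emoji: str) -> str:
    inv = {v: k for k, v in table(key_emoji).items()}
    return "".join(inv.get(ch, ch) for ch in ct)

msg = "detonate at 12:15 pm"
angry = "\U0001F620"  # the angry face key
assert decrypt(encrypt(msg, angry), angry) == msg
```

Like any monoalphabetic substitution it falls to frequency analysis, but as argued above, “quickly” can still be too late for a 75-minute secret.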

In addition to learning more about encryption, use Codemoji as a tool for thinking about your encryption requirements.

Some (conflicting) requirements: Ease of use, resistance to attack (how to keep the secret), volume of use, hardware/software requirements, etc.

Everyone would like a system that is brain-dead easy to use, impervious even to alien-origin quantum computers, scales linearly, and runs on an Apple Watch.

Not even the NSA is rumored to have such a system. Become informed so you can make informed compromises.

April 11, 2016

Knights of Ignorance (Burr and Feinstein) Hold Tourney With No Opponents

Filed under: Cryptography,Government,Journalism,News,Privacy,Reporting — Patrick Durusau @ 8:27 pm

Burr And Feinstein Plan One Sided Briefing For Law Enforcement To Bitch About ‘Going Dark’ by Mike Masnick.

From the post:

With the world mocking the sheer ignorance of their anti-encryption bill, Senators Richard Burr and Dianne Feinstein are doubling down by planning a staff “briefing” on the issue of “going dark” with a panel that is made up entirely of law enforcement folks. As far as we can tell, it hasn’t been announced publicly, but an emailed announcement was forwarded to us, in which they announce the “briefing” (notably not a “hearing“) on “barriers to law enforcement’s ability to lawfully access the electronic evidence they need to identify suspects, solve crimes, exonerate the innocent and protect communities from further crime.” The idea here is to convince others in Congress to support their ridiculous bill by gathering a bunch of staffers and scaring them with bogeyman stories of “encryption caused a crime wave!” As such, it’s no surprise that the panelists aren’t just weighted heavily in one direction, they’re practically flipping the boat. Everyone on the panel comes from the same perspective, and will lay out the argument for “encryption bad!”

An upside to the approaching farce is it identifies people who possess “facts” to support the “encryption bad” position.

Given fair warning of their identities, what can you say about these “witnesses?”

Do you think some enterprising reporter will press them for detailed facts and not illusory hand waving? (I realize Senators are never pressed, not really, for answers. Reporters want the next interview. But these witnesses aren’t Senators.)

For example, Hillar C. Moore, III, has campaigned for a misdemeanor jail to incarcerate traffic offenders in order to lower violent crime.

He said Wednesday that he believes the jail is an urgent public safety tool that could lower violent crime in the city. “This summer, we didn’t have the misdemeanor jail, and while it’s not responsible for every murder, this is responsible for the crime rate being slightly higher,” Moore said. “Baton Rouge could have done better than other cities, but we missed out on that. It’s time for everyone to get on board and stop looking the other way.”

Moore’s office asked the East Baton Rouge Parish Metro Council in recent weeks for authorization to use dedicated money to open a misdemeanor jail on a temporary basis, two weeks at a time for the next several months, to crack down on repeat offenders who refuse to show up in court.

The request was rejected by the council, after opponents accused law enforcement officials of using the jail to target nonviolent, low-income misdemeanor offenders as a way to shake them down for money for the courts. More than 60 percent of misdemeanor warrants are traffic-related offenses, and critics angrily took issue with a proposal that potentially could result in jailing traffic violators.

Evidence and logic aren’t Hillar’s strong points.

That’s one fact about one of the prospective nut-job witnesses.

What’s your contribution to discrediting this circus of fools?

April 9, 2016

Visual Guide to Senate Crypto Bill [First Draft In Crayon?]

Filed under: Cryptography,Cybersecurity,Government — Patrick Durusau @ 7:13 pm

The Senate crypto bill is comically bad: A visual guide by Isaac Potoczny-Jones.

From the post:

If you’re curious about the draft text for the senate crypto bill please, read the text for yourself or a summary on Wired. If you have ever used a security product, you’ll probably quickly realize that it would make most (if not all) encryption illegal.

For example, a product like an encrypted hard drive is covered since seagate provides a process for storing data. Upon a court order, seagate must provide the data on that drive by making it intelligible, either by never encrypting it or if it is encrypted, they must decrypt it.

The following graphic is provided to illustrate the paths by which nearly all secure storage or communication needs to have a back door.

You have to see the graphic to truly appreciate how lame the draft Senate crypto bill is in fact.

These are the same people who are responsible for the “riddle, wrapped in a mystery, inside an enigma” that is the Internal Revenue Code (IRC).

I can’t say it is a fact, but I suspect the first draft of the Senate crypto bill was in crayon.

What do you think?

March 14, 2016

You Can Help Increase Frustration at the FBI, Yes! You!

Filed under: Cryptography,Cybersecurity,FBI,Government,Security — Patrick Durusau @ 1:17 pm

Skype co-founder launches ultra-private messaging, with video by Eric Auchard.

From the post:

A group of former Skype technologists, backed by the co-founder of the messaging platform, has introduced a new version of its own messaging service that promises end-to-end encryption for all conversations, including by video.

Wire, a 50-person start-up mostly made up of engineers, is stepping into a global political debate over encryption that pits privacy against security advocates, epitomized by the standoff between the U.S. government and Apple.

The company said on Thursday it was adding video calling to a package of private communications services that go beyond existing messaging providers.

See the post and/or check out new service: https://wire.com/privacy/

From the homepage of Wire:

Our personal and professional data is at the center of a new economy. The information we share on social networks, via email, and messaging services is being used to build profiles. These profiles are in turn used to sell us products and services through targeted advertising and suggestion. The data collected is vast, detailed, and often very personal. Vast resources are being spent to refine the profiles, all without transparency, policy or oversight.

Our personal and professional online communications should not be part of this economy. In the physical world we talk with each other directly. We can lower our voices or close a door to share private thoughts. In the online world we should be able to communicate directly without passing our private communications through these corporate data mines.

Wire is different.

You will also find this FBI heartburn product comparison matrix, suitable for framing, to let everyone know you are serious about security:

[image: Wire’s product comparison matrix]

There’s a web version of the service so I don’t have to buy a phone just to use it and/or annoy the FBI.

I’m signed up.

What about you?

FAQ: Why the emphasis on annoying the FBI?

Good question!

During my lifetime the FBI has illegally spied on civil rights leaders and organizations, the same for anti-war movements and virtually every other departure from the “norm.”

The more ordinary folks annoy the FBI, the less time and resources it has to conduct illegal operations against other citizens.

It won’t stop the FBI any more than being covered with 10,000 fleas would prevent you from driving. It would make driving, however, a very unpleasant experience.

Enlist to Fight in Crypto Wars 2.0

Filed under: Cryptography,Cybersecurity,Government,Security — Patrick Durusau @ 8:05 am

Nate Cardozo writes in The Next Front in the New Crypto Wars: WhatsApp.

From the post:

In Saturday’s edition of the New York Times, Matt Apuzzo reports that the Department of Justice is locked in a “prolonged standoff” with WhatsApp. The government is frustrated by its lack of real-time access to messages protected by the company’s end-to-end encryption. The story may represent a disturbing preview of the next front in the FBI’s war against encryption.

I’m sure the government is “frustrated” by its lack of access to messages, but that frustration long predates WhatsApp. Anyone using PGP with email has been able to achieve end-to-end encryption for years.

The real difference: WhatsApp makes encryption convenient for users.

If you want to fight on the side of privacy, make encryption for your app as secure and convenient as possible.

Inconvenient encryption will not be used, and the result is clear-text streams and speech.
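The availability point bears emphasis: strong encryption is already free to anyone who wants it. Even the textbook one-time pad, unbreakable when the key is truly random, used once, and as long as the message, fits in a few lines of standard-library Python (a sketch for illustration only, not a replacement for PGP or a vetted library):

```python
import secrets

def otp_encrypt(plaintext: bytes):
    # The key must be truly random, used once, and as long as the message.
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def otp_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    # XOR is its own inverse: applying the key again recovers the plaintext.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

key, ct = otp_encrypt(b"meet at noon")
assert otp_decrypt(key, ct) == b"meet at noon"
```

The hard part, as the post says, is not the mathematics but making key handling this painless for ordinary users.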

You can increase the level of frustration in governments around the world by engineering convenient and strong encryption.

Opportunities to afflict governments around the globe don’t come up very often.

Step up and take this one.

March 12, 2016

Obama’s Magic Pony Transcript

Filed under: Cryptography,Cybersecurity,Security — Patrick Durusau @ 9:19 pm

If you are going to write about President Obama’s magic pony speech on encryption, start with this transcript, courtesy of Philip Elmer-DeWitt: Here’s What Obama Said at SXSW About Apple vs. FBI.

I think your options are to believe that President Obama is so poorly informed by his technical advisers that he doesn’t understand the encryption issue, or that he understands the issue and is simply lying.

I don’t see a third option.

Do you?

Do You Believe in Magic Ponies? (Apply at 1600 Pennsylvania Ave NW, Washington, DC 20500)

Filed under: Cryptography,Cybersecurity,Government — Patrick Durusau @ 2:31 pm

Obama: cryptographers who don’t believe in magic ponies are “fetishists,” “absolutists” by Cory Doctorow.

President Obama is looking for a few good men and women who think it is possible to have strong cryptography that becomes weak upon demand.

Here’s part of what Cory has to say about the matter:


Obama conflated cryptographers’ insistence that his plan was technically impossible with the position that government should never be able to serve court orders on its citizens. This is math denialism, the alternative medicine of information security.

He focused his argument on the desirability of having crypto that worked in this impossible way, another cheap rhetorical trick. Wanting it badly isn’t enough.

As a former constitutional law professor, President Obama should have pointed to historical precedent for believing impossible things:

Alice laughed. “There’s no use trying,” she said: “one can’t believe impossible things.”

“I daresay you haven’t had much practice,” said the Queen. “When I was your age, I always did it for half-an-hour a day. Why, sometimes I’ve believed as many as six impossible things before breakfast.” (Through the Looking Glass, Lewis Carroll)

If you don’t already believe in magic ponies, start practicing today!

Your ability to believe impossible things may be the key to your next position in big data, national security and a host of other positions.

PS: “Absolutists” are easy to spot. Among other things, they believe math operators give everyone the same results; gravity exists in all known frames of reference; the Earth is an oblate spheroid, i.e., not flat, etc. Feel free to contribute other beliefs that identify “absolutists” in your comments.

March 2, 2016

Graph Encryption: Going Beyond Encrypted Keyword Search [Subject Identity Based Encryption]

Filed under: Cryptography,Cybersecurity,Graphs,Subject Identity,Topic Maps — Patrick Durusau @ 4:49 pm

Graph Encryption: Going Beyond Encrypted Keyword Search by Xianrui Meng.

From the post:

Encrypted search has attracted a lot of attention from practitioners and researchers in academia and industry. In previous posts, Seny already described different ways one can search on encrypted data. Here, I would like to discuss search on encrypted graph databases which are gaining a lot of popularity.

1. Graph Databases and Graph Privacy

As today’s data is getting bigger and bigger, traditional relational database management systems (RDBMS) cannot scale to the massive amounts of data generated by end users and organizations. In addition, RDBMSs cannot effectively capture certain data relationships; for example in object-oriented data structures which are used in many applications. Today, NoSQL (Not Only SQL) has emerged as a good alternative to RDBMSs. One of the many advantages of NoSQL systems is that they are capable of storing, processing, and managing large volumes of structured, semi-structured, and even unstructured data. NoSQL databases (e.g., document stores, wide-column stores, key-value (tuple) store, object databases, and graph databases) can provide the scale and availability needed in cloud environments.

In an Internet-connected world, graph databases have become an increasingly significant data model among NoSQL technologies. Social networks (e.g., Facebook, Twitter, Snapchat), protein networks, electrical grids, the Web, XML documents, and networked systems can all be modeled as graphs. One nice thing about graph databases is that they store the relations between entities (objects) in addition to the entities themselves and their properties. This allows the search engine to navigate both the data and their relationships extremely efficiently. Graph databases rely on the node-link-node relationship, where a node can be a profile or an object and the edge can be any relation defined by the application. Usually, we are interested in the structural characteristics of such graph databases.

What do we mean by the confidentiality of a graph? And how do we protect it? The problem has been studied by both the security and database communities. For example, in the database and data mining community, many solutions have been proposed based on graph anonymization. The core idea here is to anonymize the nodes and edges in the graph so that re-identification is hard. Although this approach may be efficient, from a security point of view it is hard to tell what is achieved. Also, by leveraging auxiliary information, researchers have studied how to attack this kind of approach. On the other hand, cryptographers have some really compelling and provably-secure tools such as ORAM and FHE (mentioned in Seny’s previous posts) that can protect all the information in a graph database. The problem, however, is their performance, which is crucial for databases. In today’s world, efficiency is more than running in polynomial time; we need solutions that run and scale to massive volumes of data. Many real world graph datasets, such as biological networks and social networks, have millions of nodes; some even have billions of nodes and edges. Therefore, besides security, scalability is one of the main aspects we have to consider.

2. Graph Encryption

Previous work in encrypted search has focused on how to search encrypted documents, e.g., doing keyword search, conjunctive queries, etc. Graph encryption, on the other hand, focuses on performing graph queries on encrypted graphs rather than keyword search on encrypted documents. In some cases, this makes the problem harder, since some graph queries can be extremely complex. Another technical challenge is that not only the privacy of nodes and edges needs to be protected, but also the structure of the graph, which can lead to many interesting research directions.

Graph encryption was introduced by Melissa Chase and Seny in [CK10]. That paper shows how to encrypt graphs so that certain graph queries (e.g., neighborhood, adjacency and focused subgraphs) can be performed (though the paper is more general as it describes structured encryption). Seny and I, together with Kobbi Nissim and George Kollios, followed this up with a paper last year [MKNK15] that showed how to handle more complex graph queries.

Apologies for the long quote but I thought this topic might be new to some readers. Xianrui goes on to describe a solution for efficient queries over encrypted graphs.

Chase and Kamara remark in Structured Encryption and Controlled Disclosure, CK10:


To address this problem we introduce the notion of structured encryption. A structured encryption scheme encrypts structured data in such a way that it can be queried through the use of a query-specific token that can only be generated with knowledge of the secret key. In addition, the query process reveals no useful information about either the query or the data. An important consideration in this context is the efficiency of the query operation on the server side. In fact, in the context of cloud storage, where one often works with massive datasets, even linear time operations can be infeasible. (emphasis in original)

With just a little nudging, their:

A structured encryption scheme encrypts structured data in such a way that it can be queried through the use of a query-specific token that can only be generated with knowledge of the secret key.

could be re-stated as:

A subject identity encryption scheme leaves out merging data in such a way that the resulting topic map can only be queried with knowledge of the subject identity merging key.

You may have topics that represent diagnoses such as cancer, AIDS, sexual contacts, but if none of those can be associated with individuals who are also topics in the map, there is no more disclosure than census results for a metropolitan area and a list of the citizens therein.

That is, you are missing the critical merging data that would link up (associate) any diagnosis with a given individual.

Multi-property subject identities would make the problem even harder, to say nothing of conferring properties on the basis of supplied properties as part of the merging process.

One major benefit of a subject identity based approach is that without the merging key, any data set, however sensitive the information, is just a data set, until you have the basis for solving its subject identity riddle.
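To make the parallel concrete, here is a toy sketch of the query-token idea, with HMAC standing in for the pseudorandom function (this is not the CK10 construction, and it deliberately ignores the structure leakage a real scheme must account for):

```python
import hmac
import hashlib

def token(key: bytes, label: str) -> bytes:
    # Query-specific token: only a holder of the secret key can derive it.
    return hmac.new(key, label.encode(), hashlib.sha256).digest()

def encrypt_graph(key: bytes, adjacency: dict) -> dict:
    # File neighbor lists under PRF-derived tokens instead of plaintext labels.
    return {token(key, node): [token(key, n) for n in nbrs]
            for node, nbrs in adjacency.items()}

graph = {"alice": ["bob"], "bob": ["alice", "carol"], "carol": ["bob"]}
key = b"merging key material"
enc = encrypt_graph(key, graph)

# A neighborhood query succeeds only with a token derived from the key;
# without the key, the map is just opaque digests.
assert enc[token(key, "bob")] == [token(key, "alice"), token(key, "carol")]
```

Swap “node” for “subject” and “neighborhood query” for “merging” and you have the subject identity version: without the merging key, the associations cannot be reconstructed.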

PS: With the usual caveats of not using social security numbers, birth dates and the like as your subject identity properties. At least not in the map proper. I can think of several ways to generate keys for merging that would be resistant to even brute force attacks.

Ping me if you are interested in pursuing that on a data set.

February 28, 2016

Media Makes Terrorists Good At Encryption [Projecting Ignorance]

Filed under: Cryptography,Cybersecurity,Government,NSA,Security — Patrick Durusau @ 9:15 pm

CIA Director: It’s the Media’s Fault That Terrorists Are So Good at Encryption by Kate Knibbs.

From the post:


Ledgett poked his finger at the media even more explicitly. “We track when our foreign intelligence targets talk about the security of their communication,” he said. “And we see a growing number of them, because of what’s in the press about the value of encryption, moving towards that.”

The implication of these statements—that media reports are somehow optimized to help terrorists be better at evading law enforcement—is a dangerous one. Yes, of course terrorists read. But Brennan and Ledgett’s statements situate media support for strong encryption on the side of terrorism. Neither intelligence leader recognized how members of their own communities might also benefit from media reports about encryption. In fact, neither Brennan or Ledgett bothered to acknowledge that their own agencies rely on encryption as a crucial security measure.

Neither Brennan or Ledgett specified which reports were believed to be frequently dog-eared on ISIS squatters, but that doesn’t matter. Extremists are interested in privacy tools, and media reports on privacy tools. Saying that they read about which tools to use is just saying that any group with goals attempts to find information that will help achieve those goals. Implying that media reports are aiding and abetting the enemy—not to mention the notion that reports highlighting privacy protections are somehow devious—is just unfair and chilling.

Kate’s right: blaming the media for extremists using encryption is far-fetched, not to mention “…just unfair and chilling.”

But what we are witnessing is the speakers’ projection (in Jung’s sense) of their own ignorance onto others.

The witnesses making these statements have as much expertise at encryption as I do at break dancing. Which is to say none at all.

They are sock puppets who “learn” about encryption or at least buzz phrases about encryption from public media.

Or, in the case of the FBI, from an FBI training manual that shows images of hard-wired connections in a phone junction box.

Comey now wonders why encryption is allowed to defeat such measures. You have to wonder if Comey has noticed that cellphones are not followed by long phone lines.

Other than summarizing their nonsensical statements, the news media in general should not interview, quote or report any statement by these witnesses without a disclaimer that such witnesses are by definition incompetent on the question at hand.

Members of Congress can continue to bill and coo with those of skills equal to their own, but the public should be forewarned of their ignorance.

February 23, 2016

Anti-Encryption National Commission News 24 February 2016

Filed under: Cryptography,Cybersecurity,Government,Security — Patrick Durusau @ 3:07 pm

Shedding Light on ‘Going Dark’: Practical Approaches to the Encryption Challenge.

WHEN: Wednesday, February 24, 2016 12:00 p.m. to 1:30 p.m. ET
WHERE: Bipartisan Policy Center, 1225 Eye Street NW, Suite 1000, Washington, DC, 20005

REGISTER NOW

From the post:

The spate of terrorist attacks last year, especially those in Paris and San Bernardino, raised the specter of terrorists using secure digital communications to evade intelligence and law enforcement agencies and, in the words of FBI Director James Comey, “go dark.” The same technologies that companies use to keep Americans safe when they shop online and communicate with their friends and family on the Internet are the same technologies that terrorists and criminals exploit to disguise their illicit activity.

In response to this challenge, House Homeland Security Committee Chairman Michael McCaul (R-TX) and Sen. Mark Warner (D-VA), a member of the Senate Intelligence Committee, have proposed a national commission on security and technology challenges in the digital age. The commission would bring together experts who understand the complexity and the stakes to develop viable recommendations on how to balance competing digital security priorities.

Please join the Bipartisan Policy Center on February 24 for a conversation with the two lawmakers as they roll out their legislation creating the McCaul-Warner Digital Security Commission followed by a panel discussion highlighting the need to take action on this critical issue.

Ironically, I won’t be able to watch the live streaming of this event because:

The video you are trying to watch is using the HTTP Live Streaming protocol which is only support in iOS devices.

Any discussion of privacy or First Amendment rights must begin without the presumption that any balancing or trade-off is necessary.

While it is true that some trade-offs have been made in the past, the question that should begin the anti-encryption discussion is whether terrorism is any more than a fictional threat.

Since 9/11, it has been 5278 days without a terrorist being sighted at a U.S. airport.

One explanation for those numbers is the number of terrorists in the United States is extremely small.

The FBI routinely takes advantage of people suffering from mental illness to create terrorist “threats,” which the FBI then eliminates. So those arrests should be discounted as evidence of “terrorists” in our midst.

Before any discussion of “balancing” takes place, challenge the need for balancing at all.

PS: Find someone with an unhacked iOS device on which to watch this presentation.

I first saw this in a post by Cory Doctorow, U.S. lawmakers expected to introduce major encryption bill.

February 3, 2016

They are deadly serious about crypto backdoors [And of the CIA and Chinese Underwear]

Filed under: Cryptography,Cybersecurity,Government,Security — Patrick Durusau @ 3:25 pm

They are deadly serious about crypto backdoors by Robert Graham.

From the post:

Julian Sanchez (@normative) has an article questioning whether the FBI is serious about pushing crypto backdoors, or whether this is all a ploy pressuring companies like Apple to give them access. I think they are serious — deadly serious.

The reason they are only half-heartedly pushing backdoors at the moment is that they believe we, the opposition, aren’t serious about the issue. After all, the 4rth Amendment says that a “warrant of probable cause” gives law enforcement unlimited power to invade our privacy. Since the constitution is on their side, only irrelevant hippies could ever disagree. There is no serious opposition to the proposition. It’ll all work itself out in the FBI’s favor eventually. Among the fascist class of politicians, like the Dianne Feinsteins and Lindsay Grahams of the world, belief in this principle is rock solid. They have absolutely no doubt.

But the opposition is deadly serious. By “deadly” I mean this is an issue we are willing to take up arms over. If congress were to pass a law outlawing strong crypto, I’d move to a non-extradition country, declare the revolution, and start working to bring down the government. You think the “Anonymous” hackers were bad, but you’ve seen nothing compared to what the tech community would do if encryption were outlawed.

On most policy questions, there are two sides to the debate, where reasonable people disagree. Crypto backdoors isn’t that type of policy question. It’s equivalent to techies what trying to ban guns would be to the NRA.

What he says.

Crypto backdoors are a choice between a policy that benefits government at the expense of everyone (crypto backdoors) versus a policy that benefits everyone at the expense of the government (no crypto backdoors). It’s really that simple.

When I say crypto backdoors benefit the government, I mean that quite literally. Collecting data, via crypto backdoors and otherwise, enables government functionaries to pretend to be engaged in meaningful responses to serious issues.

Collecting and shoveling data from desk to desk is about as useless an activity as can be imagined.

Basis for that claim? Glad you asked!

If you haven’t read: Chinese Underwear and Presidential Briefs: What the CIA Told JFK and LBJ About Mao by Steve Usdin, do so.

Steve covers the development of the “presidential brief” and its long failure to provide useful information about China and Mao in particular. The CIA long opposed declassification of historical presidential briefs based on the need to protect “sources and methods.”

The presidential briefs for the Kennedy and Johnson administrations have been released and here is what Steve concludes:

In any case, at least when it comes to Mao and China, the PDBs released to date suggest that the CIA may have fought hard to keep these documents secret not to protect “sources and methods,” but rather to conceal its inability to recruit sources and failure to provide sophisticated analyses.

Past habits of the intelligence community explain rather well why they have no, repeat no, examples of how strong encryption has interfered with national security. There are none.

The paranoia about “crypto backdoors” is another way to engage in “known to be useless” action. It puts butts in seats and inflates agency budgets.


Unlike Robert, should Congress ban strong cryptography, I won’t be moving to a non-extradition country. Some of us need to be here when local police come to their senses and defect.

January 3, 2016

When back doors backfire [Uncorrected Tweet From Economist Hits 1.1K Retweets]

Filed under: Cryptography,Encryption,Ethics,Journalism,News,Reporting — Patrick Durusau @ 8:41 pm

When back doors backfire

From the post:

[image: The Economist illustration on encryption back doors]

Push back against back doors

Calls for the mandatory inclusion of back doors should therefore be resisted. Their potential use by criminals weakens overall internet security, on which billions of people rely for banking and payments. Their existence also undermines confidence in technology companies and makes it hard for Western governments to criticise authoritarian regimes for interfering with the internet. And their imposition would be futile in any case: high-powered encryption software, with no back doors, is available free online to anyone who wants it.

Rather than weakening everyone’s encryption by exploiting back doors, spies should use other means. The attacks in Paris in November succeeded not because terrorists used computer wizardry, but because information about their activities was not shared. When necessary, the NSA and other agencies can usually worm their way into suspects’ computers or phones. That is harder and slower than using a universal back door—but it is safer for everyone else.

By my count on two (2) tweets from The Economist, they are running at 50% correspondence between their tweets and actual content.

You may remember my checking their tweet about immigrants yesterday, that got 304 retweets (and was wrong) in Fail at The Economist Gets 304 Retweets!.

Today I saw the When back doors backfire tweet and I followed the link to the post to see if it corresponded to the tweet.

Has anyone else been checking on tweet/story correspondence at The Economist (zine)? The twitter account is: @TheEconomist.

I ask because no correcting tweet has appeared in @TheEconomist tweet feed. I know because I just looked at all of its tweets in chronological order.

Here is the uncorrected tweet:

[image: The Economist tweet on immigrants]

As of today, the uncorrected tweet on immigrants has 1.1K retweets and 707 likes.

From the Economist article on immigrants:

Refugee resettlement is the least likely route for potential terrorists, says Kathleen Newland at the Migration Policy Institute, a think-tank. Of the 745,000 refugees resettled since September 11th, only two Iraqis in Kentucky have been arrested on terrorist charges, for aiding al-Qaeda in Iraq.

Do retweets and likes matter more than factual accuracy, even as reported in the tweeted article?

Is this a journalism ethics question?

What’s the standard journalism position on retweet-bait tweets?

December 8, 2015

Is It End-To-End Encrypted?

Filed under: Cryptography,Cybersecurity,Encryption — Patrick Durusau @ 5:44 pm

ZeroDB has kicked off the new question for all networked software:

Is It End-To-End Encrypted?, with a resounding YES!

From ZeroDB, an end-to-end encrypted database, is open source!:

We’re excited to release ZeroDB, an end-to-end encrypted database, to the world. ZeroDB makes it easy to develop applications with strong security and privacy guarantees by enabling applications to query encrypted data.

zerodb repo: https://github.com/zero-db/zerodb/
zerodb-server repo: https://github.com/zero-db/zerodb-server/
Documentation: http://docs.zerodb.io/

Now that it’s open source, we want your help to make it better. Try it, build awesome things with it, break it. Then tell us about it.

Today, we’re releasing a Python implementation. A JavaScript client will be following soon.

Questions? Ask us on Slack or Google Groups.

The post was authored by MacLane & Michael and you can find more information at http://zerodb.io.

PS: The question Is It End-To-End Encrypted? is a yes or no question. If anyone gives you an answer other than an unqualified yes, it’s time to move along to the next vendor. Sometimes, under some circumstances, maybe, added feature, can be, etc., are all unacceptable answers.

Just like the question: Does it have any backdoors at all? What purpose a backdoor serves isn’t relevant; learning that one exists is also the time to move to another vendor.

The answers to both of those questions should be captured in contractual language with stipulated liability in the event of breach and minimal stipulated damages.

I first saw this in Four short links: 8 December 2015 by Nat Torkington.

October 30, 2015

Apple Open Sources Cryptographic Libraries

Filed under: Cryptography,Cybersecurity,Security — Patrick Durusau @ 3:51 pm

Cryptographic Libraries

From the webpage:

The same libraries that secure iOS and OS X are available to third‑party developers to help them build advanced security features.

If you are requesting or implementing new features for a product, make cryptography a top priority.

Why?

The more strong cryptography that is embedded in software as legacy code by the time the feds decide on a position on cryptography, the better.

Or put another way, the more secure your data, the harder for legislation to force you to make it less secure.

Word to the wise?

I first saw this in a tweet by Matthew J. Weaver.

October 13, 2015

Researchers say SHA-1 will soon be broken… [Woe for OPM’s Caesar Cipher]

Filed under: Cryptography,Cybersecurity,Security — Patrick Durusau @ 2:40 pm

Researchers say SHA-1 will soon be broken, urge migration to SHA-2 by Teri Robinson.

In as little as three short months, the SHA-1 internet security standard used for digital signatures and set to be phased out by January 2017, could be broken by motivated hackers, a team of international researchers found, prompting security specialists to call for a ramping up of the migration to SHA-2.

“We just successfully broke the full inner layer of SHA-1,” Marc Stevens of Centrum Wiskunde & Informatica in the Netherlands, one of the cryptanalysts that tested the standard, said in a release. Stevens noted that the cost of exploiting SHA-1 has dropped enough to make it affordable to every day hackers. The researchers explained that in 2012 computer security and privacy specialist Bruce Schneier predicted that the cost of a SHA-1 attack would drop to $700,000 in 2015 and would decrease to an affordable $173,000 or so in 2018.

But the prices fell–and the opportunity rose–more quickly than predicted. “We now think that the state-of-the-art attack on full SHA-1 as described in 2013 may cost around 100,000 dollar renting graphics cards in the cloud,” said Stevens.

The silver lining in this dark cloud is that “every day hackers” can afford to spend “around $100,000 renting graphics cards in the cloud” to break SHA-1.

I had no idea that “every day hackers” had that sort of cash flow.

Certainly something that should be mentioned at the next career day at local high schools and when recruiting for college CS programs. 😉

Depending on your interests, the even brighter silver lining will be the continued use of, and even upgrades to, SHA-1, such as at the OPM (Office of Personnel Management), long after the graphics card rental price has broken into the three-digit range.
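For developers, the migration the researchers urge is a one-line change with Python’s `hashlib` (illustrative; the real work is in updating certificates and protocols that still pin SHA-1):

```python
import hashlib

data = b"document to fingerprint"

# SHA-1: 160-bit digest; collision attacks now within hobbyist budgets.
legacy = hashlib.sha1(data).hexdigest()

# SHA-256 (SHA-2 family): the recommended replacement.
current = hashlib.sha256(data).hexdigest()

assert len(legacy) == 40    # 160 bits as hex
assert len(current) == 64   # 256 bits as hex
```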

October 10, 2015

Information theory and Coding

Filed under: Cryptography,Encryption,Information Theory — Patrick Durusau @ 5:57 am

Information theory and Coding by Mathematicalmonk.

From the introduction video:

Overview of central topics in Information theory and Coding.

Compression (source coding) theory: Source coding theorem, Kraft-McMillan inequality, Rate-distortion theorem

Error-correction (channel coding) theory: Channel coding theorem, Channel capacity, Typicality and the AEP

Compression algorithms: Huffman codes, Arithmetic coding, Lempel-Ziv

Error-correction algorithms: Hamming codes, Reed-Solomon codes, Turbo codes, Gallager (LDPC) codes

There is a great deal of cross-over between information theory and coding, cryptography, statistics, machine learning and other topics. A grounding in information theory and coding will enable you to spot and capitalize on those commonalities.
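On the source-coding side, the central quantity is Shannon entropy, the bits-per-symbol floor that compressors like Huffman coding approach. A standard-library sketch:

```python
import math
from collections import Counter

def entropy(message: str) -> float:
    """Shannon entropy, in bits per symbol, of the empirical distribution."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A uniform 4-symbol source needs 2 bits/symbol; a constant source needs 0.
assert abs(entropy("abcd") - 2.0) < 1e-9
assert entropy("aaaa") == 0.0
```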

September 16, 2015

Elliptic Curve Cryptography: a gentle introduction

Filed under: Cryptography,Privacy — Patrick Durusau @ 9:06 pm

Elliptic Curve Cryptography: a gentle introduction by Andrea Corbellini.

From the post:

Those of you who know what public-key cryptography is may have already heard of ECC, ECDH or ECDSA. The first is an acronym for Elliptic Curve Cryptography, the others are names for algorithms based on it.

Today, we can find elliptic curves cryptosystems in TLS, PGP and SSH, which are just three of the main technologies on which the modern web and IT world are based. Not to mention Bitcoin and other cryptocurrencies.

Before ECC became popular, almost all public-key algorithms were based on RSA, DSA, and DH, alternative cryptosystems based on modular arithmetic. RSA and friends are still very important today, and often are used alongside ECC. However, while the magic behind RSA and friends can be easily explained, is widely understood, and rough implementations can be written quite easily, the foundations of ECC are still a mystery to most.

With a series of blog posts I’m going to give you a gentle introduction to the world of elliptic curve cryptography. My aim is not to provide a complete and detailed guide to ECC (the web is full of information on the subject), but to provide a simple overview of what ECC is and why it is considered secure, without losing time on long mathematical proofs or boring implementation details. I will also give helpful examples together with visual interactive tools and scripts to play with.

Specifically, here are the topics I’ll touch:

  1. Elliptic curves over real numbers and the group law (covered in this blog post)
  2. Elliptic curves over finite fields and the discrete logarithm problem
  3. Key pair generation and two ECC algorithms: ECDH and ECDSA
  4. Algorithms for breaking ECC security, and a comparison with RSA

In order to understand what’s written here, you’ll need to know some basic stuff of set theory, geometry and modular arithmetic, and have familiarity with symmetric and asymmetric cryptography. Lastly, you need to have a clear idea of what an “easy” problem is, what a “hard” problem is, and their roles in cryptography.

Ready? Let’s start!
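As a taste of topic 1, the chord-and-tangent group law over a toy prime field fits in a few lines. The curve parameters below are made up for illustration and far too small for any real use, and the sketch omits the identity point and inverse cases:

```python
# Toy curve y^2 = x^3 + a*x + b over GF(p). Never use parameters this small.
p, a, b = 97, 2, 3

def on_curve(P):
    x, y = P
    return (y * y - (x**3 + a * x + b)) % p == 0

def point_add(P, Q):
    # Chord-and-tangent group law mod p; omits the point at infinity
    # and the P == -Q case for brevity.
    (x1, y1), (x2, y2) = P, Q
    if P == Q:
        s = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p  # tangent slope
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, p) % p         # chord slope
    x3 = (s * s - x1 - x2) % p
    y3 = (s * (x1 - x3) - y1) % p
    return (x3, y3)

P = (3, 6)                                   # a point on this curve
assert on_curve(P)
assert on_curve(point_add(P, P))             # doubling stays on the curve
assert on_curve(point_add(P, point_add(P, P)))  # so does addition
```

The hardness of recovering k from k·P over a large finite field (the discrete logarithm problem of topic 2) is what the later posts build on.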

Whether you can make it through this series of posts or not, it remains a great URL to have show up in a public terminal’s web browsing history.

Even if you aren’t planning on “going dark,” you can do your part to create noise that will cover those who do.

Take the opportunity to visit this site and other cryptography resources. Like the frozen North, they may not be around for your grandchildren to see.

July 1, 2015

Software developers are failing…

Filed under: Cryptography,Cybersecurity,Programming,Security — Patrick Durusau @ 9:14 am

Software developers are failing to implement crypto correctly, data reveals by Lucian Constantin.

From the post:

Despite a big push over the past few years to use encryption to combat security breaches, lack of expertise among developers and overly complex libraries have led to widespread implementation failures in business applications.

The scale of the problem is significant. Cryptographic issues are the second most common type of flaws affecting applications across all industries, according to a report this week by application security firm Veracode.

It is a deeply amusing post, with cryptography folks urging better education of programmers and programmers whining that cryptography should be easier than it is.

Too many programmers think that they can just link to a crypto library and they’re done, but cryptography is hard to implement robustly if you don’t understand the finer aspects of it, like checking certificates properly, protecting the encryption keys, using appropriate key sizes or using strong pseudo-random number generators.

“All this ultimately comes down to better education of programmers to understand all the pitfalls when implementing strong crypto,” Eiram said.

But it’s not only the developers’ fault. Matthew Green, a professor of cryptography engineering at Johns Hopkins University in Baltimore, thinks that many crypto libraries are “downright bad” from a usability perspective because they’ve been designed by and for cryptographers.

“Forcing developers to use them is like expecting someone to fly an airplane when all they have is a driver’s license,” he said via email.

Green believes that making cryptographic software easier to use — ideally invisible so that people don’t even have to think about it — would be a much more efficient approach than training developers to be cryptographers.

While I like the flying an airplane on a driver’s license line, any cryptography that doesn’t require people to think about it is likely deeply flawed.

The lesson to draw from Lucian’s post is that bare claims of encryption are valueless. Testing encryption is not a task for the same developers who wrote it. Tested encryption is of value up to the extent of its testing, but no further.

Someday cryptography libraries will improve and developers will become better educated but until then, don’t accept software using encryption without testing. (That means never.)
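To make one of the quoted pitfalls concrete, here is a minimal sketch (my own illustration, using only Python’s standard library) of the difference between a general-purpose PRNG and a cryptographically secure one for token generation:

```python
import random
import secrets

def weak_token(n_bytes=16):
    # random uses a Mersenne Twister: fine for simulations, but its
    # internal state can be reconstructed from observed outputs.
    # Never use it for keys, session tokens, or nonces.
    return bytes(random.getrandbits(8) for _ in range(n_bytes)).hex()

def strong_token(n_bytes=16):
    # secrets draws from the operating system's CSPRNG (os.urandom),
    # the appropriate source for security-sensitive values.
    return secrets.token_hex(n_bytes)

print(strong_token())
```

The two functions produce indistinguishable-looking hex strings, which is exactly why untested “we use encryption” claims are worthless: the weakness is invisible in the output.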

May 31, 2015

20 tools and resources every journalist should experiment with

Filed under: Cryptography,Journalism,News,Reporting — Patrick Durusau @ 9:20 am

20 tools and resources every journalist should experiment with by Alastair Reid.

From the post:

Tools have always come from the need to carry out a specific task more effectively. It’s one of the main differences between human beings and the rest of the animal kingdom. We may still be slaves to the same old evolutionary urges but we sure know how to eat noodles in style.

In journalism, an abstract tool for uncovering the most interesting and insightful information about society, we can generally boil the workflow down to four stages: finding, reporting, producing and distributing stories.

So with that in mind, here are a range of tools which will – hopefully – help you carry out your journalism tasks more effectively.

The resources range from advanced Google and Twitter searching to odder items and even practical advice:


Funny story: Glenn Greenwald received an anonymous email in early 2013 from a source wishing to discuss a potential tip, but only if communications were encrypted. Greenwald didn’t have encryption. The source emailed a step-by-step video with instructions to install encryption software. Greenwald ignored it.

The same source, a now slightly frustrated Edward Snowden, contacted film-maker Laura Poitras about the stack of NSA files burning a hole in his hard drive. Poitras persuaded Greenwald that he might want to listen, and the resulting revelations of government surveillance are arguably the story of the decade so far.

The lesson? Learn how to encrypt your email. Mailvelope is a good option with a worthwhile tutorial for PGP encryption, the same as the NSA use, and Knight Fellow Christopher Guess has a great step-by-step guide for setting it up.

In addition to the supporting encryption advice, the other lesson is that major stories can break from new sources.

Oh, the post also mentions:

Unfortunately for reporters, one of the internet’s favourite pastimes is making up rumours and faking photos.

Sounds like a normal function of government to me.

Many journalists have reported something along the lines of:

Iraq’s Defense Ministry said Wednesday an airstrike by the U.S.-led coalition killed a senior Islamic State commander and others near the extremist-held city of Mosul, though the country’s Interior Ministry later said it wasn’t clear if he even was wounded.

The Defense Ministry said the strike killed Abu Alaa al-Afari and others who were in a meeting inside a mosque in the northern city of Tal Afar, 72 kilometers (45 miles) west of Mosul. Senior ISIS Commander Alaa Al-Afari Killed In U.S. Airstrike: Iraqi Officials

rather than:

A communique from the Iraq Defense Ministry claimed credit for killing a senior Islamic State commander and others near the city of Mosul last Wednesday.

The attack focused on a mosque inside the northern city of Tal Afar, 72 kilometers (45 miles) west of Mosul. How many people were inside the mosque at the time of this cowardly attack, along with Abu Alaa al-Afari, is unknown.

Same “facts,” but a very different view of them. I mention this because an independent press or even one that wants to pretend at independence, should not be cheerfully reporting government propaganda.

May 19, 2015

The Applications of Probability to Cryptography

Filed under: Cryptography,Mathematics — Patrick Durusau @ 1:25 pm

The Applications of Probability to Cryptography by Alan M. Turing.

From the copyright page:

The underlying manuscript is held by the National Archives in the UK and can be accessed at www.nationalarchives.gov.uk using reference number HW 25/37. Readers are encouraged to obtain a copy.

The original work was under Crown copyright, which has now expired, and the work is now in the public domain.

You can go directly to the record page: http://discovery.nationalarchives.gov.uk/details/r/C11510465.

To get a useful image, you need to add the item to your basket for £3.30.

The manuscript is a mixture of typed text with inserted mathematical expressions added by hand (along with other notes and corrections). This is a typeset version that attempts to capture the original manuscript.
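For a taste of the paper’s method: Turing scores evidence by the logarithm of the Bayes factor, measured in bans and decibans. A minimal sketch (my own illustration, not code from the manuscript):

```python
import math

def decibans(likelihood_ratio):
    # Weight of evidence in decibans: ten times the base-10 log of
    # the Bayes factor P(obs | H) / P(obs | not-H).
    return 10 * math.log10(likelihood_ratio)

# Independent pieces of evidence simply add their decibans, which is
# what made the unit practical for hand computation at Bletchley Park.
total = decibans(4) + decibans(2)   # roughly 6.02 + 3.01 decibans
```

Additivity is the point of the logarithmic unit: accumulating evidence becomes addition rather than multiplication.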

Another recently declassified Turing paper (typeset): The Statistics of Repetition.

Important reads. Turing would appreciate the need to exclude government from our day-to-day lives.
