September 22nd, 2016

From the post:

Google’s Allo messaging app and its Assistant bot have finally arrived, but Allo has been slammed for reneging on a promise that it would, by default, make it more difficult to spy on.

Because of the missing privacy feature, NSA-contractor-turned-whistleblower Edward Snowden’s first take of Allo after yesterday’s US launch is that it’s just a honeypot for surveillance.

The main complaints are that security is off by default and that chat logs are stored until deleted by users.

Now is your opportunity to make a conscious choice about Allo. Goodbye!

Don’t be misled into thinking end-to-end encryption ends the danger posed by preserved chat logs.

Intelligence agencies have long argued that knowing who calls whom is more important than the content of phone calls. The same is true for chats.

Google has chosen a side other than the consumers’; that’s enough reason to avoid it whenever possible.

## What Makes A Liar Lie? (Clapper Lying About The Russians)

September 21st, 2016

US intel head suggests Russia behind DNC hacks, says Moscow tried to affect elections in past

From the post:

The US director of national intelligence has suggested Russia is behind the recent hack that saw Democratic National Committee (DNC) records dumped online. The leak undermined the Democrats’ reputation ahead of November’s presidential election.

“It’s probably not real, real clear whether there’s influence in terms of an outcome [of the upcoming elections] – or what I worry about more, frankly – is just the sowing the seeds of doubt, where doubt is cast on the whole [election] process,” James Clapper said on Tuesday evening at an event hosted by the Washington Post, as cited by the Wall Street Journal.

Furthermore, the intelligence chief said Russia and its predecessor the USSR had been adhering to similar practices targeting the US since the 1960s.

“There’s a tradition in Russia of interfering with elections, their own and others.

“[…] It shouldn’t come as a big shock to people. I think it’s more dramatic maybe because now they have the cyber tools,” Clapper is cited as saying.

The comments come in contrast to Clapper’s earlier statements regarding Russia’s alleged connection to the hacking operation, which is believed to have been conducted over more than a year. In July, shortly after the documents had been leaked, he urged an end to the “reactionary mode” of blaming the leak on Russia.
… (emphasis in original)

Do you wonder why Clapper shifted from avoiding a “reactionary mode” of blaming Russia to not only blaming Russia, but claiming a history of Russian interference with United States elections?

I don’t have an email or recorded phone conversation smoking gun, but here’s one possible explanation:

From FiveThirtyEight as of today:

My prediction: The closer the odds from FiveThirtyEight become, the more frantic and far-fetched the lies from James Clapper will become.

Another DNC leak or two (real ones, not the discarded hard drive kind), and Clapper will be warning of Russian influence in county government and school board elections.

PS: If you don’t think Clapper is intentionally lying, when will you break the story that his accounts have lost all connection to a reality shared by others?

September 21st, 2016

Good security practices are a must, whether you live in the Cisco universe or the more mundane realm of drug pushing.

Case in point: Photos On Dark Web Reveal Geo-locations Of 229 Drug Dealers — Here’s How by Swati Khandelwal.

From the post:

It’s a Fact! No matter how smart the criminals are, they always leave some trace behind.

Two Harvard students have unmasked around 229 drug and weapon dealers with the help of pictures taken by criminals and used in advertisements placed on dark web markets.

Do you know each image contains a range of additional hidden data stored within it that can be a treasure to the investigators fighting criminals?

Whatever services you are offering on the Dark Web, here’s an opportunity to reduce the amount of competition you are facing.

Perhaps even a reward from CrimeStoppers, although you need to price shop against your local organization for the better deal.

Failure to scrub Exchangeable Image File Format (EXIF) data lies at the heart of this technique.

See Swati’s post for more details on this “hack.”
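The failure mode is easy to demonstrate. EXIF stores GPS latitude and longitude as degrees/minutes/seconds rationals plus a hemisphere reference (“N”/“S”, “E”/“W”); converting them to signed decimal degrees, sketched below, is all an investigator needs to drop a pin on a map. (The coordinate values here are hypothetical, not taken from the dealer photos.)

```python
# Sketch: how unscrubbed EXIF GPS tags betray a photo's location.
# EXIF encodes each coordinate as three rationals (degrees, minutes,
# seconds) plus a hemisphere reference tag.

from fractions import Fraction

def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style degrees/minutes/seconds to signed decimal degrees."""
    value = Fraction(degrees) + Fraction(minutes) / 60 + Fraction(seconds) / 3600
    if ref in ("S", "W"):  # southern/western hemispheres are negative
        value = -value
    return float(value)

# Hypothetical GPSLatitude / GPSLongitude rationals from an unscrubbed photo:
lat = dms_to_decimal(42, 22, Fraction(2516, 100), "N")
lon = dms_to_decimal(71, 6, Fraction(3312, 100), "W")
print(round(lat, 5), round(lon, 5))
```

Scrubbing means deleting those tags (and the rest of the EXIF block) before an image is published; most image libraries can rewrite a file without its metadata.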

Do your civic duty to reduce crime (your competitors) and be rewarded in the process.

Who says cybersecurity can’t be a profit center?

## Tails [Whatever The Presidential Race Outcome]

September 20th, 2016

Tails – The Amnesic Incognito Live System

Tails is a live system that aims to preserve your privacy and anonymity. It helps you to use the Internet anonymously and circumvent censorship almost anywhere you go and on any computer but leaving no trace unless you ask it to explicitly.

Whatever your prediction for the US 2016 presidential election, Hairy Thunderer or Cosmic Muffin, you are going to need Tails.

It really is that simple.

## Betraying Snowden:… [Cynical, but not odd]

September 20th, 2016

From the post:

There is a special place in journalism hell reserved for The Washington Post editorial board now that it has called on President Barack Obama to not pardon National Security Agency whistleblower Edward Snowden.

As Glenn Greenwald wrote, it’s an odd move for a news publication, “which owes its sources duties of protection, and which — by virtue of accepting the source’s materials and then publishing them — implicitly declares the source’s information to be in the public interest.” Notably, the Post decided to “inexcusably omit . . . that it was not Edward Snowden, but the top editors of the Washington Post who decided to make these programs public,” as Greenwald added.

The Post’s peculiar justification is as follows: While the board grudgingly conceded that reporters, thanks to Snowden, revealed that the NSA’s collection of domestic telephone metadata — which “was a stretch, if not an outright violation, of federal surveillance law” — it condemns him for revealing “a separate overseas NSA Internet-monitoring program, PRISM, that was both clearly legal and not clearly threatening to privacy.”

Washington Post opposition to a pardon for Edward Snowden isn’t odd at all.

Which story generates more PR for the Washington Post:

1. The Washington Post, having won a Pulitzer prize due to Edward Snowden, joins a crowd calling for his pardon?
2. The Washington Post, having won a Pulitzer prize due to Edward Snowden, opposes his being pardoned?

It’s not hard to guess which one generates more ad-views and therefore the potential for click-throughs.

I have no problems with the disclosure of PRISM, save for Snowden having to break his word as a contractor to keep his client’s secrets, well, secret.

No one could have been unaware, before agreeing to work for the NSA, that it engages in illegal and immoral activity on a daily basis.

Although Snowden did no worse than his former NSA employers, his case illustrates why I have no trust in government agencies.

If they are willing to lie to you for what they consider “good” reasons, then they are most certainly willing to lie to me.

Once it is established that an agency, take the NSA for example, has lied on multiple occasions, on what basis would you trust them to be telling the truth today?

Their assurance, “we’re not lying this time?” That seems rather tenuous.

The same rule should apply to contractors who lie to or betray their clients.

## NSA: Being Found Beats Searching, Every Time

September 20th, 2016

From the post:

This week someone auctioning hacking tools obtained from the NSA-based hacking group “Equation Group” released a dump of around 250 megabytes of “free” files for proof alongside the auction.

The dump contains a set of exploits, implants and tools for hacking firewalls (“Firewall Operations”). This post aims to be a comprehensive list of all the tools contained or referenced in the dump.

Mustafa’s post is a great illustration of why “being found beats searching, every time.”

Think of the cycles you would have to spend to duplicate this list. Multiply that by the number of people interested in this list. Assuming their time is not valueless, do you start to see the value-add of Mustafa’s post?

Mustafa found each of these items in the data dump and then preserved his finding for the use of others.

It’s not a very big step beyond this preservation to the creation of a container for each of these items, enabling the preservation of other material found about or related to them.
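As a rough sketch of that container idea (the record shape and field names are my own invention, not anything from Mustafa’s post), each found item could be kept as a small record that accumulates related material over time:

```python
# Sketch: a "container" per found item, so the finding work is done once
# and later material can be attached to it. Field names are hypothetical;
# the item names (EXTRABACON, BANANAGLEE) do appear in the Equation Group dump.

from dataclasses import dataclass, field

@dataclass
class Finding:
    name: str                                       # item as named in the dump
    category: str                                   # e.g. "exploit", "implant", "tool"
    notes: list = field(default_factory=list)       # related material, added over time

    def relate(self, note: str) -> None:
        """Attach later-discovered material to the original finding."""
        self.notes.append(note)

# Build a tiny index so the next searcher starts from "already found":
index = {}
for name, category in [("EXTRABACON", "exploit"), ("BANANAGLEE", "implant")]:
    index[name] = Finding(name, category)

index["EXTRABACON"].relate("targets Cisco ASA SNMP")  # preserved for reuse
```

The point is not the particular data structure but that each item becomes an addressable place where future findings can be preserved, rather than rediscovered.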

Search is a starting place and not a destination.

Unless you enjoy repeating the same finding process over and over again.

## Stopping Terrorism: Thieves 2, Security Forces 0

September 19th, 2016

Murray Weiss, Nicholas Rizzi, Trevor Kapp and Aidan Gardiner document in Thieves Helped Crack the Chelsea Bombing Case, Sources Say how common street thieves thwarted terrorist attacks in New York City and New Jersey.

Albeit inadvertently, thieves prevented a second explosion in Chelsea and multiple explosions in New Jersey.

See Thieves Helped Crack the Chelsea Bombing Case, Sources Say for the full story.

A great illustration that the surveillance state can track people down after they have committed a crime, but is not good at stopping them before they commit one.

So why are we spending $billions on a surveillance state that is outperformed by street thieves? Reward any thief discovering a terrorist bomb and turning it in with: Good for life, non-violent crimes only. Given the track record of security forces in the United States, that is a far better investment.

## Hackers May Fake Documents, Congress Publishes False Ones

September 19th, 2016

I pointed out in Lions, Tigers, and Lies! Oh My! that Bruce Schneier‘s concern over the potential for hackers faking documents to be leaked pales beside the misinformation distributed by government.

Executive Summary of Review of the Unauthorized Disclosures of Former National Security Agency Contractor Edward Snowden (their title, not mine) is a case in point.

Barton Gellman, in The House Intelligence Committee’s Terrible, Horrible, Very Bad Snowden Report, leaves no doubt the House Permanent Select Committee on Intelligence (HPSCI) report is a sack of lies. Not mistakes, not exaggerations, not simply misleading, but actual, factual lies. For example:

Since I’m on record claiming the report is dishonest, let’s skip straight to the fourth section. That’s the one that describes Snowden as “a serial exaggerator and fabricator,” with “a pattern of intentional lying.” Here is the evidence adduced for that finding, in its entirety. “He claimed to have obtained a high school degree equivalent when in fact he never did.”

I do not know how the committee could get this one wrong in good faith. According to the official Maryland State Department of Education test report, which I have reviewed, Snowden sat for the high school equivalency test on May 4, 2004. He needed a score of 2250 to pass. He scored 3550. His Diploma No. 269403 was dated June 2, 2004, the same month he would have graduated had he returned to Arundel High School after losing his sophomore year to mononucleosis. In the interim, he took courses at Anne Arundel Community College.

See Gellman’s post for more examples.
All twenty-two members of the HPSCI signed the report. To save you time in the future, here’s a listing of the members of Congress who agreed to report these lies:

Republicans

Democrats

I sorted each group into alphabetical order. The original listings were in an order that no doubt makes sense to fellow rodents but not to the casual reader.

That’s twenty-two members of Congress who are willing to distribute known falsehoods.

Does anyone have an equivalent list of hackers?

## Congress.gov Corrects Clinton-Impeachment Search Results

September 19th, 2016

After posting Congress.gov Search Alert: “…previous total of 261 to the new total of 0.” [Solved] yesterday, pointing out that a change from http:// to https:// altered a search result for Clinton w/in 5 words impeachment, I got an email this morning:

I appreciate the update and correction for saved searches, but my point about remote data changing without notice to you remains valid.

I’m still waiting for word on bulk downloads from both Wikileaks and DC Leaks. Why leak information vital to public discussion and then limit access to search?

## Exotic Functional Data Structures: Hitchhiker Trees

September 18th, 2016

Description:

Functional data structures are awesome–they’re the foundation of many functional programming languages, allowing us to express complex logic immutably and efficiently. There is one unfortunate limitation: these data structures must fit on the heap, limiting their lifetime to that of the process. Several years ago, Datomic appeared as the first functional database that addresses these limitations. However, there hasn’t been much activity in the realm of scalable (gigabytes to terabytes) functional data structures.

In this talk, we’ll first review some of the fundamental principles of functional data structures, particularly trees. Next, we’ll review what a B tree is and why it’s better than other trees for storage.
Then, we’ll learn about a cool variant of a B tree called a fractal tree, how it can be made functional, and why it has phenomenal performance. Finally, we’ll unify these concepts to understand the Hitchhiker tree, an open-source functionally persistent fractal tree. We’ll also briefly look at an example API for using Hitchhiker trees that allows your application’s state to be stored off-heap, in the spirit of the 2014 paper “Fast Database Restarts at Facebook”.

David Greenberg (profile)

Hitchhiker Trees (GitHub)

You could have searched for all the information I have included, but isn’t it more convenient to have it “already found?”

## Introducing arxiv-sanity

September 18th, 2016

Only a small part of Arxiv appears at: http://www.arxiv-sanity.com/ but it is enough to show the feasibility of this approach.

What captures my interest is the potential to substitute/extend the program to use other similarity measures. Bearing in mind that searching is only the first step towards the acquisition and preservation of knowledge.

PS: I first saw this in a tweet by Data Science Renee.

## Congress.gov Search Alert: “…previous total of 261 to the new total of 0.” [Solved]

September 18th, 2016

Odd message from the Congress.gov search alert this AM:

Here’s the search I created back in June, 2016:

My probably inaccurate recall at the moment was I was searching for some quote from the impeachment of Bill Clinton and was too lazy to specify a term of congress, hence: all congresses – searching for Clinton within five words, impeachment.

Fairly trivial search that produced 261 “hits.” I set the search alert more to explore the search options than any expectation of different future results.
Imagine my surprise to find that all congresses – searching for Clinton within five words, impeachment performed today, results in 0 “hits.” Suspecting some internal changes to the search interface, I re-entered the search today and got 0 “hits.”

Other saved searches with radically different search results as of today?

This is not, repeat not, the result of some elaborate conspiracy to assist Secretary Clinton in her bid for the presidency. I do think something fundamental has gone wrong with searching at Congress.gov and it needs to be fixed.

This is an illustration of why Wikileaks, DC Leaks and other data sites should provide easy-to-access bulk downloads of their materials. Providing search interfaces to document collections is a public service, but document collections, or access to them, can change in ways not transparent to search users. Such as demonstrated by the CIA removing documents previously delivered to the Senate.

Petition Wikileaks, DC Leaks and other data sites for easy bulk downloads. That will ensure the “evidence” will not shift under your feet and the availability of more sophisticated means of analysis than brute-force search.

Update: The change from http:// to https:// by the congress.gov site trashed my saved query, which used http:// to re-perform the same search. Using https:// returns the same 261 search results. What is your experience with other saved searches at congress.gov?

## Scalable Vector Graphics (SVG) 2

September 17th, 2016

Scalable Vector Graphics (SVG) 2: W3C Candidate Recommendation 15 September 2016

Abstract:

This specification defines the features and syntax for Scalable Vector Graphics (SVG) Version 2. SVG is a language based on XML for describing two-dimensional vector and mixed vector/raster graphics. SVG content is stylable, scalable to different display resolutions, and can be viewed stand-alone, mixed with HTML content, or embedded using XML namespaces within other XML languages. SVG also supports dynamic changes; script can be used to create interactive documents, and animations can be performed using declarative animation features or by using script.

Comments:

Comments on this Candidate Recommendation are welcome. Comments can be sent to www-svg@w3.org, the public email list for issues related to vector graphics on the Web. This list is archived and senders must agree to have their message publicly archived from their first posting. To subscribe send an email to www-svg-request@w3.org with the word subscribe in the subject line.

W3C publishes a Candidate Recommendation to indicate that the document is believed to be stable and to encourage implementation by the developer community. This Candidate Recommendation is expected to advance to Proposed Recommendation no earlier than 15 July 2017, but we encourage early review, and requests for normative changes after 15 November 2016 may be deferred to SVG 3.

15 November 2016 will be here sooner than you realize. Read and comment early and often.

Enjoy!

## Introducing OpenType Variable Fonts

September 17th, 2016

From the post:

Version 1.8 of the OpenType font format specification introduces an extensive new technology, affecting almost every area of the format. An OpenType variable font is one in which the equivalent of multiple individual fonts can be compactly packaged within a single font file. This is done by defining variations within the font, which constitute a single- or multi-axis design space within which many font instances can be interpolated. A variable font is a single font file that behaves like multiple fonts.

There are numerous benefits to this technology. A variable font is a single binary with greatly-reduced comparable file size and, hence, smaller disc footprint and webfont bandwidth. This means more efficient packaging of embedded fonts, and faster delivery and loading of webfonts.
The potential for dynamic selection of custom instances within the variations design space — or design-variations space, to use its technical name — opens exciting prospects for fine tuning the typographic palette, and for new kinds of responsive typography that can adapt to best present dynamic content to a reader’s device, screen orientation, or even reading distance.

The technology behind variable fonts is officially called OpenType Font Variations. It has been jointly developed by Microsoft, Google, Apple, and Adobe, in an unprecedented collaborative effort also involving technical experts from font foundries and font tool developers. In addition to specifying the font format additions and revisions, the working group has also committed to the goal of interoperable implementation, defining expected behaviours and test suites for software displaying variable fonts. This should be welcome news to font developers and users, who have often struggled with incompatible implementations of earlier aspects of OpenType that were left to the interpretation of individual software companies.

OpenType Font Variations builds on the model established in Apple’s TrueType GX variations in the mid-1990s, but has fully integrated that model into all aspects of the OpenType format, including OpenType Layout, and is available to both TrueType and Compact Font Format (CFF) flavours of OpenType. This has meant not only the addition of numerous tables to the format, but also revision of many existing tables; these changes are summarised in an appendix to this article, which is intended as an introduction and technological summary, primarily for font makers and font tool developers. The full technical specification for OpenType Font Variations is incorporated into the OpenType specification version 1.8.

John Hudson developed the remarkable SBL BibLit, SBL Greek and SBL Hebrew fonts for biblical studies. An illustration from John’s post:

Figure 1. Normalised design space of a 3-axis variable font. [Typeface: Kepler, an Adobe Original designed by Robert Slimbach.]

Looking forward to the SBL transitioning its biblical studies font set to this new font technology.

## Lions, Tigers, and Lies! Oh My!

September 17th, 2016

Bruce writes:

No one is talking about this, but everyone needs to be alert to the possibility. Sooner or later, the hackers who steal an organization’s data are going to make changes in them before they release them. If these forgeries aren’t questioned, the situations of those being hacked could be made worse, or erroneous conclusions could be drawn from the documents. When someone says that a document they have been accused of writing is forged, their arguments at least should be heard.

Really? Governments, the United States Government in particular, leak false information and documents as a matter of normal business practice. Not to mention corporations and special interest groups that pay for false research (think Harvard, sugar studies) to be published.

In case you missed it, read Inside the fight to reveal the CIA’s torture secrets. In depth analysis of how the CIA not only lied, but destroyed evidence, spied on the U.S. Senate and otherwise misbehaved during an investigation into its torture practices.

That’s just one example. One could fill a multi-volume series with the lies, false documents and fabrications of the current and immediately previous U.S. President.

The argument torturers were “doing their duty to protect the country” and so merit a pass on accountability I recommend to any future political assassins. See how that plays out in a court of law. Hint: Crimes are crimes whatever your delusional understanding of “the greater good.”

The easier rule is: Consider all documents/statements as false unless and until:

1. You are satisfied of the truth of the document/statement, or
2. It is to your advantage to treat the document/statement as true.
That covers situations like “fact free” accusations of cyber hacking against the Russians, North Koreans and/or Chinese by the U.S. government. No “evidence” has been offered for any of those allegations, only vaguely worded rumors circulated among “experts” who are also government contractors.

You can imagine the credibility I assign to such sources. Probably happenstance but such contractors could be telling the truth. Unfortunately, in the absence of any real evidence, only the self-interested in such “truths” or the extremely credulous, crack-pipe users for example, would credit such statements.

## How Mapmakers Make Mountains Rise Off the Page

September 17th, 2016

From the post:

The world’s most beautiful places are rarely flat. From the soaring peaks of the Himalaya to the vast chasm of the Grand Canyon, many of the most stunning sites on Earth extend in all three dimensions. This poses a problem for mapmakers, who typically only have two dimensions to work with.

Fortunately, cartographers have some clever techniques for creating the illusion of depth, many of them developed by trial and error in the days before computers. The best examples of this work use a combination of art and science to evoke a sense of standing on a mountain peak or looking out an airplane window.

One of the oldest surviving maps, scratched onto an earthenware plate in Mesopotamia more than 4,000 years ago, depicts mountains as a series of little domes. It’s an effective symbol, still used today in schoolchildren’s drawings and a smartphone emoji, but it’s hardly an accurate representation of terrain. Over the subsequent centuries, mapmakers made mostly subtle improvements, varying the size and shape of their mountains, for example, to indicate that some were bigger than others. But cartography became much more sophisticated during the Renaissance.
Topographic surveys were done for the first time with compasses, measuring chains, and other instruments, resulting in accurate measurements of height. And mapmakers developed new methods for depicting terrain. One method, called hachuring, used lines to indicate the direction and steepness of a slope. You can see a later example of this in the 1807 map below of the Mexican volcano Pico de Orizaba. Cartographers today refer (somewhat dismissively) to mountains depicted this way as “woolly caterpillars.”

Stunning illusions of depth on maps, creating depth illusions in 2 dimensions (think computer monitors), and the history of map making techniques are all reasons to read this post. What seals it for me is that the quest for the “best” depth illusion continues. It’s not a “solved” problem. (No spoiler, see the post.)

Physical topography to one side, how are you going to bring “depth” to your topic map? Some resources in a topic map may have great depth and others, unfortunately, may be like Wikipedia articles marked as: This article has multiple issues.

How do you define and then enable navigation of your topic maps?

## How-To Discover Pay-to-Play Appointment Pricing

September 16th, 2016

You have seen one or more variations on:

This Is How Much It ‘Costs’ To Get An Ambassadorship: Guccifer 2.0 Leaks DNC ‘Pay-To-Play’ Donor List

DNC Leak Exposes Pay to Play Politics, How the Clinton’s REALLY Feel About Obama

You may be wondering why CNN, the New York Times and the Washington Post aren’t all over this story.

While selling public offices surprises some authors, whose names I omitted out of courtesy to their families, selling offices is a regularized activity in the United States. So regularized that immediately following each presidential election, the Government Printing Office publishes the United States Government Policy and Supporting Positions 2012 (Plum Book) that lists the 9,000 odd positions that are subject to presidential appointment.
From the description of the 2012 edition:

Every four years, just after the Presidential election, “United States Government Policy and Supporting Positions” is published. It is commonly known as the “Plum Book” and is alternately published between the House and Senate.

The Plum Book is a listing of over 9,000 civil service leadership and support positions (filled and vacant) in the Legislative and Executive branches of the Federal Government that may be subject to noncompetitive appointments, or in other words by direct appointment. These “plum” positions include agency heads and their immediate subordinates, policy executives and advisors, and aides who report to these officials. Many positions have duties which support Administration policies and programs. The people holding these positions usually have a close and confidential relationship with the agency head or other key officials.

Even though the 2012 “plum” book is currently on sale for $19.00 (usual price is $38.00), given that a new one will appear later this year, consider using the free online version at: Plum Book 2012.

The online interface is nothing to brag on. You have to select filters and then find to obtain further information on positions. Very poor UI. However, if under title you select “Chief of Mission, Monaco” and then select “find,” the resulting screen looks something like this:

To your far right there is a small arrow that, if selected, takes you to the details:

If you were teaching a high school civics class, the question would be: How much did Charles Rivkin have to donate to obtain the position of Chief of Mission, Monaco?

FYI, the CIA World FactBook gives this brief description for Monaco:

Monaco, bordering France on the Mediterranean coast, is a popular resort, attracting tourists to its casino and pleasant climate. The principality also is a banking center and has successfully sought to diversify into services and small, high-value-added, nonpolluting industries.
Unlike the unhappy writers that started this post, you would point the class to: Transaction Query By Individual Contributor at the Federal Election Commission site. Enter the name Rivkin, Charles and select “Get Listing.” Rivkin’s contributions are broken into categories and helpfully summed to assist you in finding the total.

Contributions to All Other Political Committees Except Joint Fundraising Committees – $72,399.00

Joint Fundraising Contributions – $22,300.00

Recipient of Joint Fundraiser Contributions – $36,052.00

Caution: There is an anomalous Rivkin in that last category, contributing $40 to Donald Trump. For present purposes, I would subtract that, leaving a grand total of $130,711 to be the Chief of Mission, Monaco.

Realize that this was not a lump sum payment but a steady stream of contributions starting in the year 2000.
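For anyone checking the arithmetic, summing the three FEC category totals quoted above and subtracting the anomalous $40 contribution reproduces the $130,711 figure:

```python
# Check of the Rivkin totals quoted from the FEC Transaction Query results.

categories = {
    "Other political committees": 72_399.00,
    "Joint fundraising": 22_300.00,
    "Joint fundraiser recipients": 36_052.00,
}
anomalous = 40.00  # the stray $40 Trump contribution from another Rivkin

total = sum(categories.values()) - anomalous
print(f"${total:,.2f}")  # matches the $130,711 figure in the post
```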

Using the Transaction Query By Individual Contributor resource, you can correct stories that claim:

Jane Hartley paid DNC $605,000 and then was nominated by Obama to serve concurrently as the U.S. Ambassador to the French Republic and the Principality of Monaco.

If you run the FEC search you will find:

Contributions to Super PACs, Hybrid PACs and Historical Soft Money Party Accounts – $5,000.00

Contributions to All Other Political Committees Except Joint Fundraising Committees – $516,609.71

Joint Fundraising Contributions – $116,000.00

Grand total: $637,609.71. So, $637,609.71, not $605,000.00, but also as a series of contributions starting in 1997, not one lump sum.

You don’t have to search discarded hard drives to get pay-to-play appointment pricing. It’s all a matter of public record.

PS: I’m not sure how accurate or complete Nominations & Appointments (White House) may be, but it’s an easier starting place for current appointees than the online Plum book.

PPS: Estimated pricing for “Plum” book positions could be made more transparent. Not a freebie. Let me know if you are interested.

## Android Hacking – $200K First Prize – Other Offers?

September 16th, 2016

Announcing the Project Zero Prize by Natalie Silvanovich.

Before reading the “official” post, consider this Dilbert cartoon.

Same logic applies here:

How to compare alternatives? ($200K sets a minimum bid.) Potential for repeat business? For a pwn of any Android phone, $200K sounds a bit “lite.”

Watch the Android issue tracker. A third-party bidder won’t insist on you using only your reported bugs in an exploit chain.

Before anyone gets indignant, the NSA, CIA, the “Russians,” Chinese, Mossad, etc., will all be watching as well. Think of it as having “governmental” ethics.

From the post:

Despite the existence of vulnerability rewards programs at Google and other companies, many unique, high-quality security bugs have been discovered as a result of hacking contests. Hoping to continue the stream of great bugs, we’ve decided to start our own contest: The Project Zero Prize.

The goal of this contest is to find a vulnerability or bug chain that achieves remote code execution on multiple Android devices knowing only the devices’ phone number and email address. Successful submissions will be eligible for the following prizes.

First Prize

$200,000 USD, awarded to the first winning entry.

Second Prize

$100,000 USD, awarded to the second winning entry.

Third Prize

At least $50,000 USD awarded by Android Security Rewards, awarded to additional winning entries.

In addition, participants who submit a winning entry will be invited to write a short technical report on their entry, which will be posted on the Project Zero Blog.

Contest Structure

This contest will be structured a bit differently than other contests. Instead of saving up bugs until there’s an entire bug chain, and then submitting it to the Project Zero Prize, participants are asked to report the bugs in the Android issue tracker. They can then be used as a part of submission by the participant any time during the six month contest period. Only the first person to file a bug can use it as a part of their submission, so file early and file often! Of course, any bugs that don’t end up being used in a submission will be considered for Android Security Rewards and any other rewards program at Google they might be eligible for after the contest has ended.

In addition, unlike other contests, the public sharing of vulnerabilities and exploits submitted is paramount. Participants will submit a full description of how their exploit works with their submission, which will eventually be published on the Project Zero blog. Every vulnerability and exploit technique used in each winning submission will be made public.

Full contest rules

Contest period:

The Contest begins at 12:00:00 A.M. Pacific Time (PT) Zone in the United States on September 13, 2016 and ends at 11:59:59 P.M. PT on March 14, 2017 (“Contest Period”).

Good hunting!

PS: If possible, post the paid price for your exploit to help set the market price for future such exploits.

## If It’s Good Enough For Colin Powell…

September 16th, 2016

Graham posted webmail security advice for Colin Powell after 26 months’ worth of his private emails were leaked by DC Leaks.

Nothing surprising for my readers but pass it on to the c-suite types.

You can search and view Powell’s emails at DC Leaks / Colin Luther Powell.

Graham omits any link to DC Leaks and says:

Of course, the emails aren’t just embarrassing and damaging for the privacy of Colin Powell – they are also potentially humiliating for the people he was corresponding with, who have had their own private conversations exposed to the world.

Oh, the horror! Invasions of privacy!

You mean like the millions of ordinary people who aren’t secure in their phone calls, emails, web browsing, banking, credit histories, etc., all the time?

The extremely privileged getting nicked every now and again doesn’t trouble me.

“Oversight” hasn’t protected our freedoms, perhaps constant and detailed exposure of the privileged will. Worth a shot!

## Guccifer 2.0 – 13Sept2016 Leak – A Reader’s Guide (Part 2) [Discarded Hard Drive?]

September 15th, 2016

Guccifer 2.0‘s latest release of DNC documents is generally described as:

In total, the latest dump contains more than 600 megabytes of documents. It is the first Guccifer 2.0 release to not come from the hacker’s WordPress account. Instead, it was given out via a link to the small group of security experts attending the London conference. Guccifer 2.0 drops more DNC docs by Cory Bennett.

The “600 megabytes of documents” is an attention grabber, but how much of that 600 megabytes is useful and/or interesting?

The answer turns out to be, not a lot.

Here’s an overview of the directories and files:

/CIR

Financial investment data.

/CNBC

Financial investment data.

/DNC

Redistricting documents.

/DNCBSUser

One file with fields of VANDatabaseCode StateID VanID cons_id?

/documentation

A large amount of documentation for “IQ8,” apparently address cleaning software. Possibly useful if you want to know address cleaning rules from eight years ago.

/DonorAnalysis

Sounds promising but is summary data based on media markets.

/early

Early voting analysis.

/eday

Typical election voting analysis, from 2002 to 2008.

/FEC

Duplicates of FEC filings. Checking the .csv file, the data is from 2008. BTW, you can find data from this date (2008) and later of the same type at: http://fec.gov.

/finance

More duplicates of FEC filings. 11-26-08 NFC Members Raised.xlsx (no credit cards) – dated, but 453 names with contacts, amounts raised, etc.

/HolidayCards

Holiday card addresses, these are typical:

holiday_list_noproblems.txt
holidaycards.mdb
morethanonename.xls

/jpegs

Two jpegs were included in the dump.

/marketing

Lists of donors.

DNC union_05-09.txt
DNCunion0610.txt
GDSA11A.CSV
November VF EOC – MEYER.txt
dem0702a[1].zip
dem977.txt
dem978.txt
dem979.txt
dem980.txt
dem981.txt
dem982.txt
dem9A3_NGP.txt
dem9A6_NGP.txt
dsg.txt
gsi.txt
harris.txt
marketing_phones.txt
ofa_actives_non-donor.csv
tm_files.txt

/May-FEC

Grepping suggests this is May 2009 data for the FEC.

/newmedia

More donor lists.

20090715_new_synetech_emails.csv
emails_w_contactinfo.txt
ofa_email_export.zip

/pdfs

IT hosting proposals.

/Reports for Kaine

Various technology memos

/security

IT security reports

/stuffformike/WH/

Contacts not necessarily in FEC records

Contact List-Complete List.xlsx – Contact list with emails and phone numbers (no credit cards)
WH Staff 2010.xlsx – Names but no contact details

The data is eight (8) years old. Do you have the same phone number you did eight (8) years ago?

Guccifer 2.0 makes no claim on their blog for ownership of this leak.

A “hack” that results in eight year old data, most of which is more accessible at http://fec.gov?

No, this looks more like a discarded hard drive that was harvested and falsely labeled as a “hack” of the DNC.

Unless Guccifer 2.0 says otherwise on their blog, you have better things to do with your time.

PS: You don’t need old hard drives to discover pay-to-play purchases of public appointments. Check back tomorrow for: How-To Discover Pay-to-Play Appointment Pricing.

## Guccifer 2.0 – 13Sept2016 Leak – A Reader’s Guide (Part 1)

September 14th, 2016

Guccifer 2.0 dropped a new bundle of DNC documents on September 13, 2016! Like most dumps, there was no accompanying guide to make use of that dump easier. Not a criticism, just an observation.

As a starting point to make your use of that dump a little easier, I am posting an ls -lR listing of all the files in that dump, post extraction with 7z and unrar. Guccifer2.0-13Sept2016-filelist.txt.

I’m working on a list of the files most likely to be of interest. Look for that tomorrow.

I can advise that no credit card numbers were included in this dump.

Using:

grep --color -H -rn --include="*.txt" '[345]\{1\}[0-9]\{3\}\|6011\{1\}[ -]\?[0-9]\{4\}[ -]\?[0-9]\{2\}[-]\?[0-9]\{2\}[ -]\?[0-9]\{1,4\}'

I checked all the .txt files for credit card numbers. (I manually checked the xls/xlsx files.)

There were “hits” but those were in Excel exports of vote calculations. Funny how credit card numbers don’t ever begin with “0.” as a prefix.

Since valid credit card numbers vary in length, I don’t know of an easy way to avoid that issue. So inspection of the files it was.
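One way to cut down on those false positives is to pair a looser pattern with the Luhn checksum that all major card numbers satisfy. A minimal Python sketch (the regex below is my own for illustration, not the grep expression above):

```python
import re

def luhn_valid(digits: str) -> bool:
    """Standard Luhn checksum: double every second digit from the right."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

# Candidate card numbers: 13-19 digits, optionally split by spaces or dashes.
CANDIDATE = re.compile(r'\b(?:\d[ -]?){12,18}\d\b')

def find_card_numbers(text: str):
    hits = []
    for m in CANDIDATE.finditer(text):
        digits = re.sub(r'[ -]', '', m.group())
        # Luhn filters out most numeric coincidences (vote tallies, IDs).
        if luhn_valid(digits):
            hits.append(digits)
    return hits
```

Since Luhn rejects roughly 90% of random digit strings, this handles variable-length numbers without hand-inspecting every hit.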

## Investigatory Powers Bill As Amended In Committee

September 13th, 2016

For those of you watching the UK’s plunge into darkness, the Investigatory Powers Bill, as amended in committee, has been posted online.

Apologies for the lite amount of posting today but a very large data dump was released earlier today that distracted me from posting.

## FPCasts

September 13th, 2016

FPCasts – Your source for Functional Programming Related Podcasts

Ten (10) sources of podcasts, with a link to the latest podcast from each source.

Not a problem, but it took me by surprise on my first visit.

As useful as this will be, indexed podcasts where you could jump to a subject of interest would be even better.

Enjoy!

## R Weekly

September 12th, 2016

R Weekly

A new weekly publication of R resources that began on 21 May 2016 with Issue 0.

Mostly titles of posts and news articles, which is useful, but not as useful as short summaries, including the author’s name, would be.

## Persuasive Cartography

September 12th, 2016

From the post:

A recurrent topic here on Vintage InfoDesign is “persuasive cartography” – the use of maps to influence and in many cases, deceive. We showcased examples of these maps here and here, with a special mention to the PJ Mode Collection at Cornell University Library. The collection was donated to Cornell back in 2014, and until now more than 300 examples are available online in high resolution.

A must for all of those interested in the subject, and we picked a few examples to open this post, courtesy of Allison Meier, who published a recent article about the PJ Mode Collection over at Hyperallergic.

Re-reading The Power of Maps (1992) by Denis Wood, in preparation to read Rethinking The Power of Maps (2010), also by Denis Wood, has made me acutely aware of aspersions such as:

“persuasive cartography” – the use of maps to influence and in many cases, deceive.

I say “aspersion” because Wood makes the case that all maps, with no exceptions, are the results of omissions, characterizations, enhancements, emphasis on some features and not others, for stated and/or unstated purposes.

Indeed, all of The Power of Maps (1992) is devoted to teasing out, with copious examples, how a user of a map may fail to recognize that the “truth” of any map is a social construct, in a context shaped by factors known and unknown.

I characterize maps I disagree with as being deceptive, disingenuous, inaccurate, etc., but that doesn’t take away from Wood’s central point that all maps are acts of persuasion.

The critical question being: Do you support the persuasion a map is attempting to make?

When I teach topic maps again I will make The Power of Maps (1992) required reading.

It is an important lesson to realize that any map, even a topic map, need only map as much of the territory or domain as is sufficient for the task at hand.

A topic map for nuclear physics won’t have much in common with one for war criminals of the George W. Bush and Barack Obama administrations.

Moreover, even topic maps of the same subject domain may or may not merge in a meaningful way.

The idea of useful merger of arbitrary topic maps, like the idea of “objective maps,” is a false one that serves no useful purpose.

Say rather that topic maps can make enough information explicit about subjects to determine if merging will be meaningful to one or more users of a topic map. That alone is quite a feat.

## Invite Government Into The Cellphone Fish Bowl

September 12th, 2016

Sam summarizes the high points from around 200 pages of current, never-before-seen Harris instruction manuals. Good show!

From the post:

Harris declined to comment. In a 2014 letter to the Federal Communications Commission, the company argued that if the owner’s manuals were released under the Freedom of Information Act, this would “harm Harris’s competitive interests” and “criminals and terrorist[s] would have access to information that would allow them to build countermeasures.”

Creating countermeasures?

Better, treat these documents as a basis for reverse-engineering Harris Stingrays into DIY kits.

False promises from known liars on the use of “Stingrays” or “IMSI catchers” are not going to combat government abuse of this technology.

Inviting governments to join the general public in the cellphone fish bowl might.

Can you imagine the reaction of your local sheriff, district attorney, judge, etc. when they are being silently tracked?

Not just in their routine duties but to mistresses, drug dens, prostitutes, porn parlors and the like?

We won’t have to wait long for the arrival of verifiable, secure cellphones.

## Inside the fight to reveal the CIA’s torture secrets [Support The Guardian]

September 12th, 2016

Part one: Crossing the bridge

Part two: A constitutional crisis

Part three: The aftermath

Ackerman captures the drama of a failed attempt by the United States Senate to exercise oversight on the Central Intelligence Agency (CIA) in this series.

I say “failed attempt” because even if the full 6,200+ page report is ever released, the lead Senate investigator, Daniel Jones, obscured the identities of all the responsible CIA personnel and sources of information in the report.

Even if the full report is serialized in your local newspaper, the CIA contractors and staff guilty of multiple felonies, will be not one step closer to being brought to justice.

To that extent, the “full” report is itself a disservice to the American people, who elect their congressional leaders and expect them to oversee agencies such as the CIA.

From Ackerman’s account you will learn that the CIA can dictate to its overseers the location and conditions under which they can view documents, decide which documents they are allowed to see, and, in cases of conflict, spy on the Senate Select Committee on Intelligence.

Does that sound like effective oversight to you?

BTW, you will also learn that members of the “most transparent administration in history” aided and abetted the CIA in preventing an effective investigation into the CIA and its torture program. I use “aided and abetted” deliberately and in their legal sense.

I mention in my header that you should support The Guardian.

This story by Spencer Ackerman is one reason.

Another reason is that given the plethora of names and transfers recited in Ackerman’s story, we need The Guardian to cover future breaks in this story.

Despite the tales of superhuman security, nobody is that good.

I leave you with the thought that if more than one person knows a secret, then it can be discovered.

Check Ackerman’s story for a starting list of those who know secrets about the CIA torture program.

Good hunting!

## United States Treaties [Library of Congress] – Incomplete – Missing Native American Treaties

September 11th, 2016

From the webpage:

We have added the United States Treaty Series, compiled by Charles I. Bevans, to our online digital collection. This collection includes treaties that the United States signed with other countries from 1776 to 1949. The collection consists of 13 volumes: four volumes of multilateral treaties, eight volumes of bilateral treaties and one volume of an index.

Multilateral Treaties

Bilateral Treaties

Charles I. Bevans did not include the treaties with native Americans listed at Treaties Between the United States and Native Americans, part of the Avalon project at Yale Law School, Lillian Goldman Law Library.

The Avalon project lists thirty treaties from 1778 – 1868, along with links to their full texts.

1778
• Treaty With the Delawares

1782
• Chickasaw Peace Treaty Feeler

1784
• Treaty With the Six Nations

1785
• Treaty With the Wyandot, etc.
• Treaty With The Cherokee

1786
• Treaty With the Chocktaw
• Treaty With the Chickasaw
• Treaty With the Shawnee

1789
• Treaty With the Wyandot, etc.
• Treaty With the Six Nations

1790
• Treaty With the Creeks

1791
• Treaty With the Cherokee

1794
• Treaty With the Cherokee
• Treaty With the Six Nations
• Treaty With the Oneida, etc.

1795
• Treaty of Greenville

1805
• Chickasaw Treaty

1816
• Treaty With the Chickasaw

1818
• “Secret” Journal on Negotiations of the Chickasaw Treaty of 1818
• Treaty With the Chickasaw : 1818

1826
• Refusal of the Chickasaws and Choctaws to Cede Their Lands in Mississippi : 1826

1828
• Treaty With The Potawatami, 1828.

1830
• Treaty With the Chickasaw : 1830, Unratified

1832
• Treaty With the Potawatami, 1832.

1852
• Treaty with the Apache, July 1, 1852.

1853
• Treaty with the Comanche, Kiowa, and Apache; July 27, 1853

1865
• Treaty with the Cheyenne and Arapaho; October 14, 1865
• Treaty with the Apache, Cheyenne, and Arapaho; October 17, 1865.

1867
• Treaty With the Kiowa, Comanche, and Apache; October 21, 1867.

1868
• Fort Laramie Treaty : 1868

You should draw your own conclusions about why these treaties were omitted from the Bevans edition. Their omission isn’t mentioned or explained in its preface.

## projectSlam [Public self-protection. Think Trojans.]

September 11th, 2016

projectSlam by Michael Banks.

From the webpage:

Project Slam is an initiative to utilize open source programs, operating systems and tools to aid in defending against nefarious adversaries. The overall focus is to research adversary’s behavior and utilize the data that can be captured to generate wordlists, blacklists, and expose methodologies of various threat actors that can be provided back to the public in a meaningful and useful way…

Partial data for 2016 includes:

A medium interaction honeypot was deployed with a focus on usernames and passwords. While attackers were attacking the honeypot, projectSlam was sucking up the attempts to generate a wordlist of what NOT to make your passwords.

Imagine that! Instead of hoarding information from a vulnerable public, or revealing only the top 10/20 worst passwords, Michael is posting the passwords hackers are looking for online!

Looking forward to more results from projectSlam and cybersecurity projects that enable the public to protect themselves!
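The harvesting step can be sketched in a few lines of Python; the (username, password) pair format here is invented for illustration, not projectSlam’s actual data layout:

```python
from collections import Counter

def build_wordlist(attempts):
    """Tally passwords tried against the honeypot, most common first."""
    counts = Counter(pw for _, pw in attempts)
    return [pw for pw, _ in counts.most_common()]

def is_burned(password, wordlist):
    """A password attackers already guess is not a password worth keeping."""
    return password in set(wordlist)

# (username, password) pairs as a honeypot might capture them
attempts = [("root", "123456"), ("admin", "123456"),
            ("root", "password"), ("test", "admin123")]
wordlist = build_wordlist(attempts)
```

Publishing the full ranked list, as Michael does, lets anyone run the `is_burned` check against their own passwords.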

Contrast a national network of Trojan dispensers versus Trojan representatives catching couples in need of a condom.

Which one is more effective?

Promote cyberself-protection today!

## Watch your Python script with strace

September 11th, 2016

Description:

Modern operating systems sandbox each process inside of a virtual memory map from which direct I/O operations are generally impossible. Instead, a process has to ask the operating system every time it wants to modify a file or communicate bytes over the network. By using operating system specific tools to watch the system calls a Python script is making — using “strace” under Linux or “truss” under Mac OS X — you can study how a program is behaving and address several different kinds of bugs.

Brandon Rhodes does a delightful presentation on using strace with Python.

Slides for Tracing Python with strace or truss.

I deeply enjoyed this presentation, which I discovered while looking at a Python regex issue.

I anticipate running strace on the Python script this week and will report back on any results, or failure to obtain results! (Unlike in academic publishing, experiments and investigations do fail.)
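In the meantime, a short script can boil an strace log down to a per-syscall tally. A sketch, assuming strace’s default output format (e.g. from `strace -f -o trace.log python script.py`):

```python
import re
from collections import Counter

# Matches the leading "name(" of a syscall in default strace output,
# e.g. 'openat(AT_FDCWD, "data.txt", O_RDONLY) = 3'
SYSCALL = re.compile(r'^(?:\[pid \d+\] )?(\w+)\(')

def tally_syscalls(lines):
    """Count syscalls per name from strace output lines."""
    counts = Counter()
    for line in lines:
        m = SYSCALL.match(line)
        if m:
            counts[m.group(1)] += 1
    return counts
```

Feeding it `open("trace.log")` gives a quick picture of whether a script is thrashing on file opens, reads, or network calls before digging into individual lines.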