Another Word For It Patrick Durusau on Topic Maps and Semantic Diversity

September 23, 2016

Are You A Closet Book Burner? Google Crowdsources Censorship!

Filed under: Censorship,Free Speech — Patrick Durusau @ 12:52 pm

YouTube is cleaning up and it wants your help! by Lisa Vaas.

From the post:

Google is well aware that the hair-raising comments of YouTube users have turned the service into a fright fest.

It’s tried to drain the swamp. In February 2015, for example, it created a kid-safe app that would keep things like, oh, say, racist/anti-Semitic/homophobic comments or zombies from scaring the bejeezus out of young YouTubers.

Now, Google’s trying something new: it’s soliciting “YouTube Heroes” to don their mental hazmat suits and dive in to do some cleanup.

You work hard to make YouTube better for everyone… and like all heroes, you deserve a place to call home.

Google has renamed the firemen of Fahrenheit 451 to YouTube Heroes.

Positive names cannot change the fact that censors, by any name, are still censors.

Google has taken censorship to a new level in soliciting the participation of the close-minded, the intolerant, the bigoted, the fearful, etc., from across the reach of the Internet, to censor YouTube.

Google does own YouTube and if it wants to turn it into a pasty gray pot of safe gruel, it certainly can do so.

As censors flood into YouTube, free thinkers, explorers, users who prefer new ideas over pablum, need to flood out of YouTube.

Ad revenue needs to fall as this ill-advised “come be a YouTube censor” campaign succeeds.

Only falling ad revenue will stop this foray into the folly of censorship by Google.

First steps:

  1. Don’t post videos to Google.
  2. Avoid watching videos on Google as much as possible.
  3. Urge others not to post to or use YouTube.
  4. Post videos to other venues.
  5. Speak out against YouTube censorship.
  6. Urge YouTube authors to post/repost elsewhere.

“Safe place” means a place safe from content control at the whim and caprice of governments, corporations and even other individuals.

What’s so hard to “get” about that?

September 22, 2016

Hacker-Proof Code Confirmed [Can Liability Be Far Behind?]

Filed under: Formal Methods,Law,Programming — Patrick Durusau @ 8:56 pm

Hacker-Proof Code Confirmed by Kevin Hartnett.

From the post:

In the summer of 2015 a team of hackers attempted to take control of an unmanned military helicopter known as Little Bird. The helicopter, which is similar to the piloted version long-favored for U.S. special operations missions, was stationed at a Boeing facility in Arizona. The hackers had a head start: At the time they began the operation, they already had access to one part of the drone’s computer system. From there, all they needed to do was hack into Little Bird’s onboard flight-control computer, and the drone was theirs.

When the project started, a “Red Team” of hackers could have taken over the helicopter almost as easily as it could break into your home Wi-Fi. But in the intervening months, engineers from the Defense Advanced Research Projects Agency (DARPA) had implemented a new kind of security mechanism — a software system that couldn’t be commandeered. Key parts of Little Bird’s computer system were unhackable with existing technology, its code as trustworthy as a mathematical proof. Even though the Red Team was given six weeks with the drone and more access to its computing network than genuine bad actors could ever expect to attain, they failed to crack Little Bird’s defenses.

“They were not able to break out and disrupt the operation in any way,” said Kathleen Fisher, a professor of computer science at Tufts University and the founding program manager of the High-Assurance Cyber Military Systems (HACMS) project. “That result made all of DARPA stand up and say, oh my goodness, we can actually use this technology in systems we care about.”

Reducing the verification requirement to a manageable size appears to be the key to DARPA’s success.

That is, rather than verifying the entire program, only critical parts, such as those that keep attackers out, need to be verified.

If this spreads, failure to formally verify critical parts of software would be a natural place to begin imposing liability for poorly written code.

PS: Would formal proof of data integration be a value-add?

Cisco Hunting Report – ISAKMP – 859,233 Vulnerable IPs

Filed under: Cybersecurity,Security — Patrick Durusau @ 8:15 pm

The Vulnerable ISAKMP Scanning Project, courtesy of ShadowServer reports:

This scan is looking for devices that contain a vulnerability in their IKEv1 packet processing code that could allow an unauthenticated, remote attacker to retrieve memory contents, which could lead to the disclosure of confidential information. More information on this issue can be found on Cisco’s site at: https://tools.cisco.com/security/center/content/CiscoSecurityAdvisory/cisco-sa-20160916-ikev1.

The goal of this project is to identify the vulnerable systems and report them back to the network owners for remediation.

Statistics on current run

859,233 distinct IPs have responded as vulnerable to our ISAKMP probe.

(emphasis in the original)

If visuals help:

[Image: ISAKMP vulnerability map, North America]

[Image: ISAKMP vulnerability map, Europe]

I trust your map reading skills are sufficient to conclude that ISAKMP vulnerabilities aren’t common in Iceland and northern Finland. There are more fertile areas for exploration.

[Image: ISAKMP vulnerability map, Iceland and northern Finland]

You can see other land masses or all vulnerable devices.

Is anyone selling ISAKMP scan data?

That would be valuable intel.

Imagine converting it into domain names so c-suite types could cross-check reassurances from their IT departments.
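If you had such data, converting IPs back to domain names is a few lines of work. A minimal sketch in Python using reverse DNS (the addresses below are documentation placeholders, and many hosts will have no PTR record):

import socket

# Hypothetical vulnerable IPs from a scan result
ips = ["192.0.2.10", "198.51.100.23"]

for ip in ips:
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse DNS (PTR) lookup
        print(f"{ip} -> {hostname}")
    except socket.herror:
        print(f"{ip} -> no PTR record")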

Apache Lucene 6.2.1 and Apache Solr 6.2.1 Available [Presidential Data Leaks]

Filed under: Lucene,Searching,Solr — Patrick Durusau @ 10:55 am

Lucene can be downloaded from http://www.apache.org/dyn/closer.lua/lucene/java/6.2.1

Solr can be downloaded from http://www.apache.org/dyn/closer.lua/lucene/solr/6.2.1

If you aren’t using Lucene/Solr 6.2, here’s your chance to grab the latest bug fixes as well!

Data leaks will accelerate as the US presidential election draws to a close.

What’s your favorite tool for analysis and delivery of data dumps?
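As one answer, here is a sketch that pushes a directory of leaked text files into a local Solr core with the third-party pysolr client. The core name, file layout, and dynamic-field suffixes are my assumptions, not anything from the Lucene/Solr release itself:

import glob

import pysolr

# Assumes a Solr core named "leaks" is already running locally
solr = pysolr.Solr("http://localhost:8983/solr/leaks", always_commit=True)

docs = []
for i, path in enumerate(glob.glob("dump/*.txt")):
    with open(path, encoding="utf-8", errors="replace") as f:
        docs.append({"id": str(i), "path_s": path, "text_txt": f.read()})

solr.add(docs)                        # index the documents
results = solr.search("ambassador")   # full-text query
print(f"{len(results)} hits")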

Enjoy!

Google Allo – Goodbye!

Filed under: Cybersecurity,Privacy,Security — Patrick Durusau @ 10:39 am

Google Allo: Don’t use it, says Edward Snowden by Liam Tung.

From the post:

Google’s Allo messaging app and its Assistant bot have finally arrived, but Allo has been slammed for reneging on a promise that it would, by default, make it more difficult to spy on.

Because of the missing privacy feature, NSA-contractor-turned-whistleblower Edward Snowden’s first take of Allo after yesterday’s US launch is that it’s just a honeypot for surveillance.

The main complaints are that security is off by default and that chat logs are stored until deleted by users.

Google made a conscious choice on both of those features.

Now is your opportunity to make a conscious choice about Allo. Goodbye!

Don’t be misled into thinking end-to-end encryption ends the danger from preserving chat logs.

Intelligence agencies have long argued that knowing who calls whom is more important than the content of the phone calls. The same is true for chats.
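A back-of-the-envelope illustration of why: even with no message content at all, a contact graph exposes who talks to whom and who sits at the center of a network. A minimal sketch using the networkx library (the chat pairs are invented):

import networkx as nx

# (sender, recipient) pairs from hypothetical chat logs - no content required
chats = [("alice", "bob"), ("alice", "carol"), ("bob", "carol"),
         ("dave", "alice"), ("eve", "alice")]

g = nx.Graph()
g.add_edges_from(chats)

# Degree centrality flags the hub of the network
for person, score in sorted(nx.degree_centrality(g).items(),
                            key=lambda kv: -kv[1]):
    print(f"{person}: {score:.2f}")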

Google has chosen a side, and it isn’t the consumers’. That’s reason enough to avoid it whenever possible.

September 21, 2016

What Makes A Liar Lie? (Clapper Lying About The Russians)

Filed under: Government,Politics — Patrick Durusau @ 3:45 pm

US intel head suggests Russia behind DNC hacks, says Moscow tried to affect elections in past

From the post:

The US director of national intelligence has suggested Russia is behind the recent hack that saw Democratic National Committee (DNC) records dumped online. The leak undermined the Democrats’ reputation ahead of November’s presidential election.

“It’s probably not real, real clear whether there’s influence in terms of an outcome [of the upcoming elections] – or what I worry about more, frankly – is just the sowing the seeds of doubt, where doubt is cast on the whole [election] process,” James Clapper said on Tuesday evening at an event hosted by the Washington Post, as cited by the Wall Street Journal.

Furthermore, the intelligence chief said Russia and its predecessor the USSR had been adhering to similar practices targeting the US since the 1960s.

“There’s a tradition in Russia of interfering with elections, their own and others.

“[…] It shouldn’t come as a big shock to people. I think it’s more dramatic maybe because now they have the cyber tools,” Clapper is cited as saying.

The comments come in contrast to Clapper’s earlier statements regarding Russia’s alleged connection to the hacking operation, which is believed to have been conducted over more than a year. In July, shortly after the documents had been leaked, he urged an end to the “reactionary mode” of blaming the leak on Russia.
… (emphasis in original)

Do you wonder why Clapper shifted from avoiding a “reactionary mode” of blaming Russia to not only blaming Russia, but claiming a history of Russian interference with United States elections?

I don’t have an email or recorded phone conversation smoking gun, but here’s one possible explanation:

From FiveThirtyEight as of today:

[Image: FiveThirtyEight election forecast, 21 September 2016]

My prediction: The closer the odds become from FiveThirtyEight, the more frantic and far-fetched the lies from James Clapper will become.

Another DNC leak or two (real ones, not the discarded hard drive kind), and Clapper will be warning of Russian influence in county government and school board elections.

PS: If you don’t think Clapper is intentionally lying, when will you break the story that his accounts have lost all connection to a reality shared by others?

Reducing Your “Competition”

Filed under: Cybersecurity,Security — Patrick Durusau @ 10:54 am

Good security practices are a must, whether you live in the Cisco universe or the more mundane realm of drug pushing.

Case in point: Photos On Dark Web Reveal Geo-locations Of 229 Drug Dealers — Here’s How by Swati Khandelwal.

From the post:

It’s a Fact! No matter how smart the criminals are, they always leave some trace behind.

Two Harvard students have unmasked around 229 drug and weapon dealers with the help of pictures taken by criminals and used in advertisements placed on dark web markets.

Do you know each image contains a range of additional hidden data stored within it that can be a treasure to the investigators fighting criminals?

Whatever services you are offering on the Dark Web, here’s an opportunity to reduce the amount of competition you are facing.

Perhaps even a reward from CrimeStoppers, although you need to price shop against your local organization for the better deal.

Failure to scrub Exchangeable Image File Format (EXIF) data lies at the heart of this technique.
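To see what your own images give away, here is a minimal sketch using the Pillow library’s JPEG EXIF accessor (the file names are placeholders, and the scrubbing step assumes a typical RGB photo):

from PIL import Image
from PIL.ExifTags import TAGS

img = Image.open("photo.jpg")   # placeholder path
exif = img._getexif() or {}     # JPEG EXIF accessor; empty if no metadata

for tag_id, value in exif.items():
    # GPSInfo, if present, holds the coordinates investigators used
    print(f"{TAGS.get(tag_id, tag_id)}: {value}")

# To scrub, copy the pixels into a fresh image and save that instead
clean = Image.new(img.mode, img.size)
clean.putdata(list(img.getdata()))
clean.save("photo_clean.jpg")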

See Swati’s post for more details on this “hack.”

Do your civic duty to reduce crime (your competitors) and be rewarded in the process.

Who says cybersecurity can’t be a profit center? 😉

September 20, 2016

Tails [Whatever The Presidential Race Outcome]

Filed under: Politics,Privacy — Patrick Durusau @ 7:58 pm

Tails – The Amnesic Incognito Live System

From the about page:

Tails is a live system that aims to preserve your privacy and anonymity. It helps you to use the Internet anonymously and circumvent censorship almost anywhere you go and on any computer but leaving no trace unless you ask it to explicitly.

Whatever your prediction for the US 2016 presidential election, Hairy Thunderer or Cosmic Muffin, you are going to need Tails.

For free speech and/or privacy in 2017, get Tails.

It really is that simple.
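If you do grab a Tails image, verify it before you trust it. Tails publishes OpenPGP signatures for that purpose; as a quick first check, here is a sketch that compares a SHA-256 digest against a value you obtained out-of-band (file name and expected digest are placeholders):

import hashlib

EXPECTED = "...placeholder digest..."  # published value, obtained out-of-band
path = "tails.iso"                     # placeholder file name

h = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB at a time
        h.update(chunk)

print("OK" if h.hexdigest() == EXPECTED else "MISMATCH - do not use")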

Betraying Snowden:… [Cynical, but not odd]

Filed under: Journalism,News,NSA,Reporting — Patrick Durusau @ 6:30 pm

Betraying Snowden: There’s a special place in journalism hell for The Washington Post editorial board by Daniel Denvir.

From the post:

There is a special place in journalism hell reserved for The Washington Post editorial board now that it has called on President Barack Obama to not pardon National Security Agency whistleblower Edward Snowden.

As Glenn Greenwald wrote, it’s an odd move for a news publication, “which owes its sources duties of protection, and which — by virtue of accepting the source’s materials and then publishing them — implicitly declares the source’s information to be in the public interest.” Notably, the Post decided to “inexcusably omit . . . that it was not Edward Snowden, but the top editors of the Washington Post who decided to make these programs public,” as Greenwald added.

The Post’s peculiar justification is as follows: While the board grudgingly conceded that reporters, thanks to Snowden, revealed that the NSA’s collection of domestic telephone metadata “was a stretch, if not an outright violation, of federal surveillance law,” it condemns him for revealing “a separate overseas NSA Internet-monitoring program, PRISM, that was both clearly legal and not clearly threatening to privacy.”

Washington Post opposition to a pardon for Edward Snowden isn’t odd at all.

Which story generates more PR for the Washington Post:

  1. The Washington Post, having won a Pulitzer prize due to Edward Snowden, joins a crowd calling for his pardon?
  2. The Washington Post, having won a Pulitzer prize due to Edward Snowden, opposes his being pardoned?

It’s not hard to guess which one generates more ad-views and therefore the potential for click-throughs.

I have no problems with the disclosure of PRISM, save for Snowden having to break his word as a contractor to keep his client’s secrets, well, secret.

No one agreeing to be employed by the NSA could be unaware that it engages in illegal and immoral activity on a daily basis.

Although Snowden has done no worse than his former NSA employers, the episode illustrates why I have no trust in government agencies.

If they are willing to lie for what they consider to be “good” reasons to you, then they are most certainly willing to lie to me.

Once it is established that an agency, take the NSA for example, has lied on multiple occasions, on what basis would you trust them to be telling the truth today?

Their assurance, “we’re not lying this time?” That seems rather tenuous.

Same rule should apply to contractors who lie to or betray their clients.

NSA: Being Found Beats Searching, Every Time

Filed under: Searching,Topic Maps — Patrick Durusau @ 4:41 pm

Equation Group Firewall Operations Catalogue by Mustafa Al-Bassam.

From the post:

This week someone auctioning hacking tools obtained from the NSA-based hacking group “Equation Group” released a dump of around 250 megabytes of “free” files for proof alongside the auction.

The dump contains a set of exploits, implants and tools for hacking firewalls (“Firewall Operations”). This post aims to be a comprehensive list of all the tools contained or referenced in the dump.

Mustafa’s post is a great illustration of why “being found beats searching, every time.”

Think of the cycles you would have to spend to duplicate this list. Multiply that by the number of people interested in it. Assuming their time is not valueless, do you start to see the value-add of Mustafa’s post?

Mustafa found each of these items in the data dump and then preserved his finding for the use of others.

It’s not a very big step beyond this preservation to the creation of a container for each of these items, enabling the preservation of other material found on them or related to them.
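As a sketch of what such a container might look like, here is a small record keyed by tool name, so later findings attach to the same entry. The two names appear in the dump; the field layout is mine and purely illustrative:

import json

# One container per found item; new findings merge into the same key
catalog = {
    "EXTRABACON": {"type": "exploit", "target": "firewalls", "notes": []},
    "BANANAGLEE": {"type": "implant", "target": "firewalls", "notes": []},
}

def record(name, note):
    catalog.setdefault(name, {"type": None, "target": None, "notes": []})
    catalog[name]["notes"].append(note)

record("EXTRABACON", "referenced in dump documentation")
print(json.dumps(catalog, indent=2))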

Search is a starting place and not a destination.

Unless you enjoy repeating the same finding process over and over again.

Your call.

September 19, 2016

Stopping Terrorism: Thieves 2, Security Forces 0

Filed under: Government,Security — Patrick Durusau @ 4:44 pm

Murray Weiss, Nicholas Rizzi, Trevor Kapp and Aidan Gardiner document in Thieves Helped Crack the Chelsea Bombing Case, Sources Say how common street thieves thwarted terrorist attacks in New York City and New Jersey.

Albeit inadvertently, thieves prevented a second explosion in Chelsea and multiple explosions in New Jersey.

See Thieves Helped Crack the Chelsea Bombing Case, Sources Say for the full story.

A great illustration that the surveillance state can track people down after they have committed a crime. It is not good at stopping people before they commit one.

So why are we spending $billions on a surveillance state that is outperformed by street thieves?

Reward any thief discovering a terrorist bomb and turning it in with:

[Image: “Get Out of Jail Free” card]

Good for life, non-violent crimes only.

Given the track record of security forces in the United States, a far better investment.

Hackers May Fake Documents, Congress Publishes False Ones

Filed under: Cybersecurity,Government,Government Data — Patrick Durusau @ 12:47 pm

I pointed out in Lions, Tigers, and Lies! Oh My! that Bruce Schneier’s concerns over the potential for hackers faking documents to be leaked pale beside the misinformation distributed by government.

Executive Summary of Review of the Unauthorized Disclosures of Former National Security Agency Contractor Edward Snowden (their title, not mine), is a case in point.

Barton Gellman in The House Intelligence Committee’s Terrible, Horrible, Very Bad Snowden Report leaves no doubt the House Permanent Select Committee on Intelligence (HPSCI) report is a sack of lies.

Not mistakes, not exaggerations, not simply misleading, but actual, factual lies.

For example:


Since I’m on record claiming the report is dishonest, let’s skip straight to the fourth section. That’s the one that describes Snowden as “a serial exaggerator and fabricator,” with “a pattern of intentional lying.” Here is the evidence adduced for that finding, in its entirety.

“He claimed to have obtained a high school degree equivalent when in fact he never did.”

I do not know how the committee could get this one wrong in good faith. According to the official Maryland State Department of Education test report, which I have reviewed, Snowden sat for the high school equivalency test on May 4, 2004. He needed a score of 2250 to pass. He scored 3550. His Diploma No. 269403 was dated June 2, 2004, the same month he would have graduated had he returned to Arundel High School after losing his sophomore year to mononucleosis. In the interim, he took courses at Anne Arundel Community College.

See Gellman’s post for more examples.

All twenty-two members of the HPSCI signed the report. To save you time in the future, here’s a listing of the members of Congress who agreed to report these lies:

Republicans

Democrats

I sorted each group into alphabetical order. The original listings were in an order that no doubt makes sense to fellow rodents but not to the casual reader.

That’s twenty-two members of Congress who are willing to distribute known falsehoods.

Does anyone have an equivalent list of hackers?

Congress.gov Corrects Clinton-Impeachment Search Results

Filed under: Government,Government Data,Searching — Patrick Durusau @ 8:14 am

After posting Congress.gov Search Alert: “…previous total of 261 to the new total of 0.” [Solved] yesterday, pointing out that a change from http:// to https:// altered a search result for Clinton w/in 5 words impeachment, I got an email this morning:

[Image: Congress.gov correction email]

I appreciate the update and correction for saved searches, but my point about remote data changing without notice to you remains valid.

I’m still waiting for word on bulk downloads from both Wikileaks and DC Leaks.

Why leak information vital to public discussion and then limit access to search?

September 18, 2016

Exotic Functional Data Structures: Hitchhiker Trees

Filed under: B-trees,Data Structures,Fractal Trees,Functional Programming — Patrick Durusau @ 8:00 pm

Description:

Functional data structures are awesome–they’re the foundation of many functional programming languages, allowing us to express complex logic immutably and efficiently. There is one unfortunate limitation: these data structures must fit on the heap, limiting their lifetime to that of the process. Several years ago, Datomic appeared as the first functional database that addresses these limitations. However, there hasn’t been much activity in the realm of scalable (gigabytes to terabytes) functional data structures.

In this talk, we’ll first review some of the fundamental principles of functional data structures, particularly trees. Next, we’ll review what a B tree is and why it’s better than other trees for storage. Then, we’ll learn about a cool variant of a B tree called a fractal tree, how it can be made functional, and why it has phenomenal performance. Finally, we’ll unify these concepts to understand the Hitchhiker tree, an open-source functionally persistent fractal tree. We’ll also briefly look at an example API for using Hitchhiker trees that allows your application’s state to be stored off-heap, in the spirit of the 2014 paper “Fast Database Restarts at Facebook”.

David Greenberg (profile)

Hitchhiker Trees (GitHub)

Fast Database Restarts at Facebook by Aakash Goel, Bhuwan Chopra, Ciprian Gerea, Dhrúv Mátáni, Josh Metzler, Fahim Ul Haq, Janet Wiener.

You could have searched for all the information I have included, but isn’t it more convenient to have it “already found?”

Introducing arxiv-sanity

Filed under: Archives,Searching,Similarity — Patrick Durusau @ 7:43 pm

Only a small part of Arxiv appears at: http://www.arxiv-sanity.com/ but it is enough to show the feasibility of this approach.

What captures my interest is the potential to substitute/extend the program to use other similarity measures.
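Swapping in a different similarity measure takes very little code. A sketch using scikit-learn’s TF-IDF vectors and cosine similarity over paper abstracts (the abstracts are stand-ins):

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

abstracts = [
    "We train a recurrent network for language modeling.",
    "A convolutional architecture for image classification.",
    "Recurrent models applied to character-level language tasks.",
]

tfidf = TfidfVectorizer(stop_words="english").fit_transform(abstracts)
sims = cosine_similarity(tfidf)

# Most similar paper to the first one, excluding itself
best = sims[0][1:].argmax() + 1
print(f"closest to paper 0: paper {best} (score {sims[0][best]:.2f})")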

Bearing in mind that searching is only the first step towards the acquisition and preservation of knowledge.

PS: I first saw this in a tweet by Data Science Renee.

Congress.gov Search Alert: “…previous total of 261 to the new total of 0.” [Solved]

Filed under: Government,Government Data,Searching — Patrick Durusau @ 11:03 am

Odd message from the Congress.gov search alert this AM:

[Image: Congress.gov search alert message]

Here’s the search I created back in June, 2016:

[Image: saved search, Clinton within five words of impeachment]

My probably inaccurate recollection is that I was searching for some quote from the impeachment of Bill Clinton and was too lazy to specify a term of Congress, hence:

all congresses – searching for Clinton within five words, impeachment

Fairly trivial search that produced 261 “hits.”

I set the search alert more to explore the search options than any expectation of different future results.

Imagine my surprise to find that all congresses – searching for Clinton within five words, impeachment performed today, results in 0 “hits.”

Suspecting some internal changes to the search interface, I re-entered the search today and got 0 “hits.”

Other saved searches with radically different search results as of today?

This is not, repeat not, the result of some elaborate conspiracy to assist Secretary Clinton in her bid for the presidency.

I do think something fundamental has gone wrong with searching at Congress.gov and it needs to be fixed.

This is an illustration of why Wikileaks, DC Leaks and other data sites should provide easy to access downloads in bulk of their materials.

Providing search interfaces to document collections is a public service, but document collections or access to them can change in ways not transparent to search users. Such as demonstrated by the CIA removing documents previously delivered to the Senate.

Petition Wikileaks, DC Leaks and other data sites for easy bulk downloads.

That will ensure the “evidence” will not shift under your feet and the availability of more sophisticated means of analysis than brute-force search.
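Until bulk downloads arrive, you can at least detect silent changes by snapshotting a query’s results yourself. A minimal sketch (the URL is a placeholder, not a real Congress.gov endpoint):

import hashlib
import time
import urllib.request

URL = "https://example.org/search?q=clinton+impeachment"  # placeholder endpoint

def snapshot(url):
    with urllib.request.urlopen(url) as resp:
        return hashlib.sha256(resp.read()).hexdigest()

baseline = snapshot(URL)
time.sleep(24 * 3600)          # check again tomorrow
if snapshot(URL) != baseline:
    print("results changed - investigate before citing them")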


Update: The change from http:// to https:// by the congress.gov site trashed my saved query, as well as attempts to re-perform the same search using http://.

Using https:// returns the same 261 search results.

What’s your experience with other saved searches at congress.gov?

September 17, 2016

Scalable Vector Graphics (SVG) 2

Filed under: Graphics,SVG — Patrick Durusau @ 8:32 pm

Scalable Vector Graphics (SVG) 2: W3C Candidate Recommendation 15 September 2016

Abstract:

This specification defines the features and syntax for Scalable Vector Graphics (SVG) Version 2. SVG is a language based on XML for describing two-dimensional vector and mixed vector/raster graphics. SVG content is stylable, scalable to different display resolutions, and can be viewed stand-alone, mixed with HTML content, or embedded using XML namespaces within other XML languages. SVG also supports dynamic changes; script can be used to create interactive documents, and animations can be performed using declarative animation features or by using script.

Comments:

Comments on this Candidate Recommendation are welcome. Comments can be sent to www-svg@w3.org, the public email list for issues related to vector graphics on the Web. This list is archived and senders must agree to have their message publicly archived from their first posting. To subscribe send an email to www-svg-request@w3.org with the word subscribe in the subject line.

W3C publishes a Candidate Recommendation to indicate that the document is believed to be stable and to encourage implementation by the developer community. This Candidate Recommendation is expected to advance to Proposed Recommendation no earlier than 15 July 2017, but we encourage early review, and requests for normative changes after 15 November 2016 may be deferred to SVG 3.

15 November 2016 will be here sooner than you realize. Read and comment early and often.
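If you have never written SVG by hand, the syntax is compact enough to generate from a short script. A minimal sketch that writes a small document with a declarative animation (plain SVG that current browsers already render):

svg = """<svg xmlns="http://www.w3.org/2000/svg" width="200" height="200">
  <circle cx="100" cy="100" r="20" fill="steelblue">
    <animate attributeName="r" values="20;60;20" dur="3s"
             repeatCount="indefinite"/>
  </circle>
</svg>"""

with open("pulse.svg", "w") as f:
    f.write(svg)   # open the file in a browser to see the animation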

Enjoy!

Introducing OpenType Variable Fonts

Filed under: Fonts — Patrick Durusau @ 8:13 pm

Introducing OpenType Variable Fonts by John Hudson.

From the post:

Version 1.8 of the OpenType font format specification introduces an extensive new technology, affecting almost every area of the format. An OpenType variable font is one in which the equivalent of multiple individual fonts can be compactly packaged within a single font file. This is done by defining variations within the font, which constitute a single- or multi-axis design space within which many font instances can be interpolated. A variable font is a single font file that behaves like multiple fonts.

There are numerous benefits to this technology. A variable font is a single binary with greatly-reduced comparable file size and, hence, smaller disc footprint and webfont bandwidth. This means more efficient packaging of embedded fonts, and faster delivery and loading of webfonts. The potential for dynamic selection of custom instances within the variations design space — or design-variations space, to use its technical name — opens exciting prospects for fine tuning the typographic palette, and for new kinds of responsive typography that can adapt to best present dynamic content to a reader’s device, screen orientation, or even reading distance.

The technology behind variable fonts is officially called OpenType Font Variations. It has been jointly developed by Microsoft, Google, Apple, and Adobe, in an unprecedented collaborative effort also involving technical experts from font foundries and font tool developers. In addition to specifying the font format additions and revisions, the working group has also committed to the goal of interoperable implementation, defining expected behaviours and test suites for software displaying variable fonts. This should be welcome news to font developers and users, who have often struggled with incompatible implementations of earlier aspects of OpenType that were left to the interpretation of individual software companies.

OpenType Font Variations builds on the model established in Apple’s TrueType GX variations in the mid-1990s, but has fully integrated that model into all aspects of the OpenType format, including OpenType Layout, and is available to both TrueType and Compact Font Format (CFF) flavours of OpenType. This has meant not only the addition of numerous tables to the format, but also revision of many existing tables; these changes are summarised in an appendix to this article, which is intended as an introduction and technological summary, primarily for font makers and font tool developers. The full technical specification for OpenType Font Variations is incorporated into the OpenType specification version 1.8.

John Hudson developed the remarkable SBL BibLit, SBL Greek and SBL Hebrew fonts for biblical studies.

An illustration from John’s post:

[Image: Figure 1. Normalised design space of a 3-axis variable font. Typeface: Kepler, an Adobe Original designed by Robert Slimbach.]

Looking forward to the SBL transitioning its biblical studies font set to this new font technology.
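If you want to poke at a variable font yourself, the fontTools Python library already reads the fvar table that lists a font’s variation axes. A minimal sketch (the file name is a placeholder):

from fontTools.ttLib import TTFont

font = TTFont("MyVariableFont.ttf")   # placeholder path

if "fvar" in font:
    for axis in font["fvar"].axes:    # e.g. wght, wdth, opsz
        print(f"{axis.axisTag}: {axis.minValue} to {axis.maxValue} "
              f"(default {axis.defaultValue})")
else:
    print("not a variable font")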

Lions, Tigers, and Lies! Oh My!

Filed under: Cybersecurity,Government — Patrick Durusau @ 7:49 pm

How Long Until Hackers Start Faking Leaked Documents? by Bruce Schneier.

Bruce writes:


No one is talking about this, but everyone needs to be alert to the possibility. Sooner or later, the hackers who steal an organization’s data are going to make changes in them before they release them. If these forgeries aren’t questioned, the situations of those being hacked could be made worse, or erroneous conclusions could be drawn from the documents. When someone says that a document they have been accused of writing is forged, their arguments at least should be heard.

Really?

Governments, the United States Government in particular, leak false information and documents as a matter of normal business practice. Not to mention corporations and special interest groups that pay for false research (think Harvard, sugar studies) to be published.

In case you missed it, read Inside the fight to reveal the CIA’s torture secrets. In depth analysis of how the CIA not only lied, but destroyed evidence, spied on the U.S. Senate and otherwise misbehaved during an investigation into its torture practices.

That’s just one example. One could fill a multi-volume series with the lies, false documents and fabrications of the current and immediately previous U.S. President.

The argument that torturers were “doing their duty to protect the country” and so merit a pass on accountability I recommend to any future political assassins. See how that plays out in a court of law. Hint: Crimes are crimes, whatever your delusional understanding of “the greater good.”

The easier rule is:

Consider all documents/statements as false unless and until:

  1. You are satisfied of the truth of the document/statement, or
  2. It is to your advantage to treat the document/statement as true.

That covers situations like “fact free” accusations of cyber hacking against the Russians, North Koreans and/or Chinese by the U.S. government.

No “evidence” has been offered for any of those allegations, only vaguely worded rumors circulated among “experts” who are also government contractors. You can imagine the credibility I assign to such sources.

Such contractors could, by happenstance, be telling the truth. Unfortunately, in the absence of any real evidence, only the self-interested in such “truths” or the extremely credulous, crack-pipe users for example, would credit such statements.

How Mapmakers Make Mountains Rise Off the Page

Filed under: Cartography,Graphics,Mapping,Maps,Visualization — Patrick Durusau @ 10:34 am

How Mapmakers Make Mountains Rise Off the Page by Greg Miller.

From the post:

The world’s most beautiful places are rarely flat. From the soaring peaks of the Himalaya to the vast chasm of the Grand Canyon, many of the most stunning sites on Earth extend in all three dimensions. This poses a problem for mapmakers, who typically only have two dimensions to work with.

Fortunately, cartographers have some clever techniques for creating the illusion of depth, many of them developed by trial and error in the days before computers. The best examples of this work use a combination of art and science to evoke a sense of standing on a mountain peak or looking out an airplane window.

One of the oldest surviving maps, scratched onto an earthenware plate in Mesopotamia more than 4,000 years ago, depicts mountains as a series of little domes. It’s an effective symbol, still used today in schoolchildren’s drawings and a smartphone emoji, but it’s hardly an accurate representation of terrain. Over the subsequent centuries, mapmakers made mostly subtle improvements, varying the size and shape of their mountains, for example, to indicate that some were bigger than others.

But cartography became much more sophisticated during the Renaissance. Topographic surveys were done for the first time with compasses, measuring chains, and other instruments, resulting in accurate measurements of height. And mapmakers developed new methods for depicting terrain. One method, called hachuring, used lines to indicate the direction and steepness of a slope. You can see a later example of this in the 1807 map below of the Mexican volcano Pico de Orizaba. Cartographers today refer (somewhat dismissively) to mountains depicted this way as “woolly caterpillars.”

Stunning illusions of depth on maps, techniques for creating depth illusions in two dimensions (think computer monitors), and the history of mapmaking are all reasons to read this post.

What seals it for me is that the quest for the “best” depth illusion continues. It’s not a “solved” problem. (No spoiler, see the post.)

Physical topography to one side, how are you going to bring “depth” to your topic map?

Some resources in a topic map may have great depth and others, unfortunately, may be like Wikipedia articles marked as:

This article has multiple issues.

How do you define and then enable navigation of your topic maps?

September 16, 2016

How-To Discover Pay-to-Play Appointment Pricing

Filed under: Government,Politics — Patrick Durusau @ 3:55 pm

You have seen one or more variations on:

This Is How Much It ‘Costs’ To Get An Ambassadorship: Guccifer 2.0 Leaks DNC ‘Pay-To-Play’ Donor List

DNC Leak Exposes Pay to Play Politics, How the Clinton’s REALLY Feel About Obama

CORRUPTION! Obama caught up in Pay for Play Scandal, sold every job within his power to sell.

You may be wondering why CNN, the New York Times and the Washington Post aren’t all over this story?

While selling public offices surprises some authors, whose names I omitted out of courtesy to their families, selling offices is a regularized activity in the United States.

So regularized that immediately following each presidential election, the Government Printing Office publishes United States Government Policy and Supporting Positions (the “Plum Book”); the 2012 edition lists the 9,000-odd positions that are subject to presidential appointment.

From the description of the 2012 edition:

Every four years, just after the Presidential election, “United States Government Policy and Supporting Positions” is published. It is commonly known as the “Plum Book” and is alternately published between the House and Senate.

The Plum Book is a listing of over 9,000 civil service leadership and support positions (filled and vacant) in the Legislative and Executive branches of the Federal Government that may be subject to noncompetitive appointments, or in other words by direct appointment.

These “plum” positions include agency heads and their immediate subordinates, policy executives and advisors, and aides who report to these officials. Many positions have duties which support Administration policies and programs. The people holding these positions usually have a close and confidential relationship with the agency head or other key officials.

Even though the 2012 “plum” book is currently on sale for $19.00 (usual price is $38.00), given that a new one will appear later this year, consider using the free online version at: Plum Book 2012.

[Image: Plum Book 2012]

The online interface is nothing to brag on. You have to select filters and then “find” to obtain further information on positions. Very poor UI.

However, if under title you select “Chief of Mission, Monaco” and then select “find,” the resulting screen looks something like this:

[Image: Plum Book search result for “Chief of Mission, Monaco”]

To your far right there is a small arrow that, if selected, takes you to the details:

[Image: position details for Chief of Mission, Monaco]

If you were teaching a high school civics class, the question would be:

How much did Charles Rivkin have to donate to obtain the position of Chief of Mission, Monaco?

FYI, the CIA World FactBook gives this brief description for Monaco:

Monaco, bordering France on the Mediterranean coast, is a popular resort, attracting tourists to its casino and pleasant climate. The principality also is a banking center and has successfully sought to diversify into services and small, high-value-added, nonpolluting industries.

Unlike the unhappy writers that started this post, you would point the class to: Transaction Query By Individual Contributor at the Federal Election Commission site.

Enter the name Rivkin, Charles and select “Get Listing.”

Rivkin’s contributions are broken into categories and helpfully summed to assist you in finding the total.

Contributions to All Other Political Committees Except Joint Fundraising Committees – $72399.00

Joint Fundraising Contributions – $22300.00

Recipient of Joint Fundraiser Contributions – $36052.00

Caution: There is an anomalous Rivkin in that last category, contributing $40 to Donald Trump. For present purposes, I would subtract that $40 from the $130,751 sum, leaving a grand total of:

$130,711 to be the Chief of Mission, Monaco.

Realize that this was not a lump sum payment but a steady stream of contributions starting in the year 2000.

Using the Transaction Query By Individual Contributor resource, you can correct stories that claim:

Jane Hartley paid DNC $605,000 and then was nominated by Obama to serve concurrently as the U.S. Ambassador to the French Republic and the Principality of Monaco.

[Image: Jane Hartley]

(from: This Is How Much It ‘Costs’ To Get An Ambassadorship: Guccifer 2.0 Leaks DNC ‘Pay-To-Play’ Donor List)

If you run the FEC search you will find:

Contributions to Super PACs, Hybrid PACs and Historical Soft Money Party Accounts – $5000.00

Contributions to All Other Political Committees Except Joint Fundraising Committees – $516609.71

Joint Fundraising Contributions – $116000.00

Grand total: $637,609.71.

So, $637,609.71, not $605,000.00, but again a series of contributions starting in 1997, not one lump sum.
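The same totals can be pulled programmatically. A sketch against the FEC’s openFEC API using the requests library — the endpoint and field name are my reading of the API documentation, and DEMO_KEY is rate-limited:

import requests

resp = requests.get(
    "https://api.open.fec.gov/v1/schedules/schedule_a/",
    params={
        "api_key": "DEMO_KEY",                # rate-limited demo key
        "contributor_name": "Hartley, Jane",  # name as filed
        "per_page": 100,
    },
)
resp.raise_for_status()

total = sum(row.get("contribution_receipt_amount") or 0
            for row in resp.json()["results"])
print(f"first page total: ${total:,.2f}")     # paginate for the full sum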

You don’t have to search discarded hard drives to get pay-to-play appointment pricing. It’s all a matter of public record.

PS: I’m not sure how accurate or complete Nominations & Appointments (White House) may be, but it’s an easier starting place for current appointees than the online Plum Book.

PPS: Estimated pricing for “Plum” book positions could be made more transparent. Not a freebie. Let me know if you are interested.

Android Hacking – $200K First Prize – Other Offers?

Filed under: Cybersecurity,Security — Patrick Durusau @ 10:46 am

Announcing the Project Zero Prize by Natalie Silvanovich.

Before reading the “official” post, consider this Dilbert cartoon.

Same logic applies here:

How to compare alternatives? ($200K sets a minimum bid.)

Potential for repeat business?

For a pwn of any Android phone, $200K sounds a bit “lite.”

Watch the Android issue tracker. A third-party bidder won’t insist on you using only your reported bugs in an exploit chain.

Before anyone gets indignant, the NSA, CIA, the “Russians,” Chinese, Mossad, etc., will all be watching as well. Think of it as having “governmental” ethics.

From the post:

Despite the existence of vulnerability rewards programs at Google and other companies, many unique, high-quality security bugs have been discovered as a result of hacking contests. Hoping to continue the stream of great bugs, we’ve decided to start our own contest: The Project Zero Prize.

The goal of this contest is to find a vulnerability or bug chain that achieves remote code execution on multiple Android devices knowing only the devices’ phone number and email address. Successful submissions will be eligible for the following prizes.

First Prize

$200,000 USD, awarded to the first winning entry.

Second Prize

$100,000 USD, awarded to the second winning entry.

Third Prize

At least $50,000 USD awarded by Android Security Rewards, awarded to additional winning entries.

In addition, participants who submit a winning entry will be invited to write a short technical report on their entry, which will be posted on the Project Zero Blog.

Contest Structure

This contest will be structured a bit differently than other contests. Instead of saving up bugs until there’s an entire bug chain, and then submitting it to the Project Zero Prize, participants are asked to report the bugs in the Android issue tracker. They can then be used as a part of submission by the participant any time during the six month contest period. Only the first person to file a bug can use it as a part of their submission, so file early and file often! Of course, any bugs that don’t end up being used in a submission will be considered for Android Security Rewards and any other rewards program at Google they might be eligible for after the contest has ended.

In addition, unlike other contests, the public sharing of vulnerabilities and exploits submitted is paramount. Participants will submit a full description of how their exploit works with their submission, which will eventually be published on the Project Zero blog. Every vulnerability and exploit technique used in each winning submission will be made public.

Full contest rules

Frequently asked questions

Contest period:

The Contest begins at 12:00:00 A.M. Pacific Time (PT) Zone in the United States on September 13, 2016 and ends at 11:59:59 P.M. PT on March 14, 2017 (“Contest Period”).

Good hunting!

PS: If possible, post the paid price for your exploit to help set the market price for future such exploits.

If It’s Good Enough For Colin Powell…

Filed under: Cybersecurity,Security — Patrick Durusau @ 8:20 am

Some security advice for Colin Powell to better protect his Gmail account by Graham Cluley.

Graham posted webmail security advice for Colin Powell after 26 months’ worth of his private emails were leaked by DC Leaks.

Nothing surprising for my readers but pass it on to the c-suite types.

You can search and view Powell’s emails at DC Leaks / Colin Luther Powell.

Graham omits any link to DC Leaks and says:


Of course, the emails aren’t just embarrassing and damaging for the privacy of Colin Powell – they are also potentially humiliating for the people he was corresponding with, who have had their own private conversations exposed to the world.

Oh, the horror! Invasions of privacy!

You mean like the millions of ordinary people who aren’t secure in their phone calls, emails, web browsing, banking, credit histories, etc., all the time?

The extremely privileged getting nicked every now and again doesn’t trouble me.

“Oversight” hasn’t protected our freedoms, perhaps constant and detailed exposure of the privileged will. Worth a shot!

September 15, 2016

Guccifer 2.0 – 13Sept2016 Leak – A Reader’s Guide (Part 2) [Discarded Hard Drive?]

Filed under: Government,Politics — Patrick Durusau @ 4:52 pm

Guccifer 2.0‘s latest release of DNC documents is generally described as:

In total, the latest dump contains more than 600 megabytes of documents. It is the first Guccifer 2.0 release to not come from the hacker’s WordPress account. Instead, it was given out via a link to the small group of security experts attending the London conference. Guccifer 2.0 drops more DNC docs by Cory Bennett.

The “600 megabytes of documents” is an attention grabber, but how much of that 600 megabytes is useful and/or interesting?

The answer turns out to be, not a lot.


Here’s an overview of the directories and files:

/CIR

Financial investment data.

/CNBC

Financial investment data.

/DNC

Redistricting documents.

/DNCBSUser

One file with fields of VANDatabaseCode StateID VanID cons_id?

/documentation

A large amount of documentation for “IQ8,” apparently address cleaning software. Possibly useful if you want to know address cleaning rules from eight years ago.

/DonorAnalysis

Sounds promising but is summary data based on media markets.

/early

Early voting analysis.

/eday

Typical election voting analysis, from 2002 to 2008.

/FEC

Duplicates of FEC filings. Checking the .csv file, the data is from 2008. BTW, you can find data of the same type from 2008 and later at: http://fec.gov.

/finance

More duplicates of FEC filings. 11-26-08 NFC Members Raised.xlsx (no credit cards) – dated, but 453 names with contacts, amounts raised, etc.

/HolidayCards

Holiday card addresses, these are typical:

holiday_list_noproblems.txt
holidaycards.mdb
morethanonename.xls

/jpegs

Two jpegs were included in the dump.

/marketing

Lists of donors.

DNC union_05-09.txt
DNCunion0610.txt
GDSA11A.CSV
November VF EOC – MEYER.txt
dem0702a[1].zip
dem977.txt
dem978.txt
dem979.txt
dem980.txt
dem981.txt
dem982.txt
dem9A3_NGP.txt
dem9A6_NGP.txt
dnc_harris_eoc_nov09_canvass.zip – password protected
dsg.txt
gsi.txt
harris.txt
marketing_phones.txt
ofa_actives_non-donor.csv
tm_files.txt

/May-FEC

Grepping looks like May, 2009 data for the FEC.

/newmedia

More donor lists.

20090715_new_synetech_emails.csv
emails_w_contactinfo.txt
ofa_email_export.zip

/pdfs

IT hosting proposals.

/Reports for Kaine

Various technology memos

/security

IT security reports

/stuffformike/WH/

Contacts not necessarily in FEC records

Contact List-Complete List.xlsx – Contact list with emails and phone numbers (no credit cards)
WH Staff 2010.xlsx – Names but no contact details


The data is eight (8) years old. Do you have the same phone number you did eight (8) years ago?

Guccifer 2.0 makes no claim on their blog for ownership of this leak.

A “hack” that results in eight-year-old data, most of which is more accessible at http://fec.gov?

No, this looks more like a discarded hard drive that was harvested and falsely labeled as a “hack” of the DNC.

Unless Guccifer 2.0 says otherwise on their blog, you have better things to do with your time.

PS: You don’t need old hard drives to discover pay-to-play purchases of public appointments. Check back tomorrow for: How-To Discover Pay-to-Play Appointment Pricing.

September 14, 2016

Guccifer 2.0 – 13Sept2016 Leak – A Reader’s Guide (Part 1)

Filed under: Government,Politics — Patrick Durusau @ 9:00 pm

Guccifer 2.0 dropped a new bundle of DNC documents on September 13, 2016! Like most dumps, there was no accompanying guide to make use of that dump easier. 😉 Not a criticism, just an observation.

As a starting point to make your use of that dump a little easier, I am posting an ls -lR listing of all the files in that dump, post extraction with 7z and unrar. Guccifer2.0-13Sept2016-filelist.txt.

I’m working on a list of the files most likely to be of interest. Look for that tomorrow.

I can advise that no credit card numbers were included in this dump.

Using:

# Recursively search .txt files for digit runs shaped like Visa/MasterCard/Amex/Discover numbers
grep --color -H -rn --include="*.txt" '\([345]\{1\}[0-9]\{3\}\|6011\)\{1\}[ -]\?[0-9]\{4\}[ -]\?[0-9]\{2\}[-]\?[0-9]\{2\}[ -]\?[0-9]\{1,4\}'

I checked all the .txt files for credit card numbers. (I manually checked the .xls/.xlsx files.)

There were “hits” but those were in Excel exports of vote calculations. Funny how credit card numbers don’t ever begin with “0.” as a prefix.

Since valid credit card numbers vary in length, I don’t know of an easy way to avoid that issue. So inspection of the files it was.
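One way to cut down the manual inspection is to run each candidate match through the Luhn checksum, which every valid card number must satisfy; most false positives fail it (a random digit string passes only about one time in ten). A short sketch:

def luhn_ok(number: str) -> bool:
    """True if the digit string passes the Luhn checksum."""
    digits = [int(c) for c in number if c.isdigit()]
    if len(digits) < 13:        # real card numbers are 13-19 digits
        return False
    total = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

print(luhn_ok("4111 1111 1111 1111"))  # True - well-known test number
print(luhn_ok("4111 1111 1111 1112"))  # False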

September 13, 2016

Investigatory Powers Bill As Amended In Committee

Filed under: Government,Privacy — Patrick Durusau @ 7:31 pm

For those of you watching the UK’s plunge into darkness, the Investigatory Powers Bill, as amended in committee, has been posted online.

Apologies for the lite amount of posting today but a very large data dump was released earlier today that distracted me from posting. 😉

FPCasts

Filed under: Functional Programming — Patrick Durusau @ 7:04 pm

FPCasts – Your source for Functional Programming Related Podcasts

Ten (10) sources of podcasts, with a link to the latest podcast from each source.

Without notice to the reader, the main link to each podcast series is a link to an RSS file.

Not a problem but took me by surprise on my first visit.

As useful as this will be, indexed podcasts where you could jump to a subject of interest would be even better.

Enjoy!

September 12, 2016

R Weekly

Filed under: Programming,R — Patrick Durusau @ 8:47 pm

R Weekly

A new weekly publication of R resources that began on 21 May 2016 with Issue 0.

Mostly titles of posts and news articles, which is useful, but not as useful as short summaries, including the author’s name.

Persuasive Cartography

Filed under: Cartography,Mapping,Maps,Persuasion — Patrick Durusau @ 8:12 pm

Vintage Infodesign [161]: More examples of persuasive cartography, diagrams and charts from before 1960 by Tiago Veloso.

From the post:

A recurrent topic here on Vintage InfoDesign is “persuasive cartography” – the use of maps to influence and in many cases, deceive. We showcased examples of these maps here and here, with a special mention to the PJ Mode Collection at Cornell University Library. The collection was donated to Cornell back in 2014, and until now more than 300 examples are available online in high resolution.

A must for all of those interested in the subject, and we picked a few examples to open this post, courtesy of Allison Meier, who published a recent article about the PJ Mode Collection over at Hyperallergic.

[Image: “New Black Plague” persuasive map]

Re-reading The Power of Maps (1992) by Denis Wood, in preparation to read Rethinking The Power of Maps (2010), also by Denis Wood, has made me acutely aware of aspersions such as:

“persuasive cartography” – the use of maps to influence and in many cases, deceive.

I say “aspersion” because Wood makes the case that all maps, with no exceptions, are the results of omissions, characterizations, enhancements, emphasis on some features and not others, for stated and/or unstated purposes.

Indeed, all of The Power of Maps (1992) is devoted to teasing out, with copious examples, how a user of a map may fail to recognize that the “truth” of any map is a social construct in a context shaped by factors known and unknown.

I characterize maps I disagree with as deceptive, disingenuous, inaccurate, etc., but that doesn’t take away from Wood’s central point that all maps are acts of persuasion.

The critical question being: Do you support the persuasion a map is attempting to make?

When I teach topic maps again I will make The Power of Maps (1992) required reading.

It is an important lesson to realize that any map, even a topic map, need only map as much of the territory or domain as is sufficient for the task at hand.

A topic map for nuclear physics won’t have much in common with one for war criminals of the George W. Bush and Barack Obama administrations.

Moreover, even topic maps of the same subject domain, may or may not merge in a meaningful way.

The idea of useful merger of arbitrary topic maps, like the idea of “objective maps,” is a false one that serves no useful purpose.

Say rather that topic maps can make enough information explicit about subjects to determine if merging will be meaningful to one or more users of a topic map. That alone is quite a feat.

Invite Government Into The Cellphone Fish Bowl

Filed under: Cybersecurity,Privacy,Security — Patrick Durusau @ 4:25 pm

Long-Secret Stingray Manuals Detail How Police Can Spy On Phones by Sam Biddle.

Sam summarizes the high points from around 200 pages of current but never-before-seen Harris instruction manuals. Good show!

From the post:


Harris declined to comment. In a 2014 letter to the Federal Communications Commission, the company argued that if the owner’s manuals were released under the Freedom of Information Act, this would “harm Harris’s competitive interests” and “criminals and terrorist[s] would have access to information that would allow them to build countermeasures.”

Creating countermeasures?

Better, treat these documents as a basis for reverse-engineering Harris Stingrays into DIY kits.

False promises from known liars about the use of “Stingrays” or “IMSI catchers” are not going to combat government abuse of this technology.

Inviting governments to join the general public in the cellphone fish bowl might.

Can you imagine the reaction of your local sheriff, district attorney, judge, etc. when they are being silently tracked?

Not just in their routine duties but to mistresses, drug dens, prostitutes, porn parlors and the like?

We won’t have to wait long for the arrival of verifiable, secure cellphones.

