Another Word For It Patrick Durusau on Topic Maps and Semantic Diversity

December 26, 2018

Hacker Digest – Volume 21 Released!

Filed under: Cybersecurity,Hacking — Patrick Durusau @ 8:23 pm

Volume 21 of the Hacker Digest Released

From the post:

Volume 21 of The Hacker Digest is now out. If you’re a lifetime digital subscriber, you will have already received this edition. Volume 21 is comprised of issues from 2004, our 20th anniversary and a year where we embraced propaganda, at least on all of our covers. It was a time of soul searching in the hacker community, the year of The Fifth HOPE, and a changing country.

You can click here to buy Volume 21 or become a lifetime digital subscriber here. If you do the latter, you will receive digital copies of everything we have published to date, plus everything that we publish in the future. We have now digitized 31 out of our 34 years.

If you also want paper copies, we have a special offer here. And if you’re an existing paper lifetime subscriber who wants to upgrade to digital at a discounted rate, just click here.

The current list price is $260 for a lifetime digital subscription. A real bargain considering many of the hacks are still viable today.

Practical Gremlin – An Apache TinkerPop Tutorial

Filed under: Gremlin,TinkerGraph,TinkerPop — Patrick Durusau @ 4:06 pm

Practical Gremlin – An Apache TinkerPop Tutorial by Kelvin R. Lawrence.

From the webpage:

This book is a work in progress. Feedback (ideally via issue) is very much encouraged and welcomed!

The title of this book could equally well be “A getting started guide for users of graph databases and the Gremlin query language featuring hints, tips and sample queries”. It turns out that is a bit too long to fit on one line as a heading, but as a single sentence it describes the focus of this book pretty well.

The book introduces the Apache TinkerPop 3 Gremlin graph query and traversal language via real examples against a real world graph. They are given as a set of working examples against a graph that is also provided in the sample-data folder. The graph, air-routes.graphml, is a model of the world airline route network between 3,367 airports including 43,160 routes. The examples we present will work unmodified with the air-routes.graphml file loaded into the Gremlin console running with a TinkerGraph.
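The flavor of those traversals can be sketched in plain Python over a tiny, made-up route map (the book's real examples run in the Gremlin console against air-routes.graphml; the airport codes and routes below are illustrative only):

```python
# Toy route map standing in for the air-routes graph: each key is an
# airport code, each value the airports reachable by a direct flight.
routes = {
    "AUS": ["DFW", "MEX"],
    "DFW": ["LHR", "JFK"],
    "MEX": ["LHR"],
    "LHR": ["SYD"],
}

def two_hop_destinations(origin):
    """Airports reachable in exactly two hops -- roughly the shape of a
    Gremlin traversal like g.V().has('code', origin).out().out()."""
    return sorted({dest
                   for mid in routes.get(origin, [])
                   for dest in routes.get(mid, [])})

print(two_hop_destinations("AUS"))  # ['JFK', 'LHR']
```

The Gremlin version does the same walk declaratively, step by step, which is exactly what the book teaches against the full 3,367-airport graph.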

What do you think? Is “A getting started guide for users of graph databases and the Gremlin query language featuring hints, tips and sample queries” too long for a title? Perhaps not for a German dissertation (where it would be too short), but for a web title? I suspect Lawrence is right.

Still, at 400 pages with more content to be added, it won’t be a quick read. An enjoyable one, but not a quick one! Be sure to give feedback as issues if your New Year starts off with this book.

December 24, 2018

Intel Neural Compute Stick 2

Filed under: Neural Information Processing,Neural Networks — Patrick Durusau @ 3:20 pm

Intel Neural Compute Stick 2 (Mouser Electronics)

From the webpage:

Intel® Neural Compute Stick 2 is powered by the Intel® Movidius™ X VPU to deliver industry leading performance, wattage, and power. The Neural Compute Stick 2 supports OpenVINO™, a toolkit that accelerates solution development and streamlines deployment. The Neural Compute Stick 2 offers plug-and-play simplicity, support for common frameworks and out-of-the-box sample applications. Use any platform with a USB port to prototype and operate without cloud compute dependence. The Intel NCS 2 delivers 4 trillion operations per second with 8X performance boost over previous generations.

At $99 (US) with a USB stick form factor, the Intel® Neural Compute Stick 2 makes a great gift any time of the year. Not to mention offering the opportunity to test your hacking skills on the “out-of-the-box sample applications,” the ones you are most likely to see in the wild.

Enjoy!

December 6, 2018

Teaching Cybersecurity Law and Policy (Chesney) [Cui Bono?]

Filed under: Cybersecurity,Law — Patrick Durusau @ 11:43 am

Teaching Cybersecurity Law and Policy: My Revised 62-Page Syllabus/Primer by Robert Chesney.

From the post:

Cybersecurity law and policy is a fun subject to teach. There is vast room for creativity in selecting topics, readings and learning objectives. But that same quality makes it difficult to decide what to cover, what learning objectives to set, and which reading assignments to use.

With support from the Hewlett Foundation, I’ve spent a lot of time in recent years wrestling with this challenge, and last spring I posted the initial fruits of that effort in the form of a massive “syllabus” document. Now, I’m back with version 2.0.

Here’s the document.

At 62 pages (including a great deal of original substantive content, links to readings, and endless discussion prompts), it is probably most accurate to describe it as a hybrid between a syllabus and a textbook. Though definitely intended in the first instance to benefit colleagues who teach in this area or might want to do so, I think it also will be handy as a primer for anyone—practitioner, lawyer, engineer, student, etc.—who wants to think deeply about the various substrands of this emergent field and how they relate to one another.

Feel free to make use of this any way you wish. Share it with others who might enjoy it (or at least benefit from it), and definitely send me feedback if you are so inclined (rchesney@law.utexas.edu or @bobbychesney on Twitter).

The technical side of the law is deeply fascinating and perhaps even more so in cybersecurity. It’s worth noting that Chesney does a great job laying out normative law as a given.

You are not going to find an analysis of the statutes cited to identify who benefits or is penalized by those statutes. You know the adage about laws that prohibit the rich and the poor equally from sleeping under bridges? The same applies to cybersecurity statutes. They are always presented as fair and accomplished public policies. Nothing could be further from the truth.

That’s not a criticism of Chesney’s syllabus; the technical side of existing laws is quite a lucrative one for anyone who masters its complexities, and it is certainly a worthy subject for study. I mention looking behind laws, as it were, to promote an awareness that shaping the winners and losers encoded in laws also merits your attention.

Cybersecurity laws have adversely impacted security researchers, as these suggested steps for reducing your liability when disclosing a vulnerability show:

  • Don’t ask for money in exchange for keeping vulnerability information quiet. Researchers have been accused of extortion after saying they would reveal the vulnerability unless the company wants to pay a finder’s fee or enter into a contract to fix the problem. See, e.g. GameSpy warns security researcher
  • If you are under a non-disclosure agreement, you may not be allowed to publish. Courts are likely to hold researchers to their promises to maintain confidentiality.
  • You may publish information to the general public, but do not publish directly to people you know intend to break the law.
  • Consider disclosing to the vendor or system administrator first and waiting a reasonable and fair amount of time for a patch before publishing to a wider audience.
  • Consider having a lawyer negotiate an agreement with the company under which you will provide details about the vulnerability—thus helping to make the product better—in exchange for the company’s agreement not to sue you for the way you discovered the problem.
  • Consider the risks and benefits of describing the flaw with proof-of-concept code, and whether that code could describe the problem without unnecessarily empowering an attacker.
  • Consider whether your proof of concept code is written or distributed in a manner that suggests it is “primarily” for the purpose of gaining unauthorized access or unlawful data interception, or marketed for that purpose. Courts look both to the attributes of the tool itself as well as the circumstances surrounding the distribution of that tool to determine whether it would violate such a ban.
  • Consider whether to seek advance permission to publish, even if getting it is unlikely.
  • Consider how to publish your advisory in a forum and manner that advances the state of knowledge in the field.
  • Do not publish in a manner that enables or a forum that encourages copyright infringement, privacy invasions, computer trespass or other offenses.

The oppression of independent security researchers in cybersecurity law is fairly heavy-handed, but there are subtleties and nuances that lie deeper in the interests that drove the drafting of such legislation.

Fairly obvious, but have you noticed there is no liability for faulty software? The existence of EULAs, waivers of liability, is a momentary diversion. It is a rare case when a court finds such agreements enforceable outside the context of software.

The discovery and publication of vulnerabilities, should vendors not fix them in a timely fashion, would raise serious questions about vendors’ “gross negligence” in failing to fix them, and thence open greater avenues for attacking EULAs.

Not only are major software vendors bastards, but they are clever bastards as well.

That’s only one example of an unlimited number once you ask cui bono? (who benefits?) of any law.

In a world where governments treat the wholesale slaughter of millions of people of color and condemning of millions to lives of deprivation and want as “business as usual,” you may ask, what obligation is there to obey any cybersecurity or other law?

Your obligation to obey any law is a risk assessment of the likelihood of a sovereign attributing a particular act to you. The better your personal security, the greater the range of behavior choices you have.

Basic Text [Leaked Email] Processing in R

Filed under: R,Text Mining — Patrick Durusau @ 10:08 am

Basic Text Processing in R by Taylor Arnold and Lauren Tilton.

From Learning Goals:

A substantial amount of historical data is now available in the form of raw, digitized text. Common examples include letters, newspaper articles, personal notes, diary entries, legal documents and transcribed speeches. While some stand-alone software applications provide tools for analyzing text data, a programming language offers increased flexibility to analyze a corpus of text documents. In this tutorial we guide users through the basics of text analysis within the R programming language. The approach we take involves only using a tokenizer that parses text into elements such as words, phrases and sentences. By the end of the lesson users will be able to:

  • employ exploratory analyses to check for errors and detect high-level patterns;
  • apply basic stylometric methods over time and across authors;
  • approach document summarization to provide a high-level description of the
    elements in a corpus.
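The tutorial works in R, but its first step, tokenizing raw text and inspecting word frequencies, can be sketched in a few lines of Python (the sample sentence below is invented for illustration):

```python
# Minimal tokenizer: lowercase the text, split on non-letter characters,
# and drop empty strings -- the same first step the tutorial takes in R.
import re
from collections import Counter

def tokenize(text):
    """Return a list of lowercase word tokens from raw text."""
    return [t for t in re.split(r"[^a-z']+", text.lower()) if t]

speech = "The state of the union is strong. The union endures."
tokens = tokenize(speech)
print(Counter(tokens).most_common(2))  # [('the', 3), ('union', 2)]
```

From frequency counts like these, the exploratory checks and stylometric comparisons the lesson describes are a short step away.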

The tutorial uses United States Presidential State of the Union Addresses, yawn, as its dataset.

Great tutorial but aren’t there more interesting datasets to use as examples?

Modulo that I haven’t prepared such a dataset or matched it to a tutorial such as this one.

Question: What would make a more interesting dataset than United States Presidential State of the Union Addresses?

“Anything” is not a helpful answer.

Suggestions?

December 5, 2018

Open Letter to NRCC Hackers

Filed under: Cybersecurity,Government,Hacking,Politics,Wikileaks — Patrick Durusau @ 11:04 am

We have never met or communicated but I wanted to congratulate you on the hack of top NRCC officials in 2018. Good show!

I’m sure you remember the drip-drip-drip release technique used by WikiLeaks with the Clinton emails. I had to check the dates, but the first batch was in early October 2016, before the presidential election in November 2016.

The weekly release cycle, with the prior publicity concerning the leak, kept both alternative and mainstream media on the edge of climaxing every week, even though the emails themselves were mostly the office gossip and pettiness found in any office email system.

The most obvious target event for weekly drops of the NRCC emails is the 2020 election but that is subject to change.

Please consider the WikiLeaks partial release tactic, which transformed office gossip into front-page news, when you select a target event for releasing the NRCC emails.

Your public service in damaging the NRCC will go unrewarded but not unappreciated. Once again, good show!

December 4, 2018

Bulk US Congress Bills, Laws in XML

Filed under: Government,Government Data,Law,Legal Informatics,XML — Patrick Durusau @ 8:47 am

GPO Makes Documents Easy To Download and Repurpose in New XML Format

From the news release:

The U.S. Government Publishing Office (GPO) makes available a subset of enrolled bills, public and private laws, and the Statutes at Large in Beta United States Legislative Markup (USLM) XML, a format that makes documents easier to download and repurpose. The documents available in the Beta USLM XML format include enrolled bills and public laws beginning with the 113th Congress (2013) and the Statutes at Large beginning with the 108th Congress (2003). They are available on govinfo, GPO’s one-stop site to authentic, published Government information. https://www.govinfo.gov/bulkdata.

The conversion of legacy formats into Beta USLM XML will provide a uniform set of laws for the public to download. This new format maximizes the number of ways the information can be used or repurposed for mobile apps or other digital or print projects. The public will now be able to download large sets of data in one click rather than downloading each file individually, saving significant time for developers and others who seek to repurpose the data.

GPO is collaborating with various legislative and executive branch organizations on this project, including the Office of the Clerk of the House, the Office of the Secretary of the Senate, and the Office of the Federal Register. The project is being done in support of the Legislative Branch Bulk Data Task Force which was established to examine the increased dissemination of Congressional information via bulk data download by non-Governmental groups for the purpose of supporting openness and transparency in the legislative process.

“Making these documents available in Beta USLM XML is another example of how GPO is meeting the technological needs of Congress and the public,” said GPO Acting Deputy Director Herbert H. Jackson, Jr. “GPO is committed to working with Congress on new formats that provide the American people easy access to legislative information.”

GPO is the Federal Government’s official, digital, secure resource for producing, procuring, cataloging, indexing, authenticating, disseminating, and preserving the official information products of the U.S. Government. The GPO is responsible for the production and distribution of information products and services for all three branches of the Federal Government, including U.S. passports for the Department of State as well as the official publications of Congress, the White House, and other Federal agencies in digital and print formats. GPO provides for permanent public access to Federal Government information at no charge through www.govinfo.gov and partnerships with approximately 1,140 libraries nationwide participating in the Federal Depository Library Program. For more information, please visit www.gpo.gov.

Not that I have lost any of my disdain and distrust for government, but when any government does something good, they should be praised.

Making “enrolled bills, public and private laws, and the Statutes at Large” available in Beta United States Legislative Markup (USLM) XML is a step toward tracing and integrating legislation with those it benefits.
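“Repurposing” here can be as simple as pulling structure out of the XML with a few lines of Python. The element names in this snippet are illustrative stand-ins, not the actual USLM schema; consult the govinfo bulk-data documentation for the real markup:

```python
# Sketch: extract section headings from a hypothetical USLM-like document.
import xml.etree.ElementTree as ET

sample = """
<law>
  <num>Public Law XXX-X</num>
  <title>An example act</title>
  <section><num>1</num><heading>Short title</heading></section>
  <section><num>2</num><heading>Effective date</heading></section>
</law>
"""

root = ET.fromstring(sample)
headings = [s.findtext("heading") for s in root.iter("section")]
print(headings)  # ['Short title', 'Effective date']
```

The point of a uniform XML format is exactly this: the same few lines work across every bill and statute in the bulk download, instead of one scraper per document.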

I’m not convinced that the outcomes on legislation would be any different if you could trace specific legislation to a set of donations. It’s like tracing payments made to a sex worker. That’s their trade, why should they be ashamed of it?

The same holds true for most members of Congress, save that the latest election has swept non-sex worker types into office. It remains to be seen how many will resist the temptation to sell their offices and which will not.

In either case, kudos to the GPO and Lauren Wood, who I understand has been a major driver in this project!

December 3, 2018

Remotely Hijacking Zoom Clients

Filed under: Cybersecurity,Hacking — Patrick Durusau @ 8:45 pm

Remotely Hijacking Zoom Clients by David Wells.

From the post:

I would like to walkthrough a severe logic flaw vulnerability found in Zoom’s Desktop Conferencing Application. This logic flaw (CVE-2018–15715) affects Zoom clients for MacOS, Linux, and Windows and allows an attacker (doesn’t even have to be meeting attendee) to hijack various components of a live meeting such as forcefully enable desktop control permissions and send keystrokes to meeting attendees sharing their screen. Zoom has released an update for MacOS and Windows and users of Zoom should make sure they are running the most up-to-date version.

Great description of a vulnerability, even if Wells reports that Zoom servers now appear to be patched.

Telecommuting Trend Data from GlobalWorkplaceAnalytics.com leaves no doubt that remote work by employees is increasing, and with it, avenues into corporate computer infrastructures.

To say nothing of moves toward telecommuting by the United States government, led, of all agencies, by the IRS. Telecommuting Options in Government Jobs

Vulnerabilities in telecommuting and/or video conferencing software may result in a bountiful harvest of data. But you won’t know if you don’t look for them.

Distributed Denial of Secrets (#DDoSecrets) – There’s a New Censor in Town

Filed under: Censorship,CIA,Leaks,NSA — Patrick Durusau @ 6:59 pm

Distributed Denial of Secrets (#DDoSecrets) (ddosecretspzwfy7.onion/)

From a tweet by @NatSecGeek:

Distributed Denial of Secrets (#DDoSecrets), a collective/distribution system for leaked and hacked data, launches today with over 1 TB of data from our back catalogue (more TK).

Great right? Well, maybe not so great:

Our goal is to preserve info and ensure its available to those who need it. When possible, we will distribute complete datasets to everyone. In some instances, we will offer limited distribution due to PII or other sensitive info. #DDoSecrets currently has ~15 LIMDIS releases.

As we’re able, #DDoSecrets will produce sanitized versions of these datasets for public distribution. People who can demonstrate good cause for a copy of the complete dataset will be provided with it.

Raphael Satter, in Leak site’s launch shows dilemma of radical transparency, documents the sad act of self-mutilation (self-censorship) by #DDoSecrets.

Hosting the Ashley Madison hack drew criticism from Joseph Cox (think Motherboard) and Gabriella Coleman (McGill University anthropologist). The Ashley Madison data is available for searching (by email, for example: https://ashley.cynic.al/), so the harm of a bulk release isn’t clear.

What is clear is the reasoning of Coleman:


Best said the data would now be made available to researchers privately on a case-by-case basis, a decision that mollified some critics.

“Much better,” said Coleman after reviewing the newly pared-back site. “Exactly the model we might want.”

I am not surprised this is the model Coleman wants; academics are legendary for treating access as a privilege, thus empowering themselves to sit in judgment on others.

Let me explicitly say that I have no doubt that Emma Best will be as fair-handed with such judgments as anyone.

But once we concede any basis for censorship, the withholding of information of any type, then we are cast into a darkness from which there is no escape. A censor claims to have withheld only X, but how are we to judge? We have no access to the original data, only its mutilated, bastard child.

Emma Best is likely the least intrusive censor you can find but what is your response when the CIA or the NSA makes the same claim?

Censorship is a danger when practiced by anyone for any reason.

Support and leak to the project, but always condition deposits on raw leaking by #DDoSecrets.

December 2, 2018

Programming Language Foundations in Agda [Hackers Fear Not!]

Filed under: Agda,Computer Science,Cybersecurity,Hacking,Programming,Proof Theory — Patrick Durusau @ 11:47 am

Programming Language Foundations in Agda by Philip Wadler and Wen Kokke.

From the preface:

The most profound connection between logic and computation is a pun. The doctrine of Propositions as Types asserts that a certain kind of formal structure may be read in two ways: either as a proposition in logic or as a type in computing. Further, a related structure may be read as either the proof of the proposition or as a programme of the corresponding type. Further still, simplification of proofs corresponds to evaluation of programs.

Accordingly, the title of this book also has two readings. It may be parsed as “(Programming Language) Foundations in Agda” or “Programming (Language Foundations) in Agda” — the specifications we will write in the proof assistant Agda both describe programming languages and are themselves programmes.

The book is aimed at students in the last year of an undergraduate honours programme or the first year of a master or doctorate degree. It aims to teach the fundamentals of operational semantics of programming languages, with simply-typed lambda calculus as the central example. The textbook is written as a literate script in Agda. The hope is that using a proof assistant will make the development more concrete and accessible to students, and give them rapid feedback to find and correct misapprehensions.

The book is broken into two parts. The first part, Logical Foundations, develops the needed formalisms. The second part, Programming Language Foundations, introduces basic methods of operational semantics.
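The pun can be made concrete in a few lines. The book develops this in Agda; here is the same idea as an illustrative sketch in Lean 4, a close cousin:

```lean
-- Propositions as types: the term below can be read two ways --
-- as a proof that A ∧ B implies B ∧ A, or as a function that
-- swaps the components of a pair.
theorem and_swap (A B : Prop) : A ∧ B → B ∧ A :=
  fun ⟨a, b⟩ => ⟨b, a⟩
```

The “proof” is literally a program: pattern-match on the pair, rebuild it in the other order. Simplifying such proofs is running such programs, which is the correspondence the preface describes.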

Hackers should attend closely to Wadler and Kokke’s text to improve their own tools. The advantages of type-dependent programming are recited by Andrew Hynes in Why you should care about dependently typed programming and I won’t repeat them here.

Hynes also reassures hackers (perhaps not his intent) that a wave of type-dependent programming is not on the near horizon, saying:

So we’ve got these types that act as self-documenting proofs that functionality works, add clarity, add confidence our code works as well as runs. And, more than that, they make sense. Why didn’t we have these before? The short answer is, they’re a new concept, they’re not in every language, a large amount of people don’t know they exist or that this is even possible. Also, there are those I mentioned earlier, who hear about its use in research and dismiss it as purely for that purpose (let’s not forget that people write papers about languages like C and [Idealized] Algol, too). The fact I felt the need to write this article extolling their virtues should be proof enough of that.

Like object orientation and other ideas before it, it may take a while before this idea seeps down into being taught at universities and seen as standard. Functional programming has only just entered this space. The main stop-gap right now is this knowledge, and it’s the same reason you can’t snap your fingers together and have a bunch of Java devs who have never seen Haskell before writing perfect Haskell day one. Dependently typed programming is still a new concept, but that doesn’t mean you need to wait. Things we take for granted were new once, too.

I’m not arguing in favour of everybody in the world switching to a dependently typed language and doing everything possible dependently typed, that would be silly, and it encourages misuse. I am arguing in favour of, whenever possible (e.g. if you’re already using Haskell or similar) perhaps thinking whether dependent types suit what you’re writing. Chances are, there’s probably something they do suit very well indeed. They’re a truly fantastic tool and I’d argue that they will get better as time goes on due to way architecture will evolve. I think we’ll be seeing a lot more of them in the future. (emphasis in original)

Vulnerabilities have been, are and will continue to be etched into silicon. Vulnerabilities exist in decades of code and in the code written to secure it. Silicon and code that will still be running as type-dependent programming slowly seeps into the mainstream.

Hackers should benefit from and not fear type-dependent programming!

Powered by WordPress