Another Word For It Patrick Durusau on Topic Maps and Semantic Diversity

October 30, 2017

Bottery

Filed under: Bots,Social Media — Patrick Durusau @ 7:58 pm

Bottery – A conversational agent prototyping platform by katecompton@

From the webpage:

Bottery is a syntax, editor, and simulator for prototyping generative contextual conversations modeled as finite state machines.

Bottery takes inspiration from the Tracery opensource project for generative text (also by katecompton@ in a non-google capacity) and the CheapBotsDoneQuick bot-hosting platform, as well as open FSM-based storytelling tools like Twine.

Like Tracery, Bottery is a syntax that specifies the script of a conversation (a map) with JSON. Like CheapBotsDoneQuick, the BotteryStudio can take that JSON and run a simulation of that conversation in a nice Javascript front-end, with helpful visualizations and editing ability.

The goal of Bottery is to help everyone, from designers to writers to coders, be able to write simple and engaging contextual conversational agents, and to test them out in a realistic interactive simulation, mimicking how they’d work on a “real” platform like API.AI.
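Setting Bottery's actual JSON schema aside (the state names, prompts, and transition keywords below are invented for illustration, not Bottery's syntax), a conversation modeled as a finite state machine takes very little code:

```python
# A minimal finite-state conversation sketch in plain Python.
# Keys and keywords are hypothetical, not Bottery's real schema.
CONVERSATION = {
    "start":    {"say": "Hi! Ask me about the weather or say bye.",
                 "next": {"weather": "forecast", "bye": "end"}},
    "forecast": {"say": "Sunny with a chance of bots.",
                 "next": {"bye": "end"}},
    "end":      {"say": "Goodbye!", "next": {}},
}

def step(state, user_input):
    """Return the next state matching a keyword in user_input,
    or stay in the current state if nothing matches."""
    for keyword, target in CONVERSATION[state]["next"].items():
        if keyword in user_input.lower():
            return target
    return state

state = "start"
state = step(state, "What's the weather like?")  # -> "forecast"
state = step(state, "ok bye")                    # -> "end"
print(CONVERSATION[state]["say"])
```

The JSON-friendly dict of states and transitions is the whole "map"; a simulator like BotteryStudio adds the editor and visualization on top.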

Not a bot to take your place on social media but it does illustrate the potential of such a bot.

Drive your social “engagement” score with a bot!

Hmmm: gather up comments and your responses on, say, Facebook, compare a new comment against them for similarity, then select the closest stored response, with or without an opportunity to override the automatic reply.
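The matching step sketched above can be prototyped with simple token overlap (Jaccard similarity); a real bot would likely use TF-IDF or embeddings, but the selection logic is the same:

```python
def tokens(text):
    return set(text.lower().split())

def jaccard(a, b):
    """Token-overlap similarity between two strings, in [0, 1]."""
    ta, tb = tokens(a), tokens(b)
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def closest_response(new_comment, history):
    """history: list of (past_comment, your_response) pairs.
    Return the response paired with the most similar past comment."""
    best = max(history, key=lambda pair: jaccard(new_comment, pair[0]))
    return best[1]

# Example history; entries are invented for illustration.
history = [
    ("great article, thanks for sharing", "Glad you liked it!"),
    ("this is completely wrong", "Happy to discuss -- which part?"),
]
print(closest_response("this is wrong", history))
```

Adding the "override" option is just showing the chosen response for approval before posting.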

Enjoy!

Smart HTML Form Trick

Filed under: HTML,Search Interface,Searching — Patrick Durusau @ 7:37 pm

An HTML form trick to add some convenience to life by Bob DuCharme.

From the post:

On the computers that I use the most, the browser home page is an HTML file with links to my favorite pages and a “single” form that lets me search the sites that I search the most. I can enter a search term in the field for any of the sites, press Enter, and then that site gets searched. The two tricks that I use to create these fields have been handy enough that I thought I’d share them in case they’re useful to others.

I quote the word “single” above because it appears to be a single form but is actually multiple little forms in the HTML. Here is an example with four of my entries; enter something into any of the fields and press Enter to see what I mean:
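The trick generalizes: one tiny GET form per site, each with that site's search URL as its action. Here is a sketch that generates such a page; the site URLs and parameter names are examples, not DuCharme's actual list:

```python
# Each entry: (label, search URL, query-parameter name).
# These entries are illustrative; check each site's real search URL
# by running a search there and inspecting the address bar.
SITES = [
    ("Wikipedia", "https://en.wikipedia.org/w/index.php", "search"),
    ("DuckDuckGo", "https://duckduckgo.com/", "q"),
]

def search_form(label, action, param):
    """One small GET form; pressing Enter submits only this form."""
    return (f'<form action="{action}" method="get">'
            f'{label}: <input type="text" name="{param}"></form>')

page = "\n".join(search_form(*site) for site in SITES)
print(page)
```

Because each form is separate, Enter in any field submits only that field's form to its own site — the "single form" illusion DuCharme describes.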

As always, an immediately useful tip from DuCharme!

The multiple search boxes reminded me of the early metasearch engines that combined results from multiple search engines.

The resources will vary by topic, but which would you search across day to day?

Russians Influence 2017 World Series #Upsidasium (Fake News)

Filed under: Fake News,Humor,Journalism — Patrick Durusau @ 8:36 am

Unnamed sources close to moose and squirrel, who are familiar with the evidence, say Russians are likely responsible for contamination of 2017 World Series baseballs with Upsidaisium. The existence and properties of Upsidaisium were documented in the early 1960s. This is the first known use of Upsidaisium to interfere with the World Series.

Sports Illustrated has photographic evidence that World Series baseballs are “slicker” than a “normal” baseball, one sign of the use of Upsidaisium.

Unfortunately, Upsidaisium decays completely after the impact of being hit, into a substance indistinguishable from cowhide.

Should you obtain more unattributed statements from sources close to moose and squirrel, please add them in the comments below.

Thanks!

Journalists/Fake News hunters: Part truth, part fiction, just like reports of Russian “influence” (whatever the hell that means) in the 2016 presidential election and fears of Kaspersky Lab software.

Yes, Russia exists; yes, there was a 2016 presidential election; yes, Clinton is likely disliked by Putin, as she is by millions of others; yes, Wikileaks conducted a clever ad campaign with leaked emails, bolstered by major news outlets; but like Upsidaisium, there is no evidence tying Russians, much less Putin, to anything to do with the 2016 election.

A lot of supposes, maybes and could-have-beens are reported, but no evidence. Yet US media outlets have kept repeating “Russia influenced the 2016 election” until even reasonable people assume it is true.

Don’t be complicit in that lie. Make #Upsidasium the marker for such fake news.

October 29, 2017

The Infectiousness of Pompous Prose – #GettysburgAbstract

Filed under: Publishing — Patrick Durusau @ 8:34 am

See the full size version.

Editors could improve the readability of authors’ prose, but then peer reviewers would actually have to read the papers they are assigned to review.

Don’t wait on miracles to improve readability of scientific articles.

Create your Gettysburg Abstract of the most important point of the paper.

Lincoln’s Gettysburg address was 272 words long. A #GettysburgAbstract is 272 words or less.

If you can’t capture an article in 272 words or less, read the article again.
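Holding yourself to the budget is easy to automate; a three-line check:

```python
def gettysburg_ok(abstract):
    """True if the abstract fits the 272-word Gettysburg budget."""
    return len(abstract.split()) <= 272

draft = "Four score and seven years ago our fathers brought forth..."
words = len(draft.split())
print(f"{words} words -- {'OK' if gettysburg_ok(draft) else 'too long'}")
```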

Post your #GettysburgAbstract as a comment and/or review of the article. Spread the word.

I first saw this image in a tweet by Mara Averick.

October 28, 2017

YAGO: A High-Quality Knowledge Base (Open Source)

Filed under: Knowledge,Ontology,Wikipedia — Patrick Durusau @ 8:56 pm

YAGO: A High-Quality Knowledge Base

Overview:

YAGO is a huge semantic knowledge base, derived from Wikipedia, WordNet and GeoNames. Currently, YAGO has knowledge of more than 10 million entities (like persons, organizations, cities, etc.) and contains more than 120 million facts about these entities.

YAGO is special in several ways:

  1. The accuracy of YAGO has been manually evaluated, proving a confirmed accuracy of 95%. Every relation is annotated with its confidence value.
  2. YAGO combines the clean taxonomy of WordNet with the richness of the Wikipedia category system, assigning the entities to more than 350,000 classes.
  3. YAGO is an ontology that is anchored in time and space. YAGO attaches a temporal dimension and a spacial dimension to many of its facts and entities.
  4. In addition to a taxonomy, YAGO has thematic domains such as "music" or "science" from WordNet Domains.
  5. YAGO extracts and combines entities and facts from 10 Wikipedias in different languages.

YAGO is developed jointly with the DBWeb group at Télécom ParisTech University.

Before you are too impressed by the numbers, which are impressive, realize that 10 million entities is about 3% of the current US population, to say nothing of any other entities we might want to include along with them. It’s a good start and very useful, but realize it is a limited set of entities.
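The back-of-envelope comparison is easy to verify (the 2017 US population figure below is approximate):

```python
entities = 10_000_000        # YAGO's claimed entity count
us_population = 325_000_000  # rough 2017 US population
share = entities / us_population
print(f"{share:.1%}")        # roughly 3%
```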

All the source data is available, along with the source code.

Would be interesting to see how useful the entity set is when used with US campaign contribution data.

Thoughts?

CMU Neural Networks for NLP 2017 (16 Lectures)

Filed under: Natural Language Processing,Neural Networks — Patrick Durusau @ 8:31 pm

Course Description:

Neural networks provide powerful new tools for modeling language, and have been used both to improve the state-of-the-art in a number of tasks and to tackle new problems that were not easy in the past. This class will start with a brief overview of neural networks, then spend the majority of the class demonstrating how to apply neural networks to natural language problems. Each section will introduce a particular problem or phenomenon in natural language, describe why it is difficult to model, and demonstrate several models that were designed to tackle this problem. In the process of doing so, the class will cover different techniques that are useful in creating neural network models, including handling variably sized and structured sentences, efficient handling of large data, semi-supervised and unsupervised learning, structured prediction, and multilingual modeling.

Suggested pre-requisite: 11-711 “Algorithms for NLP”.

I wasn’t able to find videos for the algorithms for NLP course but you can explore the following as supplemental materials:


Each of these courses can be found in two places: YouTube and Academic Torrents. The advantage of Academic Torrents is that you can also download the supplementary course materials, like transcripts, PDFs, or PPTs.

  1. Natural Language Processing: Dan Jurafsky and Christopher Manning, Stanford University. YouTube | Academic Torrents
  2. Natural Language Processing: Michael Collins, Columbia University. YouTube | Academic Torrents
  3. Introduction to Natural Language Processing: Dragomir Radev, University of Michigan. YouTube | Academic Torrents

… (From 9 popular online courses that are gone forever… and how you can still find them)

Enjoyable but not as suited to binge watching as Stranger Things. 😉

Enjoy!

The Little Black Box That Took Over Piracy (tl;dr – Read or Watch GoT?)

Filed under: Cybersecurity,Entertainment,Security — Patrick Durusau @ 7:56 pm

The Little Black Box That Took Over Piracy by Brian Barrett.

At > 2400 words, Barrett’s report on Kodi is a real time sink.

Three links instead:

  1. TV Addons
  2. AliExpress.com
  3. HOW-TO:Install Kodi for Linux

Enjoy!


At “Enjoy” 33 words versus > 2400. Comments?

Useless List of Dark Web Bargains – NRA Math/Social Science Problems

Filed under: Cybersecurity,Dark Web,Malware,Security — Patrick Durusau @ 3:00 pm

A hacker’s toolkit, shocking what you can buy on Dark Web for a few bucks by Mark Jones.

From the post:

Ransomware

  • Sophisticated license for widespread attacks $200
  • Unsophisticated license for targeted attacks $50

Spam

  • 500 SMS (Flooding) $20
  • 500 malicious email spam $400
  • 500 phone calls (Flooding) $20
  • 1 million email spam (legal) $200

What makes this listing useless? Hmmm, did you notice the lack of URLs?

With URLs, a teacher could create realistic math problems like:

How much money would Las Vegas shooting survivors and families of the deceased victims have to raise to “flood” known NRA phone numbers during normal business hours (US Eastern time zone) for thirty consecutive days? (give the total number of phone lines and their numbers as part of your answer)

or research problems (social science/technology),

Using the current NRA 501(c)(4) report, choose a minimum of three (3) directors of the NRA and specify what tools, Internet or Dark Web, you would use to find additional information about each director, along with the information you discovered with each tool for each director.

or advanced research problems (social science/technology),

Using any tool or method, identify a minimum of five (5) contributors to the NRA that are not identified on the NRA website or in any NRA publication. The purpose of this exercise is to discover NRA members who have not been publicly listed by the NRA itself. For each contributor, describe your process, including links and results.

Including links in posts, even lists, helps readers reuse and even re-purpose content.

It’s called the World Wide Web for a reason: hyperlinks.

October 27, 2017

Introduction to ClojureScript

Filed under: ClojureScript,Functional Programming,React — Patrick Durusau @ 9:22 pm

Introduction to ClojureScript by Roman Liutikov. (ClojureScript Workshop @ ReactiveConf ’17)

From the webpage:

Requirements

It’s nice to know the following:

  • React.js
  • Basics of functional programming

Help during the workshop

Here’s a couple of useful resources that will help you during the workshop:

Four hour workshop with a full set of topics and useful links.

Ah, one missing: React.js. 😉

Enjoy!

Success in Astronomy? Some Surprising Strategies

Filed under: Astroinformatics,Publishing — Patrick Durusau @ 8:48 pm

Success in Astronomy? Some Surprising Strategies by Stacy Kim.

Kim reviews How long should an astronomical paper be to increase its impact? by K. Z. Stanek, saying:

What do you think it takes to succeed in astronomy? Some innate brilliance? Hard work? Creativity? Great communication skills?

What about writing lots of short papers? For better or for worse, one’s success as an astronomer is frequently measured in the number of papers one’s written and how well cited they are. Papers are a crucial method of communicating results to the rest of the astronomy community, and the way they’re written and how they’re published can have a significant impact on the number of citations that you receive.

There are a number of simple ways to increase the citation counts on your papers. There are things you might expect: if you’re famous within the community (e.g. a Nobel Prize winner), or are in a very hot topic like exoplanets or cosmology, you’ll tend to get cited more often. There are those that make sense: papers that are useful, such as dust maps, measurements of cosmological parameters, and large sky surveys often rank among the most-cited papers in astronomy. And then there’s the arXiv, a preprint service that is highly popular in astronomy. It’s been shown that papers that appear on the arXiv are cited twice as much as those that aren’t, and furthermore—those at the top of the astro-ph list are twice as likely to be cited than those that appear further down.

If you need a quick lesson from the article, Kim suggests posting to arXiv at 4pm, so your paper appears higher on the list.

For more publishing advice, see Kim’s review or the paper in full.

Enjoy!

New York Times Goes Dark (As in Dark Web)

Filed under: Journalism,News,Tor — Patrick Durusau @ 1:06 pm

The New York Times is Now Available as a Tor Onion Service by Runa Sandvik.

From the post:

Today we are announcing an experiment in secure communication, and launching an alternative way for people to access our site: we are making the nytimes.com website available as a Tor Onion Service.

The New York Times reports on stories all over the world, and our reporting is read by people around the world. Some readers choose to use Tor to access our journalism because they’re technically blocked from accessing our website; or because they worry about local network monitoring; or because they care about online privacy; or simply because that is the method that they prefer.

The Times is dedicated to delivering quality, independent journalism, and our engineering team is committed to making sure that readers can access our journalism securely. This is why we are exploring ways to improve the experience of readers who use Tor to access our website.

One way we can help is to set up nytimes.com as an Onion Service — making our website accessible via a special, secure and hard-to-block VPN-like “tunnel” through the Tor network. The address for our Onion Service is:

https://www.nytimes3xbfgragh.onion/

This onion address is accessible only through the Tor network, using special software such as the Tor Browser. Such tools assure our readers that our website can be reached without monitors or blocks, and they provide additional guarantees that readers are connected securely to our website.

The New York Times (NYT) “going dark,” benefits the Tor project several ways:

  • Increases the legitimacy of Tor
  • Increases the visibility of Tor
  • Leads to more robust Tor relays
  • Builds more support for Tor development
  • Spreads usage of Tor browsers

Time to press other publishers, Wall Street Journal, the Washington Post, ABC, NBC, CBS, the Daily Beast, The Hill, NPR, the LA Times, USA Today, Newsweek, Reuters, the Guardian, to name only a few, for Tor onion services.

Be forewarned, a login to the NYT destroys whatever anonymity you sought by accessing https://www.nytimes3xbfgragh.onion/.

You may be anonymous to your local government, but the NYT is subject to the whims and caprices of the US government. A login to the NYT site, even using Tor, puts your identity and reading habits at risk.

0-Days vs. Human Stupidity

Filed under: Cybersecurity,Security — Patrick Durusau @ 10:15 am

Kaspersky Lab released The Human Factor in IT Security last July (2017), which was summarized by Nikolay Pankov in The human factor: Can employees learn to not make mistakes?, saying in part:

  • 46% of incidents in the past year involved employees who compromised their company’s cybersecurity unintentionally or unwittingly;
  • Of the companies affected by malicious software, 53% said that infection could not have happened without the help of inattentive employees, and 36% blame social engineering, which means that someone intentionally tricked the employees;
  • Targeted attacks involving phishing and social engineering were successful in 28% of cases;
  • In 40% of cases, employees tried to conceal the incident after it happened, amplifying the damage and further compromising the security of the affected company;
  • Almost half of the respondents worry that their employees inadvertently disclose corporate information through the mobile devices they bring to the workplace.

If anything, human stupidity is a constant with little hope of improvement.

For example, the “Big Three” automobile manufacturers were founded in the 1920s, and now, almost a century later, the National Highway Traffic Safety Administration reports that in 2015 there were 6.3 million police-reported automobile accidents (an increase of 3.8% over the previous year).

Or, for another type of “accident,” see the figures the Guttmacher Institute reports for 2011:

Not to rag on users exclusively: mis-configuration, failure to patch, and vulnerabilities in security programs (and programs more generally) are due to human stupidity as well.

0-Days will always capture the headlines and are a necessity against some opponents. At the same time, testing for human stupidity is certainly cheaper and often just as effective as advanced techniques.

Transparency is coming … to the USA! (Apologies to Leonard Cohen)

October 26, 2017

What’s New in the JFK Files? [A Topic Map Could Help Answer That Question]

Filed under: Government,Government Data,History,Topic Maps — Patrick Durusau @ 9:07 pm

The JFK Files: Calling On Citizen Reporters

From the webpage:

The government has released long-secret files on John F. Kennedy’s assassination, and we want your help.

The files are among the last to be released by the National Archives under a 1992 law that ordered the government to make public all remaining documents pertaining to the assassination. Other files are being withheld because of what the White House says are national security, law enforcement and foreign policy concerns.

There has long been a trove of conspiracy theories surrounding Kennedy’s murder in Dallas on Nov. 22, 1963, including doubts about whether Lee Harvey Oswald acted alone, as the Warren Commission determined in its report the following year.

Here’s where you come in. Read the documents linked here. If you find news or noteworthy nuggets among the pages, share them with us on the document below. If we use what you find, we’ll be sure to give you a shoutout!

Given the linear feet of existing files, finding new nuggets or aligning them with old nuggets in the original files is going to be a slow process.

What’s more, you or I may find the exact nugget needed to connect dots for someone else, but since we all read, search, and maintain our searches separately, effective sharing of those nuggets won’t happen.

Depending on the granularity of a topic map over those same materials, confirmation of Oswald’s known whereabouts and who reported those could be easily examined and compared to new (if any) whereabouts information in these files. If new files confirm what is known, researchers could skip that material and move to subjects unknown in the original files.

A non-trivial encoding task, but full details have been delayed pending another round of hiding professional incompetence. A topic map will help you ferret out the incompetents seeking to hide in the last releases of documents. Interested?

Democratizing CyberCrime – Messaging Apps As New Dark Web

Filed under: Cybersecurity,Security — Patrick Durusau @ 8:36 pm

Cyber criminals use messaging apps to locate new hideouts after dark web market crackdown

Mobile messaging apps said to be the “in” place for cyber criminals, leading to these observations:


“Today’s black market is accessible more than ever, with the tap of a finger over a portable pocket-held device,” the study said. “This could prove to cause a proliferation of low-level cybercrime, that is conducted by less qualified perpetrators”.

Traditional dark web markets required would-be users to know which sites to visit and how, using a special browser, all of which required no small amount of technical sophistication.

IntSights said hackers are turning to smaller, closed networks on social media and mobile messaging apps instead of traditionally open, moderated dark web forums because such groups can be easily set up, shut down and relocated via apps.

I’m all in favor of democratization of technology but like you, I nearly choked on:

…Traditional dark web markets required would-be users to know which sites to visit and how, using a special browser, all of which required no small amount of technical sophistication….

Wow, just wow! Being able to download/install Tor and finding .onion sites is “technical sophistication?”

Messaging apps mentioned:

Discord – #1 with a bullet.

Skype – Microsoft.

Telegram

WhatsApp – Facebook.

By sacrificing an email address, you can get a copy of the dark web/mobile app report.

SciPy 1.0.0! [Awaiting Your Commands]

Filed under: Programming,Python — Patrick Durusau @ 10:50 am

SciPy 1.0.0

From the webpage:

We are extremely pleased to announce the release of SciPy 1.0, 16 years after version 0.1 saw the light of day. It has been a long, productive journey to get here, and we anticipate many more exciting new features and releases in the future.

Why 1.0 now?

A version number should reflect the maturity of a project – and SciPy was a mature and stable library that is heavily used in production settings for a long time already. From that perspective, the 1.0 version number is long overdue.

Some key project goals, both technical (e.g. Windows wheels and continuous integration) and organisational (a governance structure, code of conduct and a roadmap), have been achieved recently.

Many of us are a bit perfectionist, and therefore are reluctant to call something “1.0” because it may imply that it’s “finished” or “we are 100% happy with it”. This is normal for many open source projects, however that doesn’t make it right. We acknowledge to ourselves that it’s not perfect, and there are some dusty corners left (that will probably always be the case). Despite that, SciPy is extremely useful to its users, on average has high quality code and documentation, and gives the stability and backwards compatibility guarantees that a 1.0 label imply.

In case your hands are trembling too much to type in the URLs:

SciPy.org

SciPy Cookbook

Scipy 1.0.0 Reference Guide, [HTML+zip], [PDF]
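Once your hands steady, a two-call sanity check of a fresh install (both are standard SciPy APIs):

```python
from scipy import integrate, optimize

# Integrate x^2 from 0 to 1; the exact answer is 1/3.
area, abserr = integrate.quad(lambda x: x**2, 0.0, 1.0)

# Minimize (x - 2)^2; the minimum is at x = 2.
result = optimize.minimize_scalar(lambda x: (x - 2.0) ** 2)

print(round(area, 6), round(result.x, 3))
```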

Like most tools, it isn’t weaponized until you apply it to data.

Enjoy!

PS: If you want to get ahead of a co-worker, give them this URL: http://planet.scipy.org/. Don’t look, it’s a blog feed for SciPy. Sorry, you looked didn’t you?

Test Your Qualifications To Run A Web Hidden Service

Filed under: Cybersecurity,Security,Tor — Patrick Durusau @ 10:30 am

Securing a Web Hidden Service

From the post:

While browsing the darknet (Onion websites), it’s quite stunning to see the number of badly configured Hidden Services that will leak directly or indirectly the underlying clearnet IP address. Thus canceling the server anonymity protection that can offer Tor Hidden Services.

Here are a few rules you should consider following before setting up a Onion-only website. This guide covers both Apache and Nginx.
… (emphasis in original)

Presented as rules to preserve .onion anonymity, these five rules also test your qualifications to run a web hidden service.

If you don’t understand or won’t follow any of these five rules, don’t run a web hidden service.

You are likely to expose yourself and others.

Just don’t.
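One of the classic leaks such rules guard against is a hidden service's web server listening on all interfaces instead of loopback only. A minimal sketch of a bind-address check (the address list is illustrative, and real configs also need IPv6 bracket handling):

```python
# Flag 'listen' addresses that expose a service beyond loopback.
# Addresses below are examples; in practice you would parse them
# out of your nginx or Apache configuration.
LOOPBACK_OK = {"127.0.0.1", "localhost"}

def risky_binds(listen_addrs):
    """Return the bind addresses reachable from outside the host."""
    return [a for a in listen_addrs
            if a.split(":")[0] not in LOOPBACK_OK]

binds = ["127.0.0.1:8080", "0.0.0.0:80"]
print(risky_binds(binds))  # ['0.0.0.0:80']
```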

2nd International Electronic Conference on Remote Sensing – March 22 – April 5, 2018

2nd International Electronic Conference on Remote Sensing

From the webpage:

We are very pleased to announce that the 2nd International Electronic Conference on Remote Sensing (ECRS-2) will be held online, between 22 March and 5 April 2018.

Today, remote sensing is already recognised as an important tool for monitoring our planet and assessing the state of our environment. By providing a wealth of information that is used to make sound decisions on key issues for humanity such as climate change, natural resource monitoring and disaster management, it changes our world and affects the way we think.

Nevertheless, it is very inspirational that we continue to witness a constant growth of amazing new applications, products and services in different fields (e.g. archaeology, agriculture, forestry, environment, climate change, natural and anthropogenic hazards, weather, geology, biodiversity, coasts and oceans, topographic mapping, national security, humanitarian aid) which are based on the use of satellite and other remote sensing data. This growth can be attributed to the following: large number (larger than ever before) of available platforms for data acquisition, new sensors with improved characteristics, progress in computer technology (hardware, software), advanced data analysis techniques, and access to huge volumes of free and commercial remote sensing data and related products.

Following the success of the 1st International Electronic Conference on Remote Sensing (http://sciforum.net/conference/ecrs-1), ECRS-2 aims to cover all recent advances and developments related to this exciting and rapidly changing field, including innovative applications and uses.

We are confident that participants of this unique multidisciplinary event will have the opportunity to get involved in discussions on theoretical and applied aspects of remote sensing that will contribute to shaping the future of this discipline.

ECRS-2 (http://sciforum.net/conference/ecrs-2) is hosted on sciforum, the platform developed by MDPI for organising electronic conferences and discussion groups, and is supported by Section Chairs and a Scientific Committee comprised of highly reputable experts from academia.

It should be noted that there is no cost for active participation and attendance of this virtual conference. Experts from different parts of the world are encouraged to submit their work and take the exceptional opportunity to present it to the remote sensing community.

I have a less generous view of remote sensing, seeing it used to further exploit/degrade the environment, manipulate regulatory processes, and to generally disadvantage those not skilled in its use.

Being aware of the latest developments in remote sensing is a first step towards developing your ability to question, defend and even use remote sensing data for your own ends.

ECRS-2 (http://sciforum.net/conference/ecrs-2) is a great opportunity to educate yourself about remote sensing. Enjoy!

While electronic conferences lack the social immediacy of physical gatherings, one wonders why more data technologies aren’t holding electronic conferences? Thoughts?

October 25, 2017

Proton Sets A High Bar For Malware

Filed under: Cybersecurity,Malware,Security — Patrick Durusau @ 9:14 pm

Malware hidden in vid app is so nasty, victims should wipe their Macs by Iain Thomson

Proton was distributed by legitimate servers and is so severe that only a clean install will rid your system of the malware.

From the post:


Proton is a remote-control trojan designed specifically for Mac systems. It opens a backdoor granting root-level command-line access to commandeer the computer, and can steal passwords, encryption and VPN keys, and crypto-currencies from infected systems. It can gain access to a victim’s iCloud account, even if two-factor authentication is used, and went on sale in March with a $50,000 price tag.

Impressive!

Imagine a Windows trojan that requires a clean system install to disinfect your system.

Well, “disinfecting” a Windows system is a relative term.

If you are running Windows 10, you have already granted root access to Microsoft plus whoever they trust to your system.

Perhaps “disinfect within the terms and limitations of your EULA with Microsoft” is the better way to put it.

A bit verbose don’t you think?

October 24, 2017

Targeting Government Websites

Filed under: Cybersecurity,Government,Security — Patrick Durusau @ 8:05 pm

With only 379 days until congressional mid-terms, you should not waste time hardening or attacking seldom-used or obscure government webpages. But how do you know which pages get the traffic?

If that sounds like a difficult question, then you don’t know about analytics.usa.gov!

This data provides a window into how people are interacting with the government online. The data comes from a unified Google Analytics account for U.S. federal government agencies known as the Digital Analytics Program. This program helps government agencies understand how people find, access, and use government services online. The program does not track individuals, and anonymizes the IP addresses of visitors.

Not every government website is represented in this data. Currently, the Digital Analytics Program collects web traffic from around 400 executive branch government domains, across about 4500 total websites, including every cabinet department. We continue to pursue and add more sites frequently; to add your site, email the Digital Analytics Program.

This open source project is in the public domain, which means that this website and its data are free for you to use without restriction. You can find the code for this website and the code behind the data collection on GitHub.

We plan to expand the data made available here. If you have any suggestions, or spot any issues or bugs, please open an issue on GitHub or contact the Digital Analytics Program.

Download the data

You can download the data here. Available in JSON and CSV format.
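Once downloaded, ranking the reported domains by traffic is a few lines of Python. The field names below are assumptions about the report shape, not documented guarantees; check the actual download links on analytics.usa.gov:

```python
# Rank entries from an analytics.usa.gov-style JSON report.
# The "data"/"domain"/"visits" keys are assumed, not guaranteed;
# inspect a real downloaded report to confirm its shape.
def top_domains(report, n=5):
    """Return the n (domain, visits) pairs with the most visits."""
    entries = report.get("data", [])
    ranked = sorted(entries, key=lambda e: int(e.get("visits", 0)),
                    reverse=True)
    return [(e.get("domain"), int(e.get("visits", 0)))
            for e in ranked[:n]]

# Offline demo with the assumed shape; for live data, fetch the
# JSON download from analytics.usa.gov and json.load() it instead.
sample = {"data": [{"domain": "irs.gov", "visits": "900"},
                   {"domain": "nasa.gov", "visits": "1500"}]}
print(top_domains(sample, n=1))  # [('nasa.gov', 1500)]
```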

Whether you imagine yourself carrying out or defending against a Putin/FSB/KGB five-year cyberattack plan, analytics.usa.gov can bring some grounding to your defense/attack plans.

Sorry, but government web data won’t help with your delusions about Putin. For assistance in maintaining those, check with the Democratic National Committee and/or the New York Times.

October 23, 2017

US Senate Vermin List

Filed under: Government,Politics — Patrick Durusau @ 8:18 pm

The US Senate recently voted to approve a budget granting large tax cuts, paid for by cuts to Medicaid and Medicare.

On the Concurrent Resolution: H. Con. Res. 71 As Amended; A concurrent resolution establishing the congressional budget for the United States Government for fiscal year 2018 and setting forth the appropriate budgetary levels for fiscal years 2019 through 2027.

The “US Senate” is an identity concealing and accountability avoiding fiction.

H. Con. Res. 71 As Amended was approved by fifty-one (51) members of the Senate, all of whom have names and websites.

You may find the following list helpful:

  1. Alexander (R-TN)
  2. Barrasso (R-WY)
  3. Blunt (R-MO)
  4. Boozman (R-AR)
  5. Burr (R-NC)
  6. Capito (R-WV)
  7. Cassidy (R-LA)
  8. Cochran (R-MS)
  9. Collins (R-ME)
  10. Corker (R-TN)
  11. Cornyn (R-TX)
  12. Cotton (R-AR)
  13. Crapo (R-ID)
  14. Cruz (R-TX)
  15. Daines (R-MT)
  16. Enzi (R-WY)
  17. Ernst (R-IA)
  18. Fischer (R-NE)
  19. Flake (R-AZ)
  20. Gardner (R-CO)
  21. Graham (R-SC)
  22. Grassley (R-IA)
  23. Hatch (R-UT)
  24. Heller (R-NV)
  25. Hoeven (R-ND)
  26. Inhofe (R-OK)
  27. Isakson (R-GA)
  28. Johnson (R-WI)
  29. Kennedy (R-LA)
  30. Lankford (R-OK)
  31. Lee (R-UT)
  32. McCain (R-AZ)
  33. McConnell (R-KY)
  34. Moran (R-KS)
  35. Murkowski (R-AK)
  36. Perdue (R-GA)
  37. Portman (R-OH)
  38. Risch (R-ID)
  39. Roberts (R-KS)
  40. Rounds (R-SD)
  41. Rubio (R-FL)
  42. Sasse (R-NE)
  43. Scott (R-SC)
  44. Shelby (R-AL)
  45. Strange (R-AL)
  46. Sullivan (R-AK)
  47. Thune (R-SD)
  48. Tillis (R-NC)
  49. Toomey (R-PA)
  50. Wicker (R-MS)
  51. Young (R-IN)

Where would you take this list from here?
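One answer: turn the roll call into structured data you can sort, count, or join against other datasets. A minimal Python sketch (only a few of the 51 entries are repeated here; the same pattern parses them all):

```python
import re

# A few entries from the roll call above; the full list of 51
# follows the same "Name (P-ST)" pattern.
roll = [
    "Alexander (R-TN)",
    "Barrasso (R-WY)",
    "Collins (R-ME)",
    "Young (R-IN)",
]

# Name, single-letter party, two-letter state.
pattern = re.compile(r"^(?P<name>[\w'. ]+) \((?P<party>[A-Z])-(?P<state>[A-Z]{2})\)$")

senators = []
for line in roll:
    m = pattern.match(line)
    if m:
        senators.append((m.group("name"), m.group("party"), m.group("state")))

# Count votes by state, e.g. for targeting the list geographically.
by_state = {}
for _, _, state in senators:
    by_state[state] = by_state.get(state, 0) + 1
```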

October 22, 2017

Router Games While Waiting in Congressional Rep’s Parking Lot

Filed under: Cybersecurity,Government,Security — Patrick Durusau @ 8:00 pm

With the US congressional mid-term election only 381 days away (2018-11-06), I can only imagine the boredom from sitting in your representative’s branch office parking lot.

Watching for your representative and his/her visitors is a thankless task, the public always being interested in such details.

One amusing and potentially skill building exercise is described in Man-in-the-middle Router.

From the post:

Turn any linux computer into a public Wi-Fi network that silently mitms all http traffic. Runs inside a Docker container using hostapd, dnsmasq, and mitmproxy to create an open honeypot wireless network named “Public”. For added fun, change the network name to “xfinitywifi” to autoconnect anyone who has ever connected to those networks… they are everywhere.

The suggestion of using popular network names, which you can discover by cruising about with your Linux laptop, seems especially interesting.
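That discovery step amounts to a frequency count over observed network names. A hedged sketch, with invented SSIDs standing in for a real site survey:

```python
from collections import Counter

# Hypothetical SSIDs collected from a site survey (e.g. the output
# of a wireless scan); the names here are invented for illustration.
observed = [
    "xfinitywifi", "HOME-1234", "xfinitywifi", "attwifi",
    "CoffeeShopGuest", "xfinitywifi", "attwifi", "HOME-5678",
]

# The most common names are the best candidates for an autoconnect
# honeypot network, per the post's suggestion.
popular = Counter(observed).most_common(3)
for ssid, count in popular:
    print(f"{ssid}: seen {count} times")
```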

Brush up on your cyberskills!

2018 is brimming with promise!

Comparative Presidential Corruption

Filed under: Government,History,Politics — Patrick Durusau @ 7:48 pm

Reporters wanting to add a historical flavor to their accounts of corruption and investigations of corruption in the Trump regime, will be glad to see: Papers of Ulysses S. Grant Now Online.

From the post:

The Library of Congress has put the papers of Ulysses S. Grant online for the first time in their original format at https://www.loc.gov/collections/ulysses-s-grant-papers/about-this-collection/.

The Library holds a treasure trove of documents from the Civil War commander and 18th president of the United States, including personal correspondence, “headquarters records” created during the Civil War and the original handwritten manuscript of Grant’s memoir— regarded as one of the best in history—among other items. The collection totals approximately 50,000 items dating from 1819-1974, with the bulk falling in the period 1843-1885.

The collection includes general and family correspondence, speeches, reports, messages, military records, financial and legal records, newspaper clippings, scrapbooks, memorabilia and other papers. The collection relates to Grant’s service in the Mexican War and Civil War, his pre-Civil War career, and his postwar service as U.S. secretary of war ad interim under President Andrew Johnson, his 1868 presidential campaign and two-term presidency, his unsuccessful 1880 presidential bid, his extensive international travels and the financial difficulties late in life that spurred the writing of his memoir, which he completed just days before his death from tongue cancer in July 1885.

If you think the IRS has an unsavory reputation now, consider that one tax collector (liquor taxes) was hired with a 50% commission on his collections: the Sanborn incident.

There have been a number of deeply corrupt American presidencies but this collection crossed my desk recently.

Enjoy!

October 20, 2017

Not Zero-Day But Effective Hacking

Filed under: Cybersecurity,Security — Patrick Durusau @ 12:43 pm

Catalin Cimpanu reminds us in Student Expelled for Using Hardware Keylogger to Hack School, Change Grades not every effective hacking attack uses a zero-day vulnerability.

Zero-days get most of the press, ‘Zero Days’ Documentary Exposes A Looming Threat Of The Digital Age, but capturing the keystrokes on a computer keyboard can be just as effective for stealing logins/passwords and other data.

Cimpanu suggests that hardware keyloggers can be had on Amazon or eBay for as little as $20.

I’m not sure when he looked but a search today shows the cheapest price on Amazon is $52.59 and on eBay $29.79. Check for current pricing.

I haven’t used it but the Keyllama 4MB USB Value Keylogger has an attractive form factor (1.6″) at $55.50.

USB keyloggers (there are software keyloggers) require physical access for installation and retrieval.

You can attempt to play your favorite spy character or you can identify the cleaning service used by your target. Turnover in the cleaning business runs from 75 percent to 400 percent so finding or inserting a confederate is only a matter of time.

USB keyloggers aren’t necessary at the NSA as logins/passwords are available for the asking. (Snowden)

October 19, 2017

Gender Discrimination and Pew – The Obvious and Fake News

Filed under: Fake News,Feminism,Journalism,News — Patrick Durusau @ 9:07 pm

Women are more concerned than men about gender discrimination in tech industry by Kim Parker and Cary Funk.

From the post:

Women in the U.S. are substantially more likely than men to say gender discrimination is a major problem in the technology industry, according to a Pew Research Center survey conducted in July and August.

The survey comes amid public debate about underrepresentation and treatment of women – as well as racial and ethnic minorities – in the industry. Critics of Silicon Valley have cited high-profile cases as evidence that the industry has fostered a hostile workplace culture. For their part, tech companies point to their commitment to increasing workforce diversity, even as some employees claim the industry is increasingly hostile to white males.

Was Pew repeating old news?

Well, Vogue: New Study Finds Gender Discrimination in the Tech Industry Is Still Sky-High (2016), Forbes: The Lack Of Diversity In Tech Is A Cultural Issue (2015), Gender Discrimination and Sexism in the Technology Industry (2014), Women Matter (2013), to cite only a few of the literally thousands of studies and surveys, onto which to stack the repetitive Pew report.

Why waste Pew funds to repeat what was commonly known and demonstrated by published research?

One not very generous explanation is the survey provided an opportunity to repeat “fake news.” You know, news that gets repeated so often that you don’t remember its source but it has credibility because you hear it so often?

“Fake news,” is the correct category for:

…even as some employees claim the industry is increasingly hostile to white males.

Repeating that claim in a Pew publication legitimates the equivalent of cries coming from an asylum.

One quick quote from Forbes, hardly a bastion of forward social thinking, dispels the “hostile to white male” fantasy, The Lack Of Diversity In Tech Is A Cultural Issue:


It has been a commonly held belief that the gender gap in tech is primarily a pipeline issue; that there are simply not enough girls studying math and science. Recently updated information indicates an equal number of high school girls and boys participating in STEM electives, and at Stanford and Berkeley, 50% of the introductory computer science students are women. That may be the case, but the U.S. Census Bureau reported last year that twice as many men as women with the same qualifications were working in STEM fields.

A USA Today study discloses that top universities graduate black and Hispanic computer science and computer engineering students at twice the rate that leading technology companies hire them. Although these companies state they don’t have a qualified pool of applicants, the evidence does not support that claim.

When 2/3 of the workers in a field are male, it strains the imagination to credit claims of “hostility.”

I have no fact-based explanation for the reports of “hostility” to white males.

Speculations abound: perhaps they are so obnoxious that even other males can’t stand them? Perhaps they are using “hostility” as a cover for incompetence? Who knows?

What is known is that money is needed to address sexism in the workplace (not repeating the research of others) and fake news such as “hostile to white males” should not be repeated by reputable sources, like Pew.

October 18, 2017

Fake News, Facts, and Alternative Facts – Danger of Inaccurate News (Spoiler – Trump)

Filed under: Journalism,News — Patrick Durusau @ 1:33 pm

Why Inaccurate News is a Threat by Josh Pasek.

Pasek’s clip is part of the larger Fake News, Facts, and Alternative Facts.

Pasek uses a couple of examples from the 2016 presidential campaign to conclude:


So what we end up with, then, is an environment where we have an ideal news consumer or even a suboptimal news consumer. And what can happen as they get and interact with inaccurate information, is they come to a point where their views and the way that they start voting, making decisions, etc., can be based on something that’s wrong. And that, in turn, can mean that we elect people who aren’t necessarily the candidates that will best enact what people want. That people end up saying that they’re for a particular thing. When, in fact, if they knew more about it, they’d be against it. And those sorts of biases can be hugely pernicious to a democracy that successfully represents what it is that its people want.

Pasek has decided “inaccurate information” resulted in the election of Donald Trump and that’s his proof of the danger of inaccurate news.

If you remember his earlier comments about inference, his case runs like this:

  • There was inaccurate information reported in the media during the 2016 presidential election.
  • Therefore inaccurate information was responsible for the election of Donald Trump.

I don’t doubt inaccurate information was circulating during the 2016 presidential election, but it’s a terrifying leap from the presence of inaccurate information to crediting a presidential election to that single cause.

Especially without asking: inaccurate information as compared to how much accurate information? How many voters were influenced, and to what degree? Toward which candidate were they pushed? In which states? What other factors impacted voters, and to what degree did those factors influence them?

Without deeper and more complex analysis of voters and their voting behavior, claims that inaccurate information was in circulation, while factually true, are akin to saying the sun rose and set on election day, 2016. True, but its impact on the election is far from certain.

Fake News, Facts, and Alternative Facts – Claims vs. Deductions

Filed under: Journalism,News — Patrick Durusau @ 10:40 am

Auto-grading for the first quiz in Fake News, Facts, and Alternative Facts marked my responses as incorrect for:

On the contrary, in a news report, both:

  • “In a survey of Americans, Democrats were more likely than Republicans to believe that September 11th was a government cover-up.”
  • “Scientists have looked for a potential link between vaccinations and autism and cannot find any evidence across multiple epidemiological studies.”

are claims by the person reporting that information.

You have no doubt heard surveys show a majority of Americans favor gun control. Would your opinion about those reports change if you knew the survey asked: “Do you think convicted murderers should be allowed to own guns?” Prohibiting gun ownership by convicted murderers is a form of gun control.

Knowing the questions asked in a survey, how respondents were selected, the method of conducting the survey and a host of other information is necessary before treating any report of a survey as anything other than a claim. You have no way of knowing if a reporter knew any more about the survey than the statement shown in the test. That’s a claim, not “systematically derived evidence … [that] reflects deductive testing using the scientific method.”

The claim about scientists and a link between vaccinations and autism is even weaker. Notice you are given the reporter’s conclusion about a report by scientists and not the report per se. You have no way to evaluate the reporter’s claim by examining the article: what “multiple epidemiological studies” were compared, out of a universe of how many other “epidemiological studies,” in which countries, etc.

I don’t doubt the absence of such a connection but “summarizes deductive evidence that was generated to specifically and rigorously evaluate a particular question. It reflects deductive testing using the scientific method” is an attempt to dress the claim by a reporter in the garb of what may or may not be true for the scientific study.

Reporting a scientific study isn’t the same thing as a scientific study. A scientific study can be evaluated, questioned, etc., all things that a bare report, a “claim” in my view, cannot.

Every report of a scientific study should link or give a standard reference to the scientific study. Reports that don’t, I skip and you should as well.

October 17, 2017

Thinking Critically About “Fake News, Facts, and Alternative Facts” (Coursera)

Filed under: Critical Reading,Journalism,News — Patrick Durusau @ 3:17 pm

Fake News, Facts, and Alternative Facts by Will Potter, Josh Pasek, and Brian Weeks.

From “About this course:”

How can you distinguish credible information from “fake news”? Reliable information is at the heart of what makes an effective democracy, yet many people find it harder to differentiate good journalism from propaganda. Increasingly, inaccurate information is shared on Facebook and echoed by a growing number of explicitly partisan news outlets. This becomes more problematic because people have a tendency to accept agreeable messages over challenging claims, even if the former are less objectively credible. In this teach-out, we examine the processes that generate both accurate and inaccurate news stories, and that lead people to believe those stories. We then provide a series of tools that ordinary citizens can use to tell fact from fiction.

To honor the exhortations “use critical thinking,” here are some critical thoughts on course description for “Fake News, Facts, and Alternative Facts.”

How can you distinguish credible information from “fake news”?

The description starts with black-and-white, normative classifications: one good, “credible information,” and one bad, “fake news.” Information, other than being alive or dead, is rarely that clear-cut. As Tom Petty recently proved, even being dead can be questionable.

You are being emotionally primed to choose “credible information,” as opposed to evaluating information to determine the degree, if any, it should be trusted or used.

Reliable information is at the heart of what makes an effective democracy,

A remarkable claim, often repeated but I have never seen any empirical evidence for that proposition. In critical thinking terms, you would first have to define “reliable information” and “effective democracy.” Then using those definitions, provide empirical evidence to prove that in the absence of “reliable information” democracy is ineffective and with “reliable information” democracy is effective.

It’s an easy claim to make, but in the context of a critical thinking course, isn’t more required than repeating popular cant?

I’ll grant many theories of democracy are predicated upon “reliable information,” but then those theories also posit equal treatment of all citizens, another popular fiction.

yet many people find it harder to differentiate good journalism from propaganda.

As opposed to when? What is the baseline for when people could more easily “…differentiate good journalism from propaganda…?” Whenever you hear this claim made, press for the study with evidence to prove this point.

You do realize anyone claiming such a position considers themselves capable of making those distinctions, and you are very likely in the class of people who cannot. In traditional terminology, that’s called having a bias: in favor of their judgment as opposed to yours.

Increasingly, inaccurate information is shared on Facebook and echoed by a growing number of explicitly partisan news outlets.

You know the factual objections by this point: what documentation is there for an increase in “inaccurate information” (is that the same as false information?), and over what period? When was there less inaccurate information? Moreover, when were there fewer “explicitly partisan news outlets?”

By way of example, consider these statements about Jefferson during the presidential election in 1800:


In the election of 1800, ministers spread rumors that Jefferson held worship services at Monticello where he prayed to the “Goddess of Reason” and sacrificed dogs on an altar. Yale University president Timothy Dwight warned that if he became president, “we may see the Bible cast into a bonfire.” Alexander Hamilton asked the governor of New York to take a “legal and constitutional step” to stop the supposed atheist vice president from becoming head of state. Federalists who opposed him called him a “howling atheist,” a “manifest enemy to the religion of Christ,” a “hardened infidel,” and, for good measure, a “French infidel.” As Smith describes it, insults like these were issued forth from hundreds of pulpits in New England and the mid-Atlantic. When Jefferson won the election, many New England Federalists buried their Bibles in their gardens so the new administration would not confiscate and burn them.

It may just be me but it sounds like there was “inaccurate information” and “explicitly partisan news outlets” available during the presidential election of 1800.

When anyone claims there is more “inaccurate information” or “explicitly partisan news outlets,” ask for percentage evidence against some base period.

Surely if they are devoted to “credible/reliable information,” they would not make such statements in the absence of facts to back them up. Yes?

This becomes more problematic because people have a tendency to accept agreeable messages over challenging claims, even if the former are less objectively credible.

People accepting messages they find agreeable is a statement of how people process information (Thinking, Fast and Slow, Kahneman).

The claim goes off the rails with “…even if the former are less objectively credible.”

Where does “…less objectively credible.” come from? It’s a nice sleight of hand but never fall for anyone claiming an “objective” context. It doesn’t, hasn’t and won’t ever exist.

You can make claims from the context of a community of people, scholars, experts, etc.; that is, every claim originates in shared values and a worldview. (See Stanley Fish if you are interested in the objectivity issue.)

As with all such claims, the authors have criteria for “objectively credible” they want you to use in preference to criteria suggested by others.

There’s nothing wrong with advocating particular criteria for judging information; we can all do no more or less. What I object to is cloaking those criteria in the fiction of being beyond any context, of being “objective.” Let us all put forth our criteria and contend for which should be preferred on an equal footing.

In this teach-out, we examine the processes that generate both accurate and inaccurate news stories, and that lead people to believe those stories. We then provide a series of tools that ordinary citizens can use to tell fact from fiction.

I can almost buy into “accurate” versus “inaccurate” news stories but then I’m promised “tools” to enable me to “…tell fact from fiction.”

Hmmm, but “Who is this class for:” promises:

This course is aimed at anyone who wants to distinguish credible news from “Fake News” by learning to identify biases and become a critical information consumer.

I don’t read “…learning to identify biases…” as being the same thing as “…tools…to tell fact from fiction.”

The latter sounds more like someone telling me which is fact and which is fiction. Not the same as judging on my own.

I’m enrolling in the course now and will have more comments along the way.

The crucial point here is that “critical thinking” should be universally applied, especially so to discussions of critical thinking.

Tor Keeps You Off #KRACK

Filed under: Cybersecurity,Security,Tor — Patrick Durusau @ 12:44 pm

You have seen the scrambling to address KRACK (Key Reinstallation Attack), a weakness in the WPA2 protocol. Serious flaw in WPA2 protocol lets attackers intercept passwords and much more by Dan Goodin and Falling through the KRACKs by Matthew Green are two highly informative and amusing posts out of literally dozens on KRACK.

I won’t repeat their analysis here but wanted to point out Tor users are immune from KRACK, patched or not: Tor encrypts its traffic independently of WPA2.

A teaching moment to educate users about Tor!

October 16, 2017

Unicode Egyptian Hieroglyphic Fonts

Filed under: Ancient World,Fonts,Language,Unicode — Patrick Durusau @ 8:57 pm

Unicode Egyptian Hieroglyphic Fonts by Bob Richmond.

From the webpage:

These fonts all contain the Unicode 5.2 (2009) basic set of Egyptian Hieroglyphs.

Please contact me if you know of any others, or information to include.

Also of interest:

UMdC Coding Manual for Egyptian Hieroglyphic in Unicode

UMdC (Unicode MdC) aims to provide guidelines for encoding Egyptian Hieroglyphic and related scripts in Unicode using plain text with optional lightweight mark-up.

This GitHub project is the central point for development of UMdC and associated resources. Features of UMdC are still in a discussion phase so everything here should be regarded as preliminary and subject to change. As such the project is initially oriented towards expert Egyptologists and software developers who wish to help ensure the ancient Egyptian writing system is well supported in modern digital media.

The Manuel de Codage (MdC) system for digital encoding of Ancient Egyptian textual data was adopted as an informal standard in the 1980s and has formed the basis for most subsequent digital encodings, sometimes using extensions or revisions to the original scheme. UMdC links to the traditional methodology in various ways to help with the transition to Unicode-based solutions.

As with the original MdC system, UMdC data files (.umdc) can be viewed and edited in standard text editors (such as Windows Notepad) and the HTML <textarea></textarea> control. Specialist software applications can be adapted or developed to provide a simpler workflow or enable additional techniques for working with the material.

Also see UMdC overview [pdf].

A UMdC-compatible hieroglyphic font Aaron UMdC Alpha (relative to the current draft) can be downloaded from the Hieroglyphs Everywhere Fonts project.

For news and information on Ancient Egyptian in Unicode see https://hieroglyphseverywhere.blogspot.co.uk/.
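For readers who want to poke at the basic set directly: the Unicode 5.2 Egyptian Hieroglyphs block occupies U+13000 through U+1342F, and any modern Python build can enumerate it.

```python
import unicodedata

# The Egyptian Hieroglyphs block, added in Unicode 5.2.
BLOCK_START, BLOCK_END = 0x13000, 0x1342F

first = chr(BLOCK_START)
print(f"U+{BLOCK_START:05X} {unicodedata.name(first)}")  # EGYPTIAN HIEROGLYPH A001

# Count how many code points in the block are assigned in this
# Python build's Unicode data (exact count varies by Unicode version).
assigned = sum(
    1
    for cp in range(BLOCK_START, BLOCK_END + 1)
    if unicodedata.name(chr(cp), "") != ""
)
print(f"{assigned} hieroglyphs assigned in the block")
```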

I understand the need for “plain text” viewing of hieroglyphics, especially for primers and possibly for search engines, but Egyptian hieroglyphs can be written facing right or left, top to bottom and more rarely bottom to top. Moreover, artistic and other considerations can result in transposition of glyphs out of their “linear” order in a Western reading sense.

Unicode hieroglyphs are a major step forward for the interchange of hieroglyphic texts but we should remain mindful “linear” presentation of inscription texts is a far cry from their originals.

The greater our capacity for graphic representation, the more we simplify complex representations from the past. Are the needs of our computers really that important?

October 13, 2017

A cRyptic crossword with an R twist

Filed under: Games,Humor,R — Patrick Durusau @ 3:14 pm

A cRyptic crossword with an R twist

From the post:

Last week’s R-themed crossword from R-Ladies DC was popular, so here’s another R-related crossword, this time by Barry Rowlingson and published on page 39 of the June 2003 issue of R-news (now known as the R Journal). Unlike the last crossword, this one follows the conventions of a British cryptic crossword: the grid is symmetrical, and eschews 4×4 blocks of white or black squares. Most importantly, the clues are in the cryptic style: rather than being a direct definition, cryptic clues pair wordplay (homonyms, anagrams, etc) with a hidden definition. (Wikipedia has a good introduction to the types of clues you’re likely to find.) Cryptic crosswords can be frustrating for the uninitiated, but are fun and rewarding once you get into it.

In fact, if you’re unfamiliar with cryptic crosswords, this one is a great place to start. Not only are many (but not all) of the answers related in some way to R, Barry has helpfully provided the answers along with an explanation of how the cryptic clue was formed. There’s no shame in peeking, at least for a few, to help you get your legs with the cryptic style.

Another R crossword for your weekend enjoyment!

Enjoy!
