Another Word For It: Patrick Durusau on Topic Maps and Semantic Diversity

May 23, 2017

The power of algorithms and how to investigate them (w/ resources)

Filed under: Algorithms, Journalism, News, Reporting — Patrick Durusau @ 2:03 pm

The power of algorithms and how to investigate them by Katrien Vanherck.

From the post:

Most Americans these days get their main news from Google or Facebook, two tools that rely heavily on algorithms. A study in 2015 showed that the way a search engine like Google selects and prioritises search results on political candidates can have an influence on voters’ preferences.

Similarly, it has been shown that by tweaking the algorithms behind the Facebook newsfeed, the turnout of voters in American elections can be influenced. If Mark Zuckerberg were ever to run for president, he would theoretically have an enormously powerful tool at his disposal. (Note: a recent article in The Guardian investigated the misuse of big data and social media in the context of the Brexit referendum.)

Algorithms are everywhere in our everyday life and are exerting a lot of power in our society. They prioritise, classify, connect and filter information, automatically making decisions on our behalf all the time. But as long as the algorithms remain a ‘black box’, we don’t know exactly how these decisions are made.

Are these algorithms always fair? Examples of possible racial bias in algorithms include the risk analysis score that is calculated for prisoners who are up for parole or release (white people appear to get more favourable scores more often) and the service quality of Uber in Washington DC (waiting times are shorter in predominantly white neighbourhoods). Maybe such unfair results are not due to the algorithms alone, but the lack of transparency remains a concern.

So what is going on in these algorithms, and how can we make them more accountable?
… (emphasis in original)

A great, inspirational keynote, but it is short on details for actually investigating algorithms.

Such as failing to mention that the algorithms of both Google and Facebook are secret.

Reverse engineering those from results would be a neat trick.

Google would be the easier of the two, since you could script searches domain by domain with a list of search terms to build up a data set of its results. That would not recover the algorithm per se, but you could detect some of its contours.
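One hedged sketch of what that scripting might look like, assuming you have a Google Custom Search API key and search engine ID (placeholders below), and keeping in mind that API results will not exactly match what a logged-in user sees on google.com:

    # Sketch: collect Google results for a list of terms, domain by domain,
    # via the Custom Search JSON API. API_KEY and ENGINE_ID are placeholders
    # you must supply; results may differ from personalized google.com results.
    import json
    import time
    import requests

    API_KEY = "YOUR_API_KEY"      # placeholder
    ENGINE_ID = "YOUR_ENGINE_ID"  # placeholder

    TERMS = ["hillary clinton", "donald trump", "brexit"]
    DOMAINS = ["nytimes.com", "foxnews.com", "breitbart.com"]

    def search(term, domain):
        """Return the top results for `term`, restricted to `domain`."""
        resp = requests.get(
            "https://www.googleapis.com/customsearch/v1",
            params={"key": API_KEY, "cx": ENGINE_ID, "q": f"{term} site:{domain}"},
            timeout=30,
        )
        resp.raise_for_status()
        return [
            {"term": term, "domain": domain, "rank": i + 1,
             "title": item.get("title"), "link": item.get("link")}
            for i, item in enumerate(resp.json().get("items", []))
        ]

    if __name__ == "__main__":
        dataset = []
        for term in TERMS:
            for domain in DOMAINS:
                dataset.extend(search(term, domain))
                time.sleep(1)  # stay well under the API's query limits
        with open("google_results.json", "w") as f:
            json.dump(dataset, f, indent=2)

Run the same term list on a schedule and diff the ranked results over time; that is where the contours start to show.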

Google has been accused of liberal bias ("Who would Google vote for? An analysis of political bias in internet search engine results"), of bias in favor of Hillary Clinton ("Google defends its search engine against charges it favors Clinton"), and of bias in favor of the right wing ("How Google's search algorithm spreads false information with a rightwing bias").

To the extent you identify Hillary Clinton with the right wing, those results may be expressions of the same bias.

In any event, you can discern from those studies some likely techniques to use in testing Google search/auto-completion results.
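On the auto-completion side, one quick comparison is to pull suggestions for paired prompts and put them side by side. The suggest endpoint below is unofficial and undocumented, so treat it as an assumption that may change or be blocked at any time:

    # Sketch: compare Google's auto-completion suggestions for paired prompts.
    # The suggest endpoint is unofficial; its response shape is observed, not guaranteed.
    import requests

    def suggestions(prompt):
        resp = requests.get(
            "https://suggestqueries.google.com/complete/search",
            params={"client": "firefox", "q": prompt},
            timeout=30,
        )
        resp.raise_for_status()
        # Observed shape: [prompt, [suggestion, suggestion, ...]]
        return resp.json()[1]

    for prompt in ("hillary clinton is", "donald trump is"):
        print(prompt)
        for s in suggestions(prompt):
            print("   ", s)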

Facebook would be harder because you don't have access to or control over the content it is manipulating for delivery. By creating and manipulating several social media identities, though, you could test and compare the content that Facebook delivers to each.
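There is no public API that hands you another account's ranked newsfeed, so any comparison starts from content you log yourself for each test identity. As a minimal sketch, assuming you have recorded the IDs (or URLs) of the posts shown to each persona over the same time window, measuring how much their feeds overlap is simple:

    # Sketch: compare the posts Facebook delivered to different test identities.
    # Assumes you have logged, per persona, the IDs/URLs of posts shown to that
    # account over the same period; the sets below are placeholder data.
    from itertools import combinations

    feeds = {
        "persona_left":  {"post_1", "post_2", "post_3", "post_4"},
        "persona_right": {"post_3", "post_4", "post_5", "post_6"},
        "persona_blank": {"post_2", "post_3", "post_7"},
    }

    def jaccard(a, b):
        """Overlap between two sets of delivered posts (1.0 = identical feeds)."""
        return len(a & b) / len(a | b) if (a | b) else 0.0

    for (name_a, feed_a), (name_b, feed_b) in combinations(feeds.items(), 2):
        print(f"{name_a} vs {name_b}: overlap {jaccard(feed_a, feed_b):.2f}")

Large, systematic differences in overlap between personas that differ only in their stated interests or demographics would be a place to start asking questions.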
