Another Word For It: Patrick Durusau on Topic Maps and Semantic Diversity

June 13, 2012

Faster Ranking As A Goal?

Filed under: PageRank, Quantum — Patrick Durusau @ 1:25 pm

When I read in Quantum Computers Could Help Search Engines Keep Up With the Internet’s Growth:

Most people don’t think twice about how Internet search engines work. You type in a word or phrase, hit enter, and poof — a list of web pages pops up, organized by relevance.

Behind the scenes, a lot of math goes into figuring out exactly what qualifies as the most relevant web page for your search. Google, for example, uses a page ranking algorithm that is rumored to be the largest numerical calculation carried out anywhere in the world. With the web constantly expanding, researchers at USC have proposed — and demonstrated the feasibility of — using quantum computers to speed up that process.

“This work is about trying to speed up the way we search on the web,” said Daniel Lidar, corresponding author of a paper on the research that appeared in the journal Physical Review Letters on June 4.

As the Internet continues to grow, the time and resources needed to run the calculation — which is done daily — grow with it, Lidar said.

I thought of my post earlier today about inexact computing and how our semantics are inexact. (On the value of being inexact)

Is quantum computing going to help us be more exact, more quickly? If our semantics are inexact to begin with, a faster exact calculation just reaches a wrong answer sooner.

I am not sure what the advantage of being wrong more quickly would be. Do you?


The full reference:

Silvano Garnerone, Paolo Zanardi, and Daniel Lidar. Adiabatic Quantum Algorithm for Search Engine Ranking. Physical Review Letters 108, 230506 (2012). DOI: 10.1103/PhysRevLett.108.230506

A chance discovery of an interesting journal feature:

Abstract:

We propose an adiabatic quantum algorithm for generating a quantum pure state encoding of the PageRank vector, the most widely used tool in ranking the relative importance of internet pages. We present extensive numerical simulations which provide evidence that this algorithm can prepare the quantum PageRank state in a time which, on average, scales polylogarithmically in the number of web pages. We argue that the main topological feature of the underlying web graph allowing for such a scaling is the out-degree distribution. The top-ranked log(n) entries of the quantum PageRank state can then be estimated with a polynomial quantum speed-up. Moreover, the quantum PageRank state can be used in “q-sampling” protocols for testing properties of distributions, which require exponentially fewer measurements than all classical schemes designed for the same task. This can be used to decide whether to run a classical update of the PageRank.
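That “quantum pure state encoding of the PageRank vector” is the heart of the result. Here is a minimal numerical sketch (mine, not the paper's) of why such an encoding favors the top-ranked pages, under the assumption that the state carries the L2-normalized PageRank entries as amplitudes:

```python
# Toy illustration (not the paper's algorithm), assuming the quantum PageRank
# state stores the L2-normalized PageRank entries as amplitudes. Measuring it
# then returns page i with probability pi_i^2 / sum_j pi_j^2, which
# over-weights the top-ranked pages relative to the classical vector.
import numpy as np

pi = np.array([0.38, 0.22, 0.15, 0.15, 0.10])  # made-up PageRank vector

amplitudes = pi / np.linalg.norm(pi)  # pure-state encoding of pi
born_probs = amplitudes ** 2          # Born-rule measurement probabilities

for i, (classical, measured) in enumerate(zip(pi, born_probs)):
    print(f"page {i}: PageRank {classical:.2f} -> measured with prob {measured:.3f}")
```

Measurement then samples each page with probability proportional to the square of its PageRank, over-weighting exactly the top-ranked log(n) entries the abstract says can be estimated with a polynomial speed-up.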

Physics Synopsis:

Although quantum computing has only been demonstrated for small calculations so far, researchers are interested in finding problems where its potentially massive parallelism would pay off if scaled-up versions can be made. In Physical Review Letters, Silvano Garnerone of the Institute for Quantum Computing at the University of Waterloo, Canada, and colleagues simulate the speedup achieved by using a quantum approach to rank websites.

The PageRank method, implemented by Google, assigns each website a score based on how many other sites link to it and what their scores are. Starting with an enormous matrix that represents which sites link to which others, the algorithm evaluates the probability that a steady stream of surfers starting at random sites and following random links will be found at each site. This information helps determine which search results should be listed highest. The PageRank calculation currently requires a time that is roughly proportional to the number of sites. This slowdown with size is not as bad as for many complex problems, but it can still take many days to rank the entire worldwide web.
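To make that random-surfer calculation concrete, here is a minimal power-iteration sketch (mine, with a made-up four-page link graph and the customary 0.85 damping factor, neither taken from the paper):

```python
# Minimal sketch of the classical PageRank computation the synopsis describes:
# power iteration on the (damped) link matrix. The link graph is made up.
import numpy as np

links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}  # page -> pages it links to
n, damping = 4, 0.85

# Column-stochastic matrix: column j spreads page j's rank over its out-links.
M = np.zeros((n, n))
for j, targets in links.items():
    for i in targets:
        M[i, j] = 1.0 / len(targets)

rank = np.full(n, 1.0 / n)  # surfers start uniformly at random
for _ in range(100):        # iterate until the distribution settles
    rank = (1 - damping) / n + damping * M @ rank

print(rank)  # steady-state probability of finding a surfer on each page
```

Each iteration costs time proportional to the number of links, which is where the “roughly proportional to the number of sites” scaling comes from.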

Garnerone and colleagues propose an approach to page ranking that uses an “adiabatic quantum algorithm,” in which a simple matrix with a known solution is gradually transformed into the real problem, producing the desired solution. They simulated many relatively small networks that had similar link topology to the worldwide web, and found that reconstructing and reading out the most relevant part of the PageRank required a time that grows more slowly than the best classical algorithms available. – Don Monroe
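And a small numerical cartoon (mine, not the authors' construction) of that adiabatic recipe: start in the known ground state of a simple Hamiltonian H0, evolve under the interpolation H(s) = (1 - s)H0 + sH1 as s creeps from 0 to 1, and, if the sweep is slow enough, end up near the ground state of the “problem” Hamiltonian H1:

```python
# Numerical cartoon of an adiabatic sweep, not the paper's Hamiltonians:
# evolve a state under H(s) = (1 - s)*H0 + s*H1 while s goes slowly 0 -> 1.
import numpy as np

rng = np.random.default_rng(1)
n, steps, T = 4, 2000, 200.0             # large T = slow (adiabatic) sweep
dt = T / steps

H0 = np.diag(np.arange(n, dtype=float))  # simple start: ground state is e_0
A = rng.normal(size=(n, n))
H1 = A.T @ A                             # toy "problem" Hamiltonian

psi = np.zeros(n, dtype=complex)
psi[0] = 1.0                             # begin in H0's known ground state
for k in range(steps):
    s = (k + 0.5) / steps
    H = (1 - s) * H0 + s * H1
    vals, vecs = np.linalg.eigh(H)       # apply exp(-i H dt) via eigenbasis
    psi = vecs @ (np.exp(-1j * vals * dt) * (vecs.conj().T @ psi))

target = np.linalg.eigh(H1)[1][:, 0]     # true ground state of the problem
print("overlap with H1 ground state:", abs(target.conj() @ psi))  # ~1 if slow
```

The slower the sweep, the closer that overlap gets to 1. The paper's claim is that for web-like graphs the sweep that prepares the PageRank state can be fast (polylogarithmic in the number of pages) rather than having to slow down as the graph grows.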

That looks like a really cool feature to me.

Abstract for the initiated; synopsis for those who may be interested.

Are there IR/KD/etc. journals following that model?

Seems like a good way to create “trading zones” where we will become aware of work in other areas.
