Another Word For It: Patrick Durusau on Topic Maps and Semantic Diversity

December 9, 2017

Lisp at the Frontier of Computation

Filed under: Computer Science,Lisp,Quantum — Patrick Durusau @ 10:18 am

Abstract:

Since the 1950s, Lisp has been used to describe and calculate in cutting-edge fields like artificial intelligence, robotics, symbolic mathematics, and advanced optimizing compilers. It is no surprise that Lisp has also found relevance in quantum computation, both in academia and industry. Hosted at Rigetti Computing, a quantum computing startup in Berkeley, Robert Smith will provide a pragmatic view of the technical, sociological, and psychological aspects of working with an interdisciplinary team, writing Lisp, to build the next generation of technology resource: the quantum computer.

ABOUT THE SPEAKER: Robert has been using Lisp for over a decade, and has been fortunate to work with and manage expert teams of Lisp programmers to build embedded fingerprint analysis systems, machine learning-based product recommendation software, metamaterial phased-array antennas, discrete differential geometric computer graphics software, and now quantum computers. As Director of Software Engineering, Robert is responsible for building the publicly available Rigetti Forest platform, powered by both a real quantum computer and one of the fastest single-node quantum computer simulators in the world.

Video notes mention “poor audio quality.” Not the best but clear and audible to me.

The coverage of the quantum computing work is great, but the talk is mostly a general promotion of Lisp.

Important links:

Forest (beta): Forest provides development access to our 30-qubit simulator, the Quantum Virtual Machine™, and limited access to our quantum hardware systems for select partners. Workshop video plus numerous other resources.

A Practical Quantum Instruction Set Architecture by Robert S. Smith, Michael J. Curtis, William J. Zeng. (speaker plus two of his colleagues)
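If you want a feel for what the Quantum Virtual Machine is simulating before you sign up, the first circuit everyone runs is a Bell pair: Hadamard, then CNOT. Here is a minimal statevector sketch in plain Python/NumPy, not Quil or pyQuil, just my own illustrative scaffolding around the standard gates:

```python
import numpy as np

# Single-qubit gates as 2x2 unitaries.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)
I = np.eye(2)

# CNOT on two qubits (control = qubit 0, target = qubit 1),
# basis order |q0 q1> = 00, 01, 10, 11.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start in |00>.
state = np.zeros(4)
state[0] = 1.0

# Apply H to qubit 0, then CNOT(0 -> 1): the Bell state (|00> + |11>)/sqrt(2).
state = CNOT @ np.kron(H, I) @ state

# Measurement probabilities for the four basis states.
probs = np.abs(state) ** 2
for label, p in zip(["00", "01", "10", "11"], probs):
    print(label, round(p, 3))   # expect 0.5, 0.0, 0.0, 0.5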

January 18, 2017

Quantum Computer Resistant Encryption

Filed under: Cryptography,Quantum,Security — Patrick Durusau @ 10:37 am

Irish Teen Introduces New Encryption System Resistant to Quantum Computers by Joseph Young.

From the post:


… a 16-year-old student was named as Ireland’s top young scientist and technologist of 2017, after demonstrating the application of qCrypt, which offers higher levels of protection, privacy and encryption in comparison to other innovative and widely-used cryptographic systems.

BT Young Scientist judge John Dunnion, an associate professor at University College Dublin, praised Curran’s project, which foresaw the impact quantum computing will have on current cryptographic and encryption methods.

“qCrypt is a novel distributed data storage system that provides greater protection for user data than is currently available. It addresses a number of shortfalls of current data encryption systems; in particular, the algorithm used in the system has been demonstrated to be resistant to attacks by quantum computers in the future,” said Dunnion.

While it may be too early to predict whether technologies like qCrypt can protect existing encryption methods and data protection systems from quantum computers, Curran and the judges of the competition saw promising potential in the technology.

Word is spreading rapidly.

qCrypt has a place-holder website, Post-Quantum Cryptography for the Masses.

A YouTube video:

Shane’s GitHub repository (no qCrypt, yet)

Not to mention Shane’s website.

qCrypt has the potential to provide safety from government surveillance for everyone, everywhere.

Looking forward to this!

January 13, 2017

D-Wave Just Open-Sourced Quantum Computing [DC Beltway Parking Lot Distraction]

Filed under: Computer Science,Quantum — Patrick Durusau @ 9:10 pm

D-Wave Just Open-Sourced Quantum Computing by Dom Galeon.

D-Wave has just released a welcome distraction for CS types sitting in the DC Beltway Parking Lot on January 20-21, 2017. (I’m assuming you brought extra batteries for your laptop.) After you run out of gas, your laptop will be running on battery power alone.

Just remember to grab a copy of Qbsolv before you leave for the tailgate/parking lot party on the Beltway.

A software tool known as Qbsolv allows developers to program D-Wave’s quantum computers even without knowledge of quantum computing. It has already made it possible for D-Wave to work with a bunch of partners, but the company wants more. “D-Wave is driving the hardware forward,” Bo Ewald, president of D-Wave International, told Wired. “But we need more smart people thinking about applications, and another set thinking about software tools.”

To that end, D-Wave has open-sourced Qbsolv, making it possible for anyone to freely share and modify the software. D-Wave hopes to build an open source community of sorts for quantum computing. Of course, to actually run this software, you’d need access to a piece of hardware that uses quantum particles, like one of D-Wave’s quantum computers. However, for the many who don’t have that access, the company is making it possible to download a D-Wave simulator that can be used to test Qbsolv on other types of computers.

This open-source Qbsolv joins an already-existing free software tool called Qmasm, which was developed by one of Qbsolv’s first users, Scott Pakin of Los Alamos National Laboratory. “Not everyone in the computer science community realizes the potential impact of quantum computing,” said mathematician Fred Glover, who’s been working with Qbsolv. “Qbsolv offers a tool that can make this impact graphically visible, by getting researchers and practitioners involved in charting the future directions of quantum computing developments.”

D-Wave’s machines might still be limited to solving optimization problems, but it’s a good place to start with quantum computers. Together with D-Wave, IBM has managed to develop its own working quantum computer in 2000, while Google teamed up with NASA to make their own. Eventually, we’ll have a quantum computer that’s capable of performing all kinds of advanced computing problems, and now you can help make that happen.

From the github page:

qbsolv is a metaheuristic or partitioning solver that solves a potentially large quadratic unconstrained binary optimization (QUBO) problem by splitting it into pieces that are solved either on a D-Wave system or via a classical tabu solver.

The phrase, “…might still be limited to solving optimization problems…” isn’t as limiting as it might appear.
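For anyone meeting QUBO for the first time: the problem is simply to pick a binary vector x that minimizes x^T Q x for a given matrix Q. Qbsolv earns its keep by partitioning large instances; a toy instance you can just enumerate. A minimal sketch, with a Q matrix made up for illustration:

```python
import itertools
import numpy as np

def solve_qubo_brute_force(Q):
    """Enumerate all binary vectors x and return the one minimizing x^T Q x.

    Fine for a handful of variables; Qbsolv exists because real instances
    are far too large for this.
    """
    n = Q.shape[0]
    best_x, best_val = None, float("inf")
    for bits in itertools.product([0, 1], repeat=n):
        x = np.array(bits)
        val = x @ Q @ x
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

# Toy instance: diagonal terms reward picking a variable,
# off-diagonal terms penalize picking certain pairs together.
Q = np.array([[-3,  2,  0],
              [ 2, -2,  1],
              [ 0,  1, -1]])

x, val = solve_qubo_brute_force(Q)
print("best assignment:", x, "objective:", val)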

A recent (2014) survey of quadratic unconstrained binary optimization (QUBO), The Unconstrained Binary Quadratic Programming Problem: A Survey, runs some thirty-three pages and should keep you occupied however long you sit on the DC Beltway.

From page 10 of the survey:


Kochenberger, Glover, Alidaee, and Wang (2005) examine the use of UBQP as a tool for clustering microarray data into groups with high degrees of similarity.

Where I read one person’s “similarity” to be another person’s test of “subject identity.”

PS: Enjoy the DC Beltway. You may never see it motionless ever again.

December 1, 2016

If You Don’t Get A Quantum Computer For Christmas

Filed under: Haskell,Physics,Quantum — Patrick Durusau @ 1:38 pm

Learn Quantum Mechanics with Haskell by Scott N. Walck.

Abstract:

To learn quantum mechanics, one must become adept in the use of various mathematical structures that make up the theory; one must also become familiar with some basic laboratory experiments that the theory is designed to explain. The laboratory ideas are naturally expressed in one language, and the theoretical ideas in another. We present a method for learning quantum mechanics that begins with a laboratory language for the description and simulation of simple but essential laboratory experiments, so that students can gain some intuition about the phenomena that a theory of quantum mechanics needs to explain. Then, in parallel with the introduction of the mathematical framework on which quantum mechanics is based, we introduce a calculational language for describing important mathematical objects and operations, allowing students to do calculations in quantum mechanics, including calculations that cannot be done by hand. Finally, we ask students to use the calculational language to implement a simplified version of the laboratory language, bringing together the theoretical and laboratory ideas.

You won’t find a quantum computer under your Christmas tree this year.

But Haskell + Walck will teach you the basics of quantum mechanics.
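The “laboratory language” idea is easy to taste without Haskell. Here is a Python sketch (mine, not Walck’s code) of the classic sequential spin-measurement experiment: prepare spin-up along z, measure along x, then measure along z again. The intervening x measurement scrambles the final z result, exactly the kind of phenomenon the theory has to explain.

```python
import numpy as np

rng = np.random.default_rng(0)

# Basis states along z and along x for a single spin-1/2 particle.
up_z, down_z = np.array([1.0, 0.0]), np.array([0.0, 1.0])
up_x, down_x = np.array([1.0, 1.0]) / np.sqrt(2), np.array([1.0, -1.0]) / np.sqrt(2)

def measure(state, basis):
    """Projective measurement: pick an outcome with Born-rule probability,
    return (outcome index, collapsed state)."""
    probs = [abs(b @ state) ** 2 for b in basis]
    k = rng.choice(len(basis), p=probs)
    return k, basis[k]

# Repeat the experiment many times: z-up in, x measurement, z measurement.
results = []
for _ in range(10000):
    state = up_z
    _, state = measure(state, [up_x, down_x])   # x measurement disturbs the state
    z_outcome, _ = measure(state, [up_z, down_z])
    results.append(z_outcome)

# Without the x measurement every run would give z-up; with it, about half do.
print("fraction z-up after an intervening x measurement:",
      sum(r == 0 for r in results) / len(results))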

You may also want to read:

Structure and Interpretation of Quantum Mechanics – a Functional Framework (2003) by Jerzy Karczmarczuk.

You will have to search for it but “Gerald Jay Sussman & Jack Wisdom (2013): Functional Differential Geometry. The MIT Press.” is out on the net somewhere.

Very tough sledding but this snippet from the preface may tempt you into buying a copy:


But the single biggest difference between our treatment and others is that we integrate computer programming into our explanations. By programming a computer to interpret our formulas we soon learn whether or not a formula is correct. If a formula is not clear, it will not be interpretable. If it is wrong, we will get a wrong answer. In either case we are led to improve our program and as a result improve our understanding. We have been teaching advanced classical mechanics at MIT for many years using this strategy. We use precise functional notation and we have students program in a functional language. The students enjoy this approach and we have learned a lot ourselves. It is the experience of writing software for expressing the mathematical content and the insights that we gain from doing it that we feel is revolutionary. We want others to have a similar experience.

If that interests you, check out courses by Sussman at MITOpenCourseware.

Enjoy!

August 3, 2016

QML: A Functional Quantum Programming Language

Filed under: Functional Programming,Quantum — Patrick Durusau @ 9:29 am

QML: A Functional Quantum Programming Language

From the post:

QML is a functional language for quantum computations on finite types. The language introduces quantum data and quantum control structures, and integrates reversible and irreversible quantum computation. QML is based on strict linear logic, hence weakenings, which may lead to decoherence, have to be explicit.

The design of QML is guided by its categorical semantics: QML programs are interpreted as morphisms in the category FQC of Finite Quantum Computations. This provides a constructive semantics of irreversible quantum computations realisable as quantum gates. The relationships between the category FQC and its classical reversible counterpart, FCC (Finite Classical Computations), are also explored.

The operational semantics of QML programs is presented using standard quantum circuits, while a denotational semantics is given using superoperators.

This research has been supported by the EPSRC, via the MathFIT initiative, grant number GR/S30818/01. We are also involved in the EPSRC research network on the Semantics of Quantum Computation (QNET).
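The remark that weakenings “may lead to decoherence” has a concrete numerical face: discarding (tracing out) one half of an entangled pair is a superoperator that leaves the other half in a mixed state. A sketch in Python/NumPy rather than QML, using the Bell state as the example:

```python
import numpy as np

# Bell state (|00> + |11>)/sqrt(2) as a 4-component vector (basis 00, 01, 10, 11).
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho = np.outer(bell, bell.conj())            # density matrix of the pair

def partial_trace_second(rho):
    """Trace out the second qubit of a two-qubit density matrix."""
    rho = rho.reshape(2, 2, 2, 2)            # indices: (a, b, a', b')
    return np.einsum("abcb->ac", rho)        # sum over the discarded qubit

reduced = partial_trace_second(rho)
print(reduced)
# [[0.5 0. ]
#  [0.  0.5]]  -- the maximally mixed state: the discarded half carried
#                 away all the phase information, i.e. decoherence.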

Having closely read Commercial National Security Algorithm Suite and Quantum Computing FAQ from the NSA, or its more popular summary, NSA Warns of the Dangers of Quantum Computing by Todd Jaquith, I know you are following every substantive publication on quantum computing.

By “substantive publication” I mean publications that have the potential to offer some insight into the development or use of quantum computers. The publications listed here qualify as “substantive” by that criterion.

With regard to the “dangers” of quantum computing, I see two choices:

  1. Reliance on government agencies who “promise” to obey the law in the future (who have broken laws in the past), or
  2. Obtain the advantages of quantum computing before such government agencies. (Or master their use more quickly.)

Unless you view “freedom” as being at the sufferance of government, may I suggest pursuit of #2 as much as interest and resources permit?

April 28, 2016

Quantum Shannon Theory (Review Request)

Filed under: Information Theory,Quantum,Shannon — Patrick Durusau @ 2:51 pm

Quantum Shannon Theory by John Preskill.

Abstract:

This is the 10th and final chapter of my book on Quantum Information, based on the course I have been teaching at Caltech since 1997. An early version of this chapter (originally Chapter 5) has been available on the course website since 1998, but this version is substantially revised and expanded. The level of detail is uneven, as I’ve aimed to provide a gentle introduction, but I’ve also tried to avoid statements that are incorrect or obscure. Generally speaking, I chose to include topics that are both useful to know and relatively easy to explain; I had to leave out a lot of good stuff, but on the other hand the chapter is already quite long. This is a working draft of Chapter 10, which I will continue to update. See the URL on the title page for further updates and drafts of other chapters, and please send me an email if you notice errors. Eventually, the complete book will be published by Cambridge University Press.

Preskill tweeted requesting reviews of and comments on this 112-page “chapter” from Quantum Information (forthcoming; appropriately, no projected date).

Be forewarned that Preskill compresses classical information theory into 14 pages or so. 😉
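If you want a one-screen preview of how the classical and quantum quantities line up, compare the Shannon entropy of a probability distribution with the von Neumann entropy of a density matrix. The formulas are standard; the example state below is my own:

```python
import numpy as np

def shannon_entropy(p):
    """H(p) = -sum p_i log2 p_i, ignoring zero entries."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    return shannon_entropy(evals)

# A classical coin with a 70/30 bias...
print(shannon_entropy([0.7, 0.3]))            # about 0.881 bits

# ...and a qubit density matrix with the same eigenvalues, written in a
# rotated basis: same entropy, since only the eigenvalues matter.
U = np.array([[np.cos(0.4), -np.sin(0.4)],
              [np.sin(0.4),  np.cos(0.4)]])
rho = U @ np.diag([0.7, 0.3]) @ U.T
print(von_neumann_entropy(rho))               # about 0.881 bits again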

You can find more chapters at: Quantum Computation.

Previous problem sets with solutions are also available.

Quantum computing is coming. Are you going to be the first quantum hacker?

Enjoy!

February 2, 2016

Microsoft Quantum Challenge [Deadline April 29, 2016.]

Filed under: Challenges,Contest,Quantum — Patrick Durusau @ 2:22 pm

Microsoft Quantum Challenge

From the webpage:

Join students from around the world to investigate and solve problems facing the quantum universe using Microsoft’s simulator, LIQUi|>.

Win big prizes, or the opportunity to interview for internships at Microsoft Research.

Objectives of the Quantum Challenge

The Quantum Architectures and Computing Group (QuArC) is seeking exceptional students!

We want to find students who are eager to expand their knowledge of quantum computing, and who can translate thoughts into programs. Thereby we will expand the use of Microsoft’s Quantum Simulator LIQUi|>.

How to enter

First of all, REGISTER for the Challenge so that you can receive updates about the contest.

In the challenge you will use the LIQUi|> simulator to solve a novel problem and then report on your findings. So, think of a project. Then, download the simulator from GitHub and work with it to solve your problem. Finally, write a report about your findings and submit it. Your report submission will enter you into the Challenge.

In the report, present a description of the project including goals, methods, challenges, and any result obtained using LIQUi|>. You do not need to submit circuits and the software you develop, however, sample input and output for LIQUi|> must be submitted to show you used the simulator in the project. Your entry must consist of six pages or less, in PDF format.

The Challenge is open to students at colleges and universities world-wide (with a few restrictions) and aged 18+. NO PURCHASE NECESSARY. For full details, see the Official Rules

The prizes

 The Quantum Challenge is your chance to win a big prize!

  • First Prize:  $5,000
  • Second Prizes:   Four at $2,500
  • Honorary Mention: Certificates will be presented to runner-up entries

Extra – visits or internship interviews

As a result of the challenge, some entrants could be invited to visit the QuArC team at Microsoft Research in Redmond, or have an opportunity to interview for internships at Microsoft Research. Internships are highly prestigious and involve working with the QuArC team for 12 weeks on cutting edge research.

If you are young enough to enter, just a word of warning about the “big prize.” $5,000 today isn’t a “big prize.” Maybe a nice weekend if you keep it low-key, but only just.

Interaction with the QuArC team, either by winning or in online discussions, is the real prize.

Besides, who needs $5,000 if you can break quantum-encrypted bank transfer orders? 😉

November 25, 2015

Quantum Walks with Gremlin [Graph Day, Austin]

Filed under: Graphs,Gremlin,Physics,Quantum — Patrick Durusau @ 8:25 pm

Quantum Walks with Gremlin by Marko A. Rodriguez, Jennifer H. Watkins.

Abstract:

A quantum walk places a traverser into a superposition of both graph location and traversal “spin.” The walk is defined by an initial condition, an evolution determined by a unitary coin/shift-operator, and a measurement based on the sampling of the probability distribution generated from the quantum wavefunction. Simple quantum walks are studied analytically, but for large graph structures with complex topologies, numerical solutions are typically required. For the quantum theorist, the Gremlin graph traversal machine and language can be used for the numerical analysis of quantum walks on such structures. Additionally, for the graph theorist, the adoption of quantum walk principles can transform what are currently side-effect laden traversals into pure, stateless functional flows. This is true even when the constraints of quantum mechanics are not fully respected (e.g. reversible and unitary evolution). In sum, Gremlin allows both types of theorist to leverage each other’s constructs for the advancement of their respective disciplines.
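To make the coin/shift construction concrete before you dive in, here is a minimal discrete-time quantum walk on a cycle in plain NumPy (not Gremlin). The Hadamard coin and symmetric initial state are standard choices; the sizes are mine:

```python
import numpy as np

N, STEPS = 21, 10            # nodes on a cycle, number of walk steps

# State lives in (position) x (coin) space; coin 0 = move left, coin 1 = move right.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

def step(psi):
    """One quantum-walk step: apply the coin, then the conditional shift."""
    psi = psi @ H.T                            # coin toss on every position amplitude
    shifted = np.zeros_like(psi)
    shifted[:, 0] = np.roll(psi[:, 0], -1)     # coin-0 amplitudes move left
    shifted[:, 1] = np.roll(psi[:, 1], +1)     # coin-1 amplitudes move right
    return shifted

# Start at the middle node with coin state (|0> + i|1>)/sqrt(2) for a symmetric walk.
psi = np.zeros((N, 2), dtype=complex)
psi[N // 2] = np.array([1.0, 1.0j]) / np.sqrt(2)

for _ in range(STEPS):
    psi = step(psi)

# Measurement: probability of finding the traverser at each node.
probs = np.sum(np.abs(psi) ** 2, axis=1)
print(np.round(probs, 3))    # spreads ballistically, unlike a classical random walk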

Best not to tackle this new paper on Gremlin and quantum graph walks after a heavy meal. 😉

Marko will be presenting at Graph Day, 17 January 2016, Austin, Texas. Great opportunity to hear him speak along with other cutting edge graph folks.

The walk Marko describes is located in a Hilbert space. Understandable because numerical solutions require the use of a metric space.

However, if you are modeling semantics in different universes of discourse, realize that semantics don’t possess metric spaces. Semantics lie outside of metric space, although I concede that many have imposed varying arbitrary metrics on semantics.

For example, if I am mapping the English color term “black” to the German term “schwarz,” I need a “traverser” that enables the existence of both terms at separate locations, one for each universe in the graph.

You may protest that this is overly complex for the representation of synonyms, but consider that “schwarz” occupies a different location in the universe of German language and etymology than “black” does.

For advertising, subtleties of language may not be useful, but for reading medical or technical works, an “approximate” or “almost right” meaning may be more damaging than helpful.

Who knows? Perhaps quantum computers will come closer to modeling semantics across domains than any computer to date. Not perfectly but closer.

November 13, 2015

LIQUi|> – A Quantum Computing Simulator

Filed under: Computer Science,Physics,Quantum — Patrick Durusau @ 8:23 pm

With quantum computing simulator, Microsoft offers a sneak peek into future of computing by Allison Linn.

From the post:


Next week, at the SuperComputing 2015 conference in Austin, Texas, Dave Wecker, a lead architect on the QuArC team, will discuss the recent public release on GitHub of a suite of tools that allows computer scientists to simulate a quantum computer’s capabilities. That’s a crucial step in building the tools needed to run actual quantum computers.

“This is the closest we can get to running a quantum computer without having one,” said Wecker, who has helped develop the software.

The software is called Language-Integrated Quantum Operations, or LIQUi|>. The funky characters at the end refer to how a quantum operation is written in mathematical terms.

The researchers are hoping that, using LIQUi|>, computer scientists at Microsoft and other academic and research institutions will be able to perfect the algorithms they need to efficiently use a quantum computer even as the computers themselves are simultaneously being developed.

“We can actually debut algorithms in advance of running them on the computer,” Svore said.

As of today, November 13, 2015, LIQUi|> has only one (1) hit at GitHub. Will try back next week to see what the numbers look like then.

You won’t have a quantum computer by the holidays but you may have created your first quantum algorithm by then.

Enjoy!

April 24, 2015

Mathematicians Reduce Big Data Using Ideas from Quantum Theory

Filed under: Data Reduction,Mathematics,Physics,Quantum — Patrick Durusau @ 8:20 pm

Mathematicians Reduce Big Data Using Ideas from Quantum Theory by M. De Domenico, V. Nicosia, A. Arenas, V. Latora.

From the post:

A new technique of visualizing the complicated relationships between anything from Facebook users to proteins in a cell provides a simpler and cheaper method of making sense of large volumes of data.

Analyzing the large volumes of data gathered by modern businesses and public services is problematic. Traditionally, relationships between the different parts of a network have been represented as simple links, regardless of how many ways they can actually interact, potentially losing precious information. Only recently a more general framework has been proposed to represent social, technological and biological systems as multilayer networks, piles of ‘layers’ with each one representing a different type of interaction. This approach allows a more comprehensive description of different real-world systems, from transportation networks to societies, but has the drawback of requiring more complex techniques for data analysis and representation.

A new method, developed by mathematicians at Queen Mary University of London (QMUL), and researchers at Universitat Rovira i Virgili in Tarragona (Spain), borrows from quantum mechanics’ well-tested techniques for understanding the difference between two quantum states, and applies them to understanding which relationships in a system are similar enough to be considered redundant. This can drastically reduce the amount of information that has to be displayed and analyzed separately and make it easier to understand.

The new method also reduces computing power needed to process large amounts of multidimensional relational data by providing a simple technique of cutting down redundant layers of information, reducing the amount of data to be processed.

The researchers applied their method to several large publicly available data sets about the genetic interactions in a variety of animals, a terrorist network, scientific collaboration systems, worldwide food import-export networks, continental airline networks and the London Underground. It could also be used by businesses trying to more readily understand the interactions between their different locations or departments, by policymakers understanding how citizens use services or anywhere that there are large numbers of different interactions between things.

You can hop over to Nature, Structural reducibility of multilayer networks, where if you don’t have an institutional subscription:

ReadCube: $4.99 to rent, $9.99 to buy, or $32.00 to purchase a PDF.

Let me save you some money and suggest you look at:

Layer aggregation and reducibility of multilayer interconnected networks

Abstract:

Many complex systems can be represented as networks composed by distinct layers, interacting and depending on each other. For example, in biology, a good description of the full protein-protein interactome requires, for some organisms, up to seven distinct network layers, with thousands of protein-protein interactions each. A fundamental open question is then how much information is really necessary to accurately represent the structure of a multilayer complex system, and if and when some of the layers can indeed be aggregated. Here we introduce a method, based on information theory, to reduce the number of layers in multilayer networks, while minimizing information loss. We validate our approach on a set of synthetic benchmarks, and prove its applicability to an extended data set of protein-genetic interactions, showing cases where a strong reduction is possible and cases where it is not. Using this method we can describe complex systems with an optimal trade-off between accuracy and complexity.

Both articles have four (4) illustrations. Same four (4) authors. The difference being that the second one is at http://arxiv.org. Oh, and it is free to download.

I remain concerned by the focus on reducing the complexity of data to fit current algorithms and processing models. That said, there is no denying that such reduction methods have proven to be useful.

The authors neatly summarize my concerns with this outline of their procedure:

The whole procedure proposed here is sketched in Fig. 1 and can be summarised as follows: i) compute the quantum Jensen-Shannon distance matrix between all pairs of layers; ii) perform hierarchical clustering of layers using such a distance matrix and use the relative change of Von Neumann entropy as the quality function for the resulting partition; iii) finally, choose the partition which maximises the relative information gain.

With my corresponding concerns:

i) The quantum Jensen-Shannon distance matrix presumes a metric distance for its operations, which may or may not reflect the semantics of the layers (other than by simplifying assumption).

ii) The relative change of Von Neumann entropy is a difference measurement based upon an assumed metric, which may or may not represent the underlying semantics of the relationships between layers.

iii) The process concludes by maximizing a difference measurement based upon a metric that has been assigned to the different layers.

Maximizing a difference, based on an entropy calculation, which is itself based on an assigned metric, doesn’t fill me with confidence.

I don’t doubt that the technique “works,” but doesn’t that depend upon what you think is being measured?
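For concreteness, step (i) of the procedure quoted above boils down to a few lines once each layer is encoded as a density matrix. A hedged sketch: I normalize the graph Laplacian by its trace, a common convention that may differ in detail from the paper’s, and the two toy layers are made up:

```python
import numpy as np

def layer_density_matrix(adj):
    """Encode a graph layer as a density matrix: Laplacian scaled to unit trace.
    (One common convention; the paper's exact normalization may differ.)"""
    deg = np.diag(adj.sum(axis=1))
    lap = deg - adj
    return lap / np.trace(lap)

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-(evals * np.log2(evals)).sum())

def quantum_js_distance(rho, sigma):
    """Quantum Jensen-Shannon distance between two layers:
    sqrt( S((rho+sigma)/2) - (S(rho) + S(sigma)) / 2 )."""
    mix = (rho + sigma) / 2
    div = von_neumann_entropy(mix) - 0.5 * (von_neumann_entropy(rho) +
                                            von_neumann_entropy(sigma))
    return float(np.sqrt(max(div, 0.0)))

# Two toy layers on the same four nodes: a path and a star.
path = np.array([[0, 1, 0, 0],
                 [1, 0, 1, 0],
                 [0, 1, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
star = np.array([[0, 1, 1, 1],
                 [1, 0, 0, 0],
                 [1, 0, 0, 0],
                 [1, 0, 0, 0]], dtype=float)

d = quantum_js_distance(layer_density_matrix(path), layer_density_matrix(star))
print("quantum JS distance between layers:", round(d, 3))
```

The hierarchical clustering in step (ii) would then run over the matrix of these pairwise distances.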

A question for the weekend: Do you think this is similar to the questions about dividing continuous variables into discrete quantities?

November 6, 2014

Deeper Than Quantum Mechanics—David Deutsch’s New Theory of Reality

Filed under: Information Theory,Philosophy,Quantum — Patrick Durusau @ 8:07 pm

Deeper Than Quantum Mechanics—David Deutsch’s New Theory of Reality

From the post:


Their new idea is called constructor theory and it is both simpler and deeper than quantum mechanics, or indeed any other laws of physics. In fact, Deutsch claims that constructor theory forms a kind of bedrock of reality from which all the laws of physics emerge.

Constructor theory is a radically different way of thinking about the universe that Deutsch has been developing for some time. He points out that physicists currently ply their trade by explaining the world in terms of initial conditions and laws of motion. This leads to a distinction between what happens and what does not happen.

Constructor theory turns this approach on its head. Deutsch’s new fundamental principle is that all laws of physics are expressible entirely in terms of the physical transformations that are possible and those that are impossible.

In other words, the laws of physics do not tell you what is possible and impossible, they are the result of what is possible and impossible. So reasoning about the physical transformations that are possible and impossible leads to the laws of physics.

That’s why constructor theory is deeper than anything that has gone before it. In fact, Deutsch does not think about it as a law of physics but as a principle, or set of principles, that the laws of physics must obey.

If that sounds like heavy sledding, see arxiv.org/abs/1405.5563: Constructor Theory of Information.

Abstract:

We present a theory of information expressed solely in terms of which transformations of physical systems are possible and which are impossible – i.e. in constructor-theoretic terms. Although it includes conjectured laws of physics that are directly about information, independently of the details of particular physical instantiations, it does not regard information as an a priori mathematical or logical concept, but as something whose nature and properties are determined by the laws of physics alone. It does not suffer from the circularity at the foundations of existing information theory (namely that information and distinguishability are each defined in terms of the other). It explains the relationship between classical and quantum information, and reveals the single, constructor-theoretic property underlying the most distinctive phenomena associated with the latter, including the lack of in-principle distinguishability of some states, the impossibility of cloning, the existence of pairs of variables that cannot simultaneously have sharp values, the fact that measurement processes can be both deterministic and unpredictable, the irreducible perturbation caused by measurement, and entanglement (locally inaccessible information).

The paper runs thirty (30) pages so should give you a good workout before the weekend. 😉

I first saw this in a tweet by Steven Pinker.

October 11, 2014

Microsoft’s Quantum Mechanics

Filed under: Microsoft,Quantum,Semantics — Patrick Durusau @ 11:52 am

Microsoft’s Quantum Mechanics by Tom Simonite.

From the post:

In 2012, physicists in the Netherlands announced a discovery in particle physics that started chatter about a Nobel Prize. Inside a tiny rod of semiconductor crystal chilled cooler than outer space, they had caught the first glimpse of a strange particle called the Majorana fermion, finally confirming a prediction made in 1937. It was an advance seemingly unrelated to the challenges of selling office productivity software or competing with Amazon in cloud computing, but Craig Mundie, then heading Microsoft’s technology and research strategy, was delighted. The abstruse discovery—partly underwritten by Microsoft—was crucial to a project at the company aimed at making it possible to build immensely powerful computers that crunch data using quantum physics. “It was a pivotal moment,” says Mundie. “This research was guiding us toward a way of realizing one of these systems.”

Microsoft is now almost a decade into that project and has just begun to talk publicly about it. If it succeeds, the world could change dramatically. Since the physicist Richard Feynman first suggested the idea of a quantum computer in 1982, theorists have proved that such a machine could solve problems that would take the fastest conventional computers hundreds of millions of years or longer. Quantum computers might, for example, give researchers better tools to design novel medicines or super-efficient solar cells. They could revolutionize artificial intelligence.

Fairly upbeat review of current efforts to build a quantum computer.

You may want to offset it by reading Scott Aaronson’s blog, Shtetl-Optimized, which has the following header note:

If you take just one piece of information from this blog:
Quantum computers would not solve hard search problems
instantaneously by simply trying all the possible solutions at once. (emphasis added)

See in particular: Speaking Truth to Parallelism at Cornell

Whatever speedups are possible with quantum computers, getting a semantically incorrect answer faster isn’t an advantage.

Assumptions about faster computing platforms include an assumption of correct semantics. There have been no proofs of default correct handling of semantics by present day or future computing platforms.

I first saw this in a tweet by Peter Lee.

PS: I saw the reference to Scott Aaronson’s blog in a comment to Tom’s post.

May 30, 2014

Quantum Computing Playground

Filed under: Algorithms,Computation,Quantum — Patrick Durusau @ 9:03 am

Google’s “Quantum Computing Playground” lets you fiddle with quantum algorithms by Dario Borghino.

From the post:

Google has just launched a new web-based integrated development environment (IDE) that allows users to write, run and debug software that makes use of quantum algorithms. The tool could allow computer scientists to stay ahead of the game and get acquainted with the many quirks of quantum algorithms even before the first practical quantum computer is built.

Homepage: Quantum Computing Playground.

From the homepage:

We strongly recommend running Quantum Playground in Google Chrome.

I accessed the homepage using, gasp, another browser. 😉 Just happenstance.

The page doesn’t say anything about use of another browser leaving your computer out of phase but probably best not to take chances.

Enjoy!

June 22, 2013

The Quipper Language [Quantum Computing]

Filed under: Computation,Computer Science,Functional Programming,Programming,Quantum — Patrick Durusau @ 5:26 pm

The Quipper Language

From the webpage:

Quipper is an embedded, scalable functional programming language for quantum computing. It provides, among other things:

  • A high-level circuit description language. This includes gate-by-gate descriptions of circuit fragments, as well as powerful operators for assembling and manipulating circuits.
  • A monadic semantics, allowing for a mixture of procedural and declarative programming styles.
  • Built-in facilities for automatic synthesis of reversible quantum circuits, including from classical code.
  • Support for hierarchical circuits.
  • Extensible quantum data types.
  • Programmable circuit transformers.
  • Support for three execution phases: compile time, circuit generation time, and circuit execution time. A dynamic lifting operation to allow circuit generation to be parametric on values generated at circuit execution time.
  • Extensive libraries of quantum functions, including: libraries for quantum integer and fixed-point arithmetic; the Quantum Fourier transform; an efficient Qram implementation; libraries for simulation of pseudo-classical circuits, Stabilizer circuits, and arbitrary circuits; libraries for exact and approximate decomposition of circuits into specific gate sets.

The website has a Quipper tutorial, online documentation and a detailed look at the language itself.
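The “reversible circuits from classical code” bullet above has a simple core you can see without Quipper (which is itself embedded in Haskell): any classical function f can be wrapped in a reversible one by keeping the input and XOR-ing the output into an extra register, (x, y) -> (x, y XOR f(x)). A toy truth-table sketch in Python; the function is my own example:

```python
def f(x):
    """An ordinary, irreversible classical function on 2 bits (AND)."""
    return x[0] & x[1]

def reversible_f(x, y):
    """Standard reversible embedding: keep x, XOR f(x) into the ancilla y."""
    return x, y ^ f(x)

# Running it twice undoes it, which is what "reversible" buys you.
for a in (0, 1):
    for b in (0, 1):
        x, y = reversible_f((a, b), 0)
        x2, y2 = reversible_f(x, y)
        print(f"in=({a},{b},0)  out=({x[0]},{x[1]},{y})  run-twice=({x2[0]},{x2[1]},{y2})")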

No link for a quantum computer but that isn’t very far off.

Learn Quipper now and perhaps you can lead the first Apache project to develop open source software for a quantum computer.

I first saw this in a tweet by Jose A. Alonso.

June 13, 2012

Faster Ranking As A Goal?

Filed under: PageRank,Quantum — Patrick Durusau @ 1:25 pm

When I read in Quantum Computers Could Help Search Engines Keep Up With the Internet’s Growth:

Most people don’t think twice about how Internet search engines work. You type in a word or phrase, hit enter, and poof — a list of web pages pops up, organized by relevance.

Behind the scenes, a lot of math goes into figuring out exactly what qualifies as most relevant web page for your search. Google, for example, uses a page ranking algorithm that is rumored to be the largest numerical calculation carried out anywhere in the world. With the web constantly expanding, researchers at USC have proposed — and demonstrated the feasibility — of using quantum computers to speed up that process.

“This work is about trying to speed up the way we search on the web,” said Daniel Lidar, corresponding author of a paper on the research that appeared in the journal Physical Review Letters on June 4.

As the Internet continues to grow, the time and resources needed to run the calculation — which is done daily — grow with it, Lidar said.

I thought of my post earlier today about inexact computing and how our semantics are inexact. (On the value of being inexact)

Is it the case that quantum computing is going to help us be more exact more quickly?

I am not sure what the advantage of being wrong more quickly could be. Do you?


The full reference:

Silvano Garnerone, Paolo Zanardi, Daniel Lidar. Adiabatic Quantum Algorithm for Search Engine Ranking. Physical Review Letters, 2012; 108 (23) DOI: 10.1103/PhysRevLett.108.230506

Chance discovery of an interesting journal feature:

Abstract:

We propose an adiabatic quantum algorithm for generating a quantum pure state encoding of the PageRank vector, the most widely used tool in ranking the relative importance of internet pages. We present extensive numerical simulations which provide evidence that this algorithm can prepare the quantum PageRank state in a time which, on average, scales polylogarithmically in the number of web pages. We argue that the main topological feature of the underlying web graph allowing for such a scaling is the out-degree distribution. The top-ranked log(n) entries of the quantum PageRank state can then be estimated with a polynomial quantum speed-up. Moreover, the quantum PageRank state can be used in “q-sampling” protocols for testing properties of distributions, which require exponentially fewer measurements than all classical schemes designed for the same task. This can be used to decide whether to run a classical update of the PageRank.

Physics Synopsis:

Although quantum computing has only been demonstrated for small calculations so far, researchers are interested in finding problems where its potentially massive parallelism would pay off if scaled-up versions can be made. In Physical Review Letters, Silvano Garnerone of the Institute for Quantum Computing at the University of Waterloo, Canada, and colleagues simulate the speedup achieved by using a quantum approach to rank websites.

The PageRank method, implemented by Google, assigns each website a score based on how many other sites link to it and what their scores are. Starting with an enormous matrix that represents which sites link to which others, the algorithm evaluates the probability that a steady stream of surfers starting at random sites and following random links will be found at each site. This information helps determine which search results should be listed highest. The PageRank calculation currently requires a time that is roughly proportional to the number of sites. This slowdown with size is not as bad as for many complex problems, but it can still take many days to rank the entire worldwide web.

Garnerone and colleagues propose an approach to page ranking that uses an “adiabatic quantum algorithm,” in which a simple matrix with a known solution is gradually transformed into the real problem, producing the desired solution. They simulated many relatively small networks that had similar link topology to the worldwide web, and found that reconstructing and reading out the most relevant part of the PageRank required a time that grows more slowly than the best classical algorithms available. – Don Monroe

That looks like a really cool feature to me.

Abstract for the initiated. Synopsis for those who may be interested.

Are there IR/KD/etc. journals following that model?

Seems like a good way to create “trading zones” where we will become aware of work in other areas.
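As a reminder of what the quantum algorithm is competing against: classical PageRank is, at heart, a power iteration over the link matrix described in the synopsis. A minimal sketch; the damping factor 0.85 is the conventional choice and the toy link structure is mine:

```python
import numpy as np

def pagerank(links, damping=0.85, tol=1e-10):
    """Power iteration for PageRank. links[i] lists the pages that page i links to."""
    n = len(links)
    # Column-stochastic transition matrix: a random surfer follows outgoing links.
    M = np.zeros((n, n))
    for i, outs in enumerate(links):
        if outs:
            for j in outs:
                M[j, i] = 1.0 / len(outs)
        else:
            M[:, i] = 1.0 / n          # dangling page: jump anywhere
    rank = np.full(n, 1.0 / n)
    while True:
        new = damping * (M @ rank) + (1 - damping) / n
        if np.abs(new - rank).sum() < tol:
            return new
        rank = new

# Toy web: page 0 and 1 link to each other, everyone links to page 0.
links = [[1], [0], [0, 1], [0]]
print(np.round(pagerank(links), 3))
```

The quantum algorithm in the paper aims to prepare a state encoding this same vector, rather than computing it entry by entry.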

June 5, 2012

Geometric and Quantum Methods for Information Retrieval

Filed under: Geometry,Information Retrieval,Quantum — Patrick Durusau @ 7:55 pm

Geometric and Quantum Methods for Information Retrieval by Yaoyong Li and Hamish Cunningham.

Abstract:

This paper reviews the recent developments in applying geometric and quantum mechanics methods for information retrieval and natural language processing. It discusses the interesting analogies between components of information retrieval and quantum mechanics. It then describes some quantum mechanics phenomena found in the conventional data analysis and in the psychological experiments for word association. It also presents the applications of the concepts and methods in quantum mechanics such as quantum logic and tensor product to document retrieval and meaning of composite words, respectively. The purpose of the paper is to give the state of the art on and to draw attention of the IR community to the geometric and quantum methods and their potential applications in IR and NLP.

More complex models can (may?) lead to better IR methods, but:

Moreover, as Hilbert space is the mathematical foundation for quantum mechanics (QM), basing IR on Hilbert space creates an analogy between IR and QM and may usefully bring some concepts and methods from QM into IR. (p.24)

is a dubious claim at best.

The “analogy” between QM and IR makes the point:

QM                                     | IR
a quantum system                       | a collection of objects for retrieval
complex Hilbert space                  | information space
state vector                           | object in the collection
observable                             | query
measurement                            | search
eigenvalues                            | relevant or not, for one object
probability of getting one eigenvalue  | relevance degree of object to query

The authors are comparing apples and oranges. For example, “complex Hilbert space” and “information space.”

A “complex Hilbert space” is a model that has been found useful with another model, one called quantum mechanics.

An “information space,” on the other hand, encompasses models known to use “complex Hilbert spaces” and more. What it encompasses depends on the information space of interest.

Or the notion of “observable” being paired with “query.”

Complex Hilbert spaces may be quite useful for IR, but tying IR to quantum mechanics isn’t required to make use of it.
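To underline that last point, here is a toy retrieval sketch that uses nothing beyond ordinary linear algebra: documents and a query as unit vectors, relevance scored as the squared inner product, the “probability of getting one eigenvalue” row of the table above. The term counts are made up; no quantum mechanics required.

```python
import numpy as np

# Made-up term-count vectors over the vocabulary [black, schwarz, color, network].
docs = {
    "doc_colors":   np.array([3.0, 1.0, 4.0, 0.0]),
    "doc_networks": np.array([0.0, 0.0, 1.0, 5.0]),
}
query = np.array([1.0, 1.0, 0.0, 0.0])   # looking for "black"/"schwarz"

def unit(v):
    return v / np.linalg.norm(v)

# "Measurement": relevance as the squared inner product of unit vectors,
# which plays the role of an outcome probability in the QM analogy.
for name, d in docs.items():
    score = float(unit(query) @ unit(d)) ** 2
    print(name, round(score, 3))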
