Quantum computing NNs

As many who have been following our blog know, AI, machine learning (ML) and deep learning (DL) have become much more mainstream (e.g., see our Learning machine learning – part 3, Industrial revolution deep learning & NVIDIA’s 3U supercomputer, and AI reaches a crossroads posts). The AI world has anointed DL as the best approach for pattern recognition, classification, and prediction, though it has applicability beyond that.

One problem with DL has been its energy costs. There have been some approaches to address this, but none have been entirely successful just yet (e.g., see our Intel new DL Boost, New GraphCore GC2 chips, and AI processing at the edge posts). At one time neuromorphic hardware was the answer, but I’ve become disillusioned with that technology over time (see our Are neuromorphic chips a dead end post).

This past week we learned of a whole new approach, something called a Quantum Convolutional NN or QCNN (see the PhysOrg Introducing QCNN article, the pre-print of Quantum CNNs, a presentation deck on QCNNs, and the Nature QCNN paper [paywall]).

Some of you may not know that convolutional neural networks (ConvNets) are the latest in a long line of DL architectures focused on recognizing patterns in and classifying data. DL ConvNets can be used to recognize speech, classify photo segments, analyze ticker tapes, etc.

But why quantum computing?

First off, quantum computing (QC) is a new, leading-edge technology targeted at solving very hard problems, like the NP-complete traveling salesperson problem (see the List of NP-complete problems, Wikipedia), assembling an optimum Bitcoin block, and cracking public key encryption keys (which rests on integer factoring, a problem believed hard though not known to be NP-complete).

QC utilizes quantum mechanical properties of the universe to attack these problems without resorting to brute-force search, such as going down every path in the traveling salesman problem (see our QC programming and QC at our doorsteps posts).

At the moment, IBM, Google, Intel and others are all working on QC and trying to scale it up by increasing the number of qubits (quantum bits) their systems support. The more qubits, the more quantum state you can hold, and the more sophisticated hard problems one can solve. Current qubit counts include 72 qubits for Google, 49 for Intel, and 50 for IBM. Apparently not all qubits are alike, and their state doesn’t last very long, with coherence times around ~100 microseconds (see Timeline of QC, Wikipedia).

What’s a QCNN?

What’s new is the use of quantum computing circuits to create ConvNets. Essentially the researchers have created a way to apply AI DL (ConvNet) techniques to quantum computing data (qubits).

Apparently there are QC (qubit) phases that need to be recognized, and what better way to do that than a DL ConvNet. The only problem is that performing DL on QC data with today’s tools would require measuring the qubits’ state (itself a pattern recognition problem), converting the results to digital data, and then processing them via CPU/GPU DL ConvNets; a classic chicken-or-egg problem. But with QCNNs, one has a DL ConvNet implemented entirely in QC.

DL ConvNets are typically optimized for a specific problem, varying layer counts, nodes per layer, node connectivity, etc. QCNNs match this and also come in various sizes. Above is a QCNN circuit, optimized to recognize a symmetry-protected topological (SPT) phase of a set of qubits (see the pre-print article).

I won’t go into the QC technology used in any detail (as I barely understand it), but the researchers have come up with a way to map DL ConvNets into QC circuitry. Assuming this all works, one can then use QC to perform DL pattern recognition on qubit data.
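To make the layering idea concrete, here’s a toy sketch of a QCNN-style circuit in Qiskit (the Python SDK behind the QIS Kit). To be clear, this is my illustration, not the researchers’ circuit: the gate choices, the shared parameter, and the conv_layer/pool_layer names are all assumptions for demonstration purposes.

```python
# A toy QCNN-style circuit sketch in Qiskit (assumed installed via pip).
# Convolution = identical parameterized two-qubit blocks on neighbor pairs;
# pooling = entangle each pair, then carry only one qubit of each pair forward.
# Gate choices here are illustrative stand-ins, not the published QCNN circuit.
from qiskit import QuantumCircuit
from qiskit.circuit import Parameter

def conv_layer(qc, qubits, theta):
    # same two-qubit unitary applied across neighboring pairs (weight sharing)
    for a, b in zip(qubits[0::2], qubits[1::2]):
        qc.rzz(theta, a, b)
        qc.ry(theta, a)
        qc.ry(theta, b)

def pool_layer(qc, qubits):
    # push each pair's information onto one qubit, halving the "feature map"
    kept = []
    for a, b in zip(qubits[0::2], qubits[1::2]):
        qc.cx(a, b)
        kept.append(b)
    return kept

theta = Parameter("theta")
qc = QuantumCircuit(8, 1)
qubits = list(range(8))
while len(qubits) > 1:          # alternate convolution and pooling layers
    conv_layer(qc, qubits, theta)
    qubits = pool_layer(qc, qubits)
qc.measure(qubits[0], 0)        # the surviving qubit's readout is the classification
```

In a real QCNN the parameters would be trained (varied to minimize a classification loss), just as weights are in a digital ConvNet.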

~~~~

Comments?

Quantum computer programming

I was on a vendor call last week where they discussed their recent technological advances in quantum computing. During the discussion they mentioned a number of ways to code for quantum computers. The most popular one currently is based on the QIS (Quantum Information Science) Kit.

I went looking for a principles of operation manual for quantum computers, something akin to the System/360 Principles of Operation manual that explained how to code for an IBM 360 computer. But there was no such manual.

Instead there is a paper on the Open Quantum Assembly Language (QASM) that describes the quantum computational environment and coding language used in the QIS Kit.

It appears that quantum computers can be considered special computational co-processor engines, operated in parallel with normal digital computation. (Today, that co-processor can also be provided by a quantum simulator.)

QASM coding

One programs a quantum computer by creating a digital program that describes a quantum circuit, which uses qubits and quantum registers to perform some algorithm. The quantum circuit can then be measured to provide a result, which more digital code can interpret and potentially use to create other quantum circuits, in a sort of loop.

There are four phases during the processing of a QIS Kit quantum algorithm (a code sketch follows the list).

  1. QASM compilation, which occurs solely on a digital computer. QASM source code describing the quantum circuit, together with compile-time parameters, is translated into a quantum PLUS digital intermediate representation.
  2. Circuit generation, which also occurs on a digital computer with access to the quantum co-processor. The intermediate representation compiled above is combined with other parameters (available from the quantum computer environment), and together these are translated into specific quantum building blocks (circuits) plus some classical digital code needed during quantum circuit execution.
  3. Execution, which takes place solely on the quantum computer. The system takes as input the collection of quantum circuits defined above and runtime control parameters, and transforms these, using a high-level quantum computer controller, into low-level, real-time instructions that build the quantum circuits. These are then executed, and the results of the quantum circuit(s) execution create a result stream (measurements) that can be passed back to the digital program for further processing.
  4. Post-processing, which takes place on a digital computer and uses the results from the quantum circuit(s) execution, plus other intermediate results, to either generate follow-on quantum circuits or output a final result for the quantum algorithm.
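Here’s roughly what those four phases look like in QIS Kit (Qiskit, Python) terms. This is a minimal sketch with a local simulator standing in for the quantum co-processor; the backend name and shot count are my assumptions, and the exact API names have shifted across Qiskit versions.

```python
# Sketch of the four phases using Qiskit (Python); the simulator backend and
# shot count are illustrative assumptions, not part of the QASM paper.
from qiskit import QuantumCircuit, transpile
from qiskit import BasicAer  # local simulator standing in for the co-processor

# Phases 1/2: describe the quantum circuit digitally and compile/generate it
qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure([0, 1], [0, 1])
backend = BasicAer.get_backend("qasm_simulator")
compiled = transpile(qc, backend)  # circuit generation for the target device

# Phase 3: execute on the (simulated) quantum co-processor
job = backend.run(compiled, shots=1024)

# Phase 4: post-process the measurement stream on the digital side
counts = job.result().get_counts()
print(counts)  # e.g. roughly {'00': 512, '11': 512} for this Bell-state circuit
```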

As qubit coherence only lasts a short while, results from one execution of a quantum circuit cannot be passed directly to another execution of quantum circuits. Thus these results have to be passed through some digital computation before they can be used in subsequent quantum circuits.

Quantum circuits don’t offer any branching as such.

Quantum circuits

The only storage for QASM is classical (digital) registers (creg) and quantum registers (qreg), which are arrays of bits and qubits respectively.

There are a limited number of built-in quantum operations that can be performed on qregs and qubits. One described in the QASM paper noted above is the CNOT operation: CNOT a,b will flip a qubit in b iff the corresponding qubit in a is on.

Quantum circuits are made up of one or more gates. Gates are invoked with a set of variable parameter names and quantum arguments (qargs). QASM gates can be construed as macros that are expanded at runtime. Gates are essentially lists of unitary quantum subroutines (other gate invocations), built-in quantum functions, or barrier statements that are executed in sequence and operate on the quantum arguments (qargs) used in the gate invocation.
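To make the macro idea concrete, here’s an illustrative OpenQASM 2.0 snippet, loaded through Qiskit’s QASM reader. The flip_if gate name and the register layout are made up for this example; they are not from the QASM paper.

```python
# Illustrative OpenQASM 2.0 source showing a user-defined gate (a macro over
# its qargs) and the built-in CNOT, parsed with Qiskit's QASM reader.
# (QuantumCircuit.from_qasm_str is the older Qiskit API; newer releases
# offer qiskit.qasm2.loads instead.)
from qiskit import QuantumCircuit

qasm_src = """
OPENQASM 2.0;
include "qelib1.inc";
qreg a[2];
qreg b[2];
creg c[2];
// a gate is a named macro over its quantum arguments (qargs)
gate flip_if ctl, tgt { cx ctl, tgt; }
// flip b[0] iff a[0] is on -- the CNOT semantics described above
flip_if a[0], b[0];
measure b -> c;
"""
print(QuantumCircuit.from_qasm_str(qasm_src))
```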

Opaque gates are quantum gates whose circuits (code) have yet to be defined. A physical implementation of an opaque gate may be possible, but its definition has not been supplied. Essentially these operate as placeholders, to be defined in a subsequent circuit execution or perhaps created by the quantum circuit in real time depending on gate execution (not really sure how this would work).

In addition to built-in quantum operations, there are other statements like the measure and reset statements. The reset statement sets a qubit or a qreg’s qubits to 0. The measure statement copies the state of a qubit or qreg into a digital bit or creg (digital register).

There is one conditional command in QASM, the if statement. The if statement compares a creg, read as an integer, against an integer, and if they are equal executes one quantum operation. By stringing if statements together one can essentially construct a case statement in normal coding logic to execute quantum (circuit) blocks.

Quantum logic within a gate can be optimized during the compilation phase, so some operations may never be executed (e.g., if the same operation occurs twice in a gate, normally the second execution would be optimized out), unless a barrier statement is encountered, which prevents optimization across it.
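Here’s a small illustrative QASM snippet showing measure, reset, the if statement and a barrier together (again parsed via Qiskit’s older QASM reader; the circuit itself is a made-up example):

```python
# Illustrative OpenQASM 2.0 showing measure, reset, the single IF form, and a
# barrier; parsed with Qiskit's (older) from_qasm_str API.
from qiskit import QuantumCircuit

qasm_src = """
OPENQASM 2.0;
include "qelib1.inc";
qreg q[1];
creg c[1];
h q[0];
barrier q;            // optimization may not merge or cancel gates across this
measure q[0] -> c[0]; // copy the qubit's readout into the digital creg
if (c == 1) x q[0];   // run x only when the creg, read as an integer, equals 1
reset q[0];           // force the qubit back to the 0 state
"""
print(QuantumCircuit.from_qasm_str(qasm_src))
```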

Quantum computer cloud

In 2016, IBM started offering quantum computers in its Bluemix cloud through the IBM Quantum (Q) Experience. The IBM Q Experience currently allows researchers access to 5- and 16-qubit quantum computers.

There are three pools of quantum computers: one pool, called IBMQX5, consists of eight 16-qubit computers, and two pools of five 5-qubit computers each, IBMQX2 and IBMQX4. As I’m writing this, IBMQX5 and IBMQX2 are offline for maintenance, but IBMQX4 is active.

Google has recently released OpenFermion as open source, another software development kit for quantum computation (I will review this in another post). Although Google also seems to have quantum computers and has provided researchers access to them, I couldn’t find much documentation on them.

Two other companies are working on quantum computation: D-Wave Systems and Rigetti Computing. Rigetti has its Forest 1.0 quantum computing full-stack programming and execution environment, but I couldn’t easily find anything on D-Wave Systems’ programming environment.

Last month, IBM announced they have constructed a 50-qubit quantum computer prototype.

IBM has also released 20-Qubit quantum computers for customer use and plans to offer the new 50-Qubit computers to customers in the future.

Comments?

Picture Credit(s): Quantum Leap Supercomputer, IBM What is Quantum Computing website

QASM control flow, Open Quantum Assembly Language, by A. Cross, et al.

IBM’s newly revealed 50-Qubit Quantum Processor…, Softcares blog post

Quantum computing at our doorsteps

Read an article the other day in MIT’s Technology Review, Google’s new chip is a stepping stone to quantum computing…, about Google’s latest endeavor to create quantum computers. Although digital logic, or classical electronic computation, has been around since the middle of the last century, quantum logic does things differently, and there are many problems that are easy to compute with quantum computing but take much longer to solve with digital computing.

Qubits are weird

Classical or digital electronic computation follows the more physical, mechanistic view of the world (for the most part), while quantum computing follows the quantum mechanical view of the world. Quantum computing uses quantum bits, or qubits, and the device that Google demonstrated has a 2×3 matrix of qubits, 6 in total.

Unlike a bit, which (theoretically) is a two-state system that can only take on the values 0 and 1, a qubit is a two-level system that can take on infinitely many different states. In practice, with a qubit, there are always two states that are distinguishable from one another, but they can be any two of the infinitely many states it can take on.

Also, reading out the value of a qubit is a probabilistic endeavor, and the act of measurement changes the state of the qubit afterwards.
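In standard quantum notation (my addition, not the article’s), a qubit’s state and its readout rule look like this:

```latex
% A qubit's state is a unit vector in a 2-dimensional complex space:
\[
  \lvert\psi\rangle \;=\; \alpha\,\lvert 0\rangle + \beta\,\lvert 1\rangle,
  \qquad \lvert\alpha\rvert^{2} + \lvert\beta\rvert^{2} = 1 .
\]
% Measuring it returns 0 with probability |alpha|^2 and 1 with probability
% |beta|^2, and collapses the state onto the value that was read out.
```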

There’s more to quantum computing, and I am certainly no expert. So if you’re interested, I suggest starting with this arXiv article.

Faster quantum algorithms

In any case, some difficult and time-consuming arenas of classical computation seem to be easier and faster with quantum computation. For example (the complexities are summarized after this list):

  • Factoring large numbers – in classical computation this takes time super-polynomial in the number of bits B of the “large number” (the best known general algorithm, the number field sieve, is sub-exponential but still intractable for large B). But Shor’s quantum factorization algorithm takes only O(B**3) time, which is considerably faster for large numbers. This is important because RSA cryptography and most key exchange algorithms in use today base their security on the difficulty of factoring large numbers. (See the Wikipedia article on Integer Factorization for more information.)
  • Searching an unstructured list – in classical computation, for a list of N items, it takes O(N) time. But Grover’s quantum search algorithm takes only O(sqrt(N)), which is considerably faster for large lists. (See the arXiv paper for more information.)
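Summarizing those speedups (these are the standard textbook complexities, my addition, with B = bits in the number and N = items in the list):

```latex
% Rough time complexities; B = bits in the number, N = items in the list.
\begin{align*}
  \text{factoring (best known classical, number field sieve):}
    &\quad \exp\!\bigl(O\bigl(B^{1/3}(\log B)^{2/3}\bigr)\bigr) \\
  \text{factoring (Shor, quantum):} &\quad O(B^{3}) \\
  \text{unstructured search (classical):} &\quad O(N) \\
  \text{unstructured search (Grover, quantum):} &\quad O(\sqrt{N})
\end{align*}
```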

Using the Shor factorization algorithm, researchers have been able to factor the number 15 with 7 qubits.

There are many quantum algorithms available today (see the Quantum Algorithm Zoo at NIST) with more showing up all the time.  Suffice it to say that quantum computing will be a more time efficient and thus, more effective approach to certain problems than classical computing.

Quantum computers starting to scale

Now back to the chip. According to the article, the new Google chip implements a 2×3 matrix of qubits.

For those old enough to remember, a 3-bit number was called an octal digit, ranging from 0 to 7, and two octal digits can range from 0..63. Octals were used for a long time to represent digital information on some (mostly mini-) computers. This is in contrast to most computing nowadays, which uses hexadecimal digits, or 4-bit numbers, ranging from 0..15, with two hexadecimal digits ranging from 0..255.
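As a quick sanity check on those ranges (my arithmetic, not from the article), a few lines of Python:

```python
# n base-b digits encode b**n values, i.e. the range 0 .. b**n - 1
for base, digits in [(8, 1), (8, 2), (16, 1), (16, 2)]:
    print(f"{digits} base-{base} digit(s): 0..{base ** digits - 1}")
# prints: 1 base-8 digit(s): 0..7    2 base-8 digit(s): 0..63
#         1 base-16 digit(s): 0..15  2 base-16 digit(s): 0..255
```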

Why are octals important? Well, if quantum computing can scale up to multiples of octal-sized qubit groups, it can start representing really large numbers. According to the article, Google chose the 2×3 qubit structure because it’s easier to scale.

I assume all the piping surrounding the chip package in the above photo are cooling ports. It seems that quantum computing only works at very cold temperatures, and if this is a two-octal computer, scaling it up to multiple octals is going to take lots of space.

How quickly will it scale?

For some history, Intel introduced their 4-bit 4004 computing chip in 1971 (Wikipedia), their 8-bit Intel 8008 in 1972 (Wikipedia), and their 16-bit Intel 8086, developed between 1976 and 1978. So in roughly 7 years we went from a 4-bit computer to a 16-bit computer whose (x86) architecture continues on today and rules the world.

Now, the Intel 4004 had 16 4-bit registers, a data/instruction bus that could address 4096 4-bit words, and a 3-level subroutine stack; it was a full-fledged 4-bit computer. It’s unclear what’s in Google’s chip. But if this 2×3-qubit computer can grow to have multiple 2×3-qubit registers, a qubit storage bus, a multi-level qubit subroutine (register) stack, etc., then we are well on our way to quantum computing being added to the world’s computational capabilities in less than 10 years.

And of course, Google’s not the only large organization working on quantum computing.

~~~~

So there you have it: Google and others are in the process of making today’s cryptography obsolete, rapidly speeding up unstructured search, and doing multiple other computations lots faster than is possible today.

Photo Credit(s): from the MIT Technical Review article.


Learning to live with lattices or say goodbye to security

safe ‘n green by Robert S. Donovan (cc) (from flickr)

Read an article the other day in Quanta Magazine, A tricky path to quantum encryption, about the problems that will occur in current public key cryptography (PKC) schemes when quantum computing emerges over the next five to 30 years. With advances in quantum computing, our current PKC schemes, which depend on the difficulty of factoring large numbers, will be readily crackable. At that point, all currently encrypted traffic, used by banks, the NSA, the internet, etc., will no longer be secure.

NSA, NIST, & ETSI looking at the problem

So there’s a search on for quantum-resistant cryptography (see this release from ETSI [European Telecommunications Standards Institute], this presentation from NIST [the USA’s National Institute of Standards & Technology], and this report from Schneier on Security on the NSA’s [the USA’s National Security Agency] plans for a post-quantum world). There are a number of alternatives being examined by all these groups, but the most promising at the moment depends on multi-dimensional (100s of dimensions) mathematical lattices.

Lattices?

According to Wikipedia, a lattice is a regularly repeating grid of points; the familiar example is 3-dimensional, with equidistant points. Apparently, for security reasons, the cryptographers had to increase the number of dimensions significantly beyond 3.

A secret is inscribed in a route (vector) through this 500-dimensional lattice between two points: an original point in the lattice (the public key) and another arbitrary point somewhere nearby. The cryptographic value is that finding such a route in a 500-dimensional lattice is a very hard task when you only have one of the points.
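For the mathematically inclined, here’s the standard formulation (my addition, not from the article): a lattice is the set of all integer combinations of some basis vectors, and the hard problem lattice cryptography leans on is finding the lattice point closest to a given target point.

```latex
% A lattice L generated by basis vectors b_1 ... b_n, and the closest
% vector problem (CVP): given a target point t (one "end" of the route),
% find the nearest lattice point. In hundreds of dimensions this is hard.
\[
  L \;=\; \Bigl\{\, \textstyle\sum_{i=1}^{n} a_i\, b_i \;:\; a_i \in \mathbb{Z} \,\Bigr\},
  \qquad
  \mathrm{CVP}(t) \;=\; \arg\min_{v \in L}\; \lVert v - t \rVert .
\]
```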

But can today’s digital computers use it efficiently?

So the various security groups have been working on devising efficient algorithms for multi-dimensional public key encryption over the past decade or so. But they have run into a problem.

Originally, the (public) keys for a 500-dimensional lattice PKC were on the order of MBs, so researchers restricted the lattice computations to use smaller keys, in effect reducing the complexity of the underlying lattice. But in the process they reduced the security of the lattice PKC scheme. So they are having to go back to longer keys and more complex lattices, while trying to ascertain which approach leaves communications secure yet is efficient enough to implement on today’s digital computers and communications links.

Quantum computing

The problem is that quantum computers provide a much faster way to perform certain calculations. For factoring, Shor’s algorithm offers a super-polynomial speedup over today’s digital computing; for generic search problems, the quantum speedup is on the order of the square root of the problem size.

It’s possible that quantum computing could also speed up the calculation of lattice routes between points by a similar square-root factor. So even when we all move to lattice-based PKC, it may still be possible for quantum computers to crack the code; the hope is that it would just take too long to matter.

So the mathematics behind PKC will need to change over the next 5 years or so as quantum computing becomes more of a reality. The hope is that this change will at least keep our communications secure until the next revolution in computing comes along, or until quantum computing becomes even faster than envisioned today.

Comments?