Niccolo Somaschi, CEO of Quandela, is interviewed by Yuval Boger. Niccolo describes the company’s approach to quantum computing, emphasizing the use of photonics and semiconductor technologies to manipulate photonic qubits. He highlights Quandela’s focus on building commercially useful quantum computers, the development of a unique programming framework for transforming gate-based circuits into photonic operations, the deployment of their 6-qubit processor, the challenges for scaling photonic quantum computers, and much more.

## Full Transcript

**Yuval Boger:** Hello, Niccolo, and thank you for joining me today.

**Niccolo Somaschi:** Hello, Yuval.

**Yuval**: So who are you and what do you do?

**Niccolo**: I’m the CEO of Quandela, a quantum computing company based in Europe, with headquarters in Paris, which I co-founded in 2017 together with Valerian Giesz and Pascale Senellart. Pascale is a senior scientist at the French National Research Centre (CNRS), a member of the Academy of Sciences, and of the presidential council for science.

Quandela builds quantum computers of increasing complexity, in the quest to build commercially useful quantum computers that implement error correction.

**Yuval**: And what is the underlying technology?

**Niccolo**: Quandela exploits photonics and semiconductor technologies. We manipulate photonic qubits, which are generated through semiconductor nanostructures, semiconductor devices that also serve as qubits themselves. So it’s a photonic quantum computer that exploits a few spin-based qubits, which is a bit of an exotic approach.

**Yuval**: How does one program a photonic quantum computer?

**Niccolo**: Well, photons serve as very efficient qubits because they allow different kinds of encoding. They can be encoded in “dual-rail” using integrated photonic circuits; that’s the main approach. But photons also permit “time-bin” encoding, both for logic operations and for qubit synchronization. The combination of optical fibers with photonic integrated circuits permits the development of quantum computing systems that are modular and interconnected. So it allows us to expand and distribute the computation, overcoming some of the issues, basically qubit size and space, that affect several platforms manipulating matter-based qubits.
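To make the dual-rail idea concrete, here is a minimal NumPy sketch (an illustration, not Quandela's stack): a logical qubit is one photon shared across two optical modes, and a lossless beamsplitter acts on those two modes as a 2x2 unitary, so in the single-photon subspace it implements a single-qubit gate.

```python
import numpy as np

# Dual-rail encoding: one photon across two optical modes.
# |0>_L = photon in mode 0, |1>_L = photon in mode 1.
ket0 = np.array([1.0, 0.0], dtype=complex)

def beamsplitter(theta):
    # A lossless beamsplitter is a 2x2 unitary on the two modes; for a
    # single photon, the logical state transforms by the same matrix.
    return np.array([[np.cos(theta), 1j * np.sin(theta)],
                     [1j * np.sin(theta), np.cos(theta)]])

bs_50_50 = beamsplitter(np.pi / 4)   # 50:50 splitter acts like a single-qubit gate
out = bs_50_50 @ ket0

probs = np.abs(out) ** 2             # detection probability per mode
print(probs)                         # -> [0.5 0.5]
```

Running the photon through a 50:50 splitter puts the logical qubit into an equal superposition, detected with probability 1/2 in each rail.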

**Yuval**: So that probably means I can’t just take a QASM program and send it to your computer, right? I have to do something else. If that’s true, what kind of applications or what kind of programs can I run on the computer?

**Niccolo**: Yes, that’s true. Photons also support an encoding that is purely photonic, as in Gaussian boson sampling or “standard” boson sampling for discrete variables. Quandela works with this approach as well. We can use photons as photons and work in a subspace of the Hilbert space, the Fock space, where the number of parameters scales faster than the Hilbert space for the same number of qubits. So this is a peculiar, interesting approach; but at the same time we can also treat photons as qubits and convert any gate-based circuit into a pure photonic implementation. That’s something Quandela also provides in its full-stack approach through its internally developed programming framework, called Perceval, which uses a graphical language method to optimally transform any CNOT or sequence of logic gates, as we would implement in Qiskit, for example, into a sequence of linear optical operations to be implemented on the hardware, through compilation and transpilation.
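The purely photonic encoding connects to matrix permanents: in linear optics, the amplitude for a given pattern of output photons is the permanent of a submatrix of the interferometer's unitary, which is classically hard to compute in general. A small illustrative Python sketch (Ryser's formula; the 2x2 example is a generic 50:50 beamsplitter, not any specific Quandela circuit):

```python
import itertools
import numpy as np

def permanent(a):
    """Matrix permanent via Ryser's formula (sums over 2^n column subsets)."""
    n = a.shape[0]
    total = 0.0
    for subset in itertools.product([0, 1], repeat=n):
        cols = [j for j in range(n) if subset[j]]
        if not cols:
            continue
        prod = np.prod([sum(a[i, j] for j in cols) for i in range(n)])
        total += (-1) ** len(cols) * prod
    return (-1) ** n * total

# Toy interferometer: a 50:50 beamsplitter. With one photon in each input,
# the amplitude for one photon in each output is the permanent of the
# full 2x2 unitary.
U = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)
amp = permanent(U)
print(abs(amp) ** 2)  # -> ~0: the coincidence is suppressed (Hong-Ou-Mandel effect)
```

The vanishing coincidence probability is the well-known Hong-Ou-Mandel interference; for larger interferometers, these permanents become intractable to compute classically, which is the basis of boson sampling's hardness.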

**Yuval**: And if I compare your computer today to one in two years, for instance, is the number of qubits the measure that you’re looking at, or is it something else? How do I compare photonic quantum computers?

**Niccolo**: Well, good point. From the user’s perspective, what will change is surely the computing power and the spectrum of possible algorithms we can implement, because it will depend on the number of qubits, of course, but also on the encoding. If we look bottom-up at the infrastructure, that will change a lot. In fact, internally we are moving to the second generation of machines compared to what we have today: from the manipulation of single photons, as in the processor available on the cloud today, to the direct manipulation of entangled photons. So the computing power will scale because the processor will directly manipulate entangled photons – graphs – without the need to pre-entangle them before manipulation. That will allow us to dramatically increase efficiency, reducing the number of hardware resources needed.

**Yuval**: You mentioned that transpiler from gate-based applications to the photonic way of thinking, but if I want to focus and get the maximum out of the computer, do I program in gate-based mode, or should I think about the program in a different way?

**Niccolo**: Today, with some industrial partners, we work on bringing out the power of the photonic platform by conceiving photonic algorithms directly. Certain gate-based algorithms can hardly be transferred into the photonic language; others transfer more efficiently, especially machine-learning-based and variational quantum algorithms, because we can manipulate general phases on the circuits. That’s where we are looking. Instead of trying to find algorithms for today’s quantum computing platforms or simulators by transferring a gate-based circuit into photonics, we take the orthogonal approach and look into classical algorithms that map directly into photonic quantum algorithms. This goes through mathematical problems based on permanents.

**Yuval**: You have some computers already deployed, right?

**Niccolo**: Yes, indeed. For about a year now, we have had Ascella, a 6-qubit processor, available on a proprietary cloud. At the same time, we are working to upgrade the platform on the cloud and to provide on-premise machines to data centers. That’s what we have been doing over the past year, providing a machine to the data center of OVHcloud, a cloud provider based in Europe.

**Yuval**: What is involved in on-prem deployment? Do you need a dilution refrigerator? Do you need a lot of space or power supply? How does a deployment of an optical quantum computer look?

**Niccolo**: Today, the on-premise quantum computing machines we provide to customers are rack-based, in standard server racks, which makes them seamlessly integrable in a data center – they take the shape of standard CPUs or GPUs. They should not be seen as such, though, because we do use cryogenics for the qubit generators and detectors, which work at 3 Kelvin, a much, much higher temperature than superconducting qubits. This means that in terms of cryogenic technology, we can use rack-size cryostats and rack-size air-cooled compressors. The entire machine – compressor included – consumes a total power below 3 kilowatts, plugged into standard sockets. These are the current machines. As we scale the number of qubits, we will multiply the number of these racks.

**Yuval**: When you think about a 6-qubit machine, obviously there’s learning you can do with it on the way to machines with larger qubit counts. But is there something truly useful one could do with a 6-qubit machine? And I’m sure the same question could be asked about 16 or 26 qubits. What are people doing with it?

**Niccolo**: Today, of course, we aim at utility in the sense of scientific discovery, approximation of some algorithms, and running toy models. However, one focus we have had since the beginning relates to cryptography, and that’s what the first customer purchased the first machine for. The application we run on this computer is generating quantum-certified random numbers, similar to what Quantinuum and IBM have developed on their platforms. What we have done is develop the theoretical framework to use contextuality, perform Bell inequality measurements on-chip, and from the violation extract quantum-certified random numbers. This is a simple application, but powerful in itself. We can even call it a quantum advantage – not a quantum computing advantage, but a quantum advantage in cryptography, where quantum helps increase the level of security compared to classical methods.
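As an illustration of the Bell-inequality idea behind certified randomness (a textbook CHSH calculation, not Quandela's on-chip protocol), the sketch below computes the CHSH value for a maximally entangled two-qubit state; any value above the classical bound of 2 certifies genuinely quantum correlations, which is what underwrites the randomness.

```python
import numpy as np

Z = np.array([[1, 0], [0, -1]], dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)

def meas(theta):
    # Spin measurement along angle theta in the X-Z plane
    return np.cos(theta) * Z + np.sin(theta) * X

# Singlet state (|01> - |10>) / sqrt(2)
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(a, b):
    # Correlation <psi| A(a) (x) B(b) |psi>
    return np.real(psi.conj() @ np.kron(meas(a), meas(b)) @ psi)

# Standard CHSH measurement angles for maximal violation
a, ap, b, bp = 0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
print(abs(S))  # -> ~2.828 (= 2*sqrt(2)), above the classical bound of 2
```

No local classical model can exceed |S| = 2, so observing ~2.83 certifies that the measurement outcomes contain genuine, unpredictable quantum randomness.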

**Yuval**: You mentioned IBM and Quantinuum, and I think certainly IBM, when they think about a larger number of qubits, they say at some point we’re going to need to use interconnects. IonQ, I think the same thing. They have optical interconnects. Now, given that your system is optical from the get-go, how does it scale? Do you need multiple machines and somehow interconnect them, or do you just add more fibers?

**Niccolo**: A bit of both. As I was saying in the beginning, today we use semiconductors as matter-based qubits to generate light, to generate the qubits. So the photon itself, the qubit, part of the computation, exits parts of the machine and runs through fibers. For us, qubits are already shared across different modules, which helps in designing machines with larger qubit counts, because for us, increasing the number of qubits means adding modules.

Of course, a lot of attention must go into stabilizing the qubits through fiber and optimizing the optical links. But yes, this is indeed the approach, and we see it as a good way to start because we bypass some of the issues related to manufacturing a large-size processor itself.

**Yuval**: I believe you said you started the company in 2017, and maybe you were in quantum a little bit before that. So you’re definitely an expert. What have you learned? What’s new in the last six or 12 months that you didn’t know prior to that?

**Niccolo**: Clearly many things, related to the business and the technology – that we should expect surprises along the way. Not scientific surprises, because we know where we are going, but surprises in the sense of players or approaches that could disrupt both the market and the technology – by merging technologies, by changing materials, and thereby enabling more algorithms. Hence the idea that we do need to build quantum computers, because if we put them out there, someone will use them. Instead of just waiting for error-corrected quantum computers with 1 million, 2 million, 5 million qubits, we want to build the ones in between, because someone will innovate through them. From what we see, some surprises could come from the users who start to test these intermediate machines.

**Yuval**: You mentioned 1 million qubits, and that conjures some other optical company. Are all optical computers the same in terms of approach, or is there something substantially different between one optical manufacturer and another?

**Niccolo**: There are many differences. That’s also why photonics was put aside for a while: it’s complex to understand, conceptually different, and comes in many flavors. Just as an example, photonic quantum computers can use discrete variables or continuous variables; this is the main difference. If we scratch the surface and look into discrete variables, we find different kinds of architecture. Some are inspired by measurement-based quantum computing, which differs from gate-based approaches. Some use different materials and technologies to generate qubits. For example, we use semiconductor nanostructures – quantum dots – while most other players use non-linear processes built into silicon, silicon nitride, or other waveguides. There is a kind of internal creativity in the photonic quantum computing community, and surprises will probably also come from emerging or different technologies and approaches.

**Yuval**: When you install a system, I think you mentioned OVH, just one of your first customers. Is the computer standalone, or is it used often together with classical compute resources?

**Niccolo**: So this one is standalone. But Perceval, the programming framework, has all the APIs needed to connect to an internal cloud – for example OVHcloud’s, though they are not opening this machine to their customers because it’s used for internal research purposes. This means we could directly connect Perceval to other machines, such as GPUs or the Atos simulator, and optimize it to run parts of the algorithms on those. This is something we already do with the simulator: the quantum simulator shares part of the algorithm with a classical GPU. We see this as very important for the machines to come, especially from the user’s perspective.

**Yuval**: What is the limiting factor in scaling the machine? Is it laser power? Is it miniaturization? What would it take to take this machine and make it much, much, much larger?

**Niccolo**: The main engineering problem is losses. It’s the transparency of the elements, the capability to generate one qubit, make it pass through the entire chain of systems and modules, and have it detected. The overall transmission has to be very high for a certain number of qubits to reach the threshold for error correction. That’s the hardest engineering problem for all the companies to face. Photonic qubits don’t undergo decoherence as such; the error is not decoherence, phase flips, or bit flips, but losses. A photon is either there or not, and losing a photon is very easy. So at Quandela, what we say is that “every photon counts”.
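The loss budget multiplies: if each element transmits only a fraction of the light, the end-to-end transmission is the product of those fractions, and an n-photon computation succeeds only when every photon survives. A toy sketch with made-up numbers (illustrative only, not Quandela's specs):

```python
# Hypothetical per-element transmissions (illustrative, not real specs)
source = 0.90     # source efficiency
chip = 0.80       # interferometer chip transmission
coupling = 0.90   # fiber/chip coupling
detector = 0.95   # detector efficiency

# End-to-end survival probability for one photon is the product
eta = source * chip * coupling * detector

# An n-photon experiment succeeds only if all n photons survive,
# so the success probability decays exponentially in n.
for n in (1, 6, 12):
    print(n, eta ** n)
```

Even with each element above 80% transmission, the 12-photon success rate here drops below 1%, which is why overall transparency, rather than decoherence, dominates the engineering effort.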

**Yuval**: As we get close to the end of our conversation, I’m curious, professionally speaking, what keeps you up at night?

**Niccolo**: Many things. There are many nights in a week, so I spread my thoughts across them. It goes by period, and I would say engineering is, of course, the main one: how to ensure the teams always get the best providers and the best technologies, and can do their research without worrying about anything else. For example, we recently launched our own internal cleanroom to start developing the fabrication process for most of the devices internally. This brings a lot of infrastructure work and problems related to moving fast and transferring the fabrication chain from one facility to another. It needs a lot of attention, and it’s an example of something we have been very careful to do properly.

**Yuval**: One thing I wanted to go back to, I think I understand the use in random number generation, but you mentioned machine learning. What makes photonic computers particularly useful for machine learning applications?

**Niccolo**: As I mentioned before, machine learning involves manipulating a large set of parameters without controlling each parameter individually. The photonic approach allows the manipulation of sets of qubits – manipulating a general phase, for example – without controlling each one. There are also many similarities, in computer science terms, between this approach and neural networks, via the large interferometer, which is the main manipulating object in photonic circuits. That’s why we’re trying to find a smart way to map these kinds of mathematical problems, based on permanents, to related ones in standard computer science – to some small or large set of problems that can still impact industries.

**Yuval**: And last hypothetical, if you could have dinner with one of the quantum greats, dead or alive, who would that person be?

**Niccolo**: Oh, wow. Let’s say Heisenberg.

**Yuval**: And why?

**Niccolo**: Because there are many questions that I would ask on the general fundamental sense, which I couldn’t find in many of his books.

**Yuval**: Very good. Niccolo, thank you so much for joining me today.

**Niccolo**: Thank you, Yuval.