We consider the asymptotic regularization of the maximal fidelity for the generalized Pauli channels, a problem analogous to that of the classical channel capacity. In particular, we find formulas for the extremal channel fidelities and the maximal output $\infty$-norm. For wide classes of channels, we show that these quantities are multiplicative. Finally, we find the regularized maximal fidelity for channels satisfying time-local master equations.

We extend spherical-code-based key distribution protocols to qudits of dimensions 4 and 16 by constructing equiangular frames and their companions. We provide methods for constructing equiangular frames in arbitrary dimensions for Alice's code, together with companion frames, each containing one antipode that eliminates one of the possibilities, made up of qudits with $N=4,16$ for Bob's code. Non-orthogonal bases that form positive operator-valued measures can be constructed using the tools of frames (overcomplete bases of a Hilbert space); here we apply them to key distribution protocols that are robust against eavesdropping due to the large size of the bases. We demonstrate a method to construct a companion frame for an equiangular tight frame in $\mathbb{C}^{p-1}$ generated from the discrete Fourier transform, where $p$ is any odd prime. The security analysis is based on the assumption that possible attacks are restricted to the intercept/resend scenario, and highlights the advantages of qudit- over qubit-based protocols.
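The DFT-based equiangular tight frame mentioned above can be checked numerically. A minimal sketch, assuming the standard simplex construction (delete the constant row of the $p\times p$ DFT matrix and renormalize the columns); the companion-frame construction of the paper is not reproduced here:

```python
import numpy as np

p = 7  # any odd prime
omega = np.exp(2j * np.pi / p)

# Columns of the p x p DFT matrix with the first (constant) row removed,
# renormalized to unit length: p vectors in C^{p-1}.
F = omega ** np.outer(np.arange(1, p), np.arange(p))
V = F / np.sqrt(p - 1)

# Gram matrix: unit diagonal, all off-diagonal entries of modulus 1/(p-1).
G = V.conj().T @ V
off = np.abs(G[~np.eye(p, dtype=bool)])
assert np.allclose(np.abs(np.diag(G)), 1.0)
assert np.allclose(off, 1.0 / (p - 1))          # equiangularity

# Tightness: the frame operator is a multiple of the identity.
S = V @ V.conj().T
assert np.allclose(S, (p / (p - 1)) * np.eye(p - 1))
```

The geometric picture is that these $p$ unit vectors form a regular simplex in $\mathbb{C}^{p-1}$, which is the extreme case of an equiangular tight frame.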

The ability to generate light in a pure quantum state is essential for advances in optical quantum technologies. However, obtaining quantum states with control over the photon number has remained elusive. Optical light fields with zero and one photon can be produced by single atoms, but such sources have so far been limited to incoherent mixtures, or to coherent superpositions with a very small one-photon term. Here, we report on the on-demand generation of quantum superpositions of zero, one, and even two photons, via pulsed coherent control of a single artificial atom. Driving the system up to full atomic inversion leads to the generation of quantum superpositions of vacuum and one photon, with their relative populations controlled by the driving laser intensity. A stronger driving of the system, with $2\pi$-pulses, results in a coherent superposition of vacuum, one and two photons, with the two-photon term exceeding the one-photon component, a state allowing phase super-resolving interferometry. Our results open new paths for optical quantum technologies with access to the photon-number degree of freedom.

The quest for an integrated quantum optics platform involving solid state waveguides and cavities combined with semiconductor quantum dots has motivated the field of quantum semiconductor research for two decades. Demonstrations of discrete quantum light sources, single photon switches, transistors, and quantum memory-photon interfaces have become very advanced. Yet the fundamental problem that every quantum dot is different prevents integration and scaling beyond a few quantum dots. Here, we address this challenge by patterning strain via local phase transitions to selectively tune individual quantum dots that are embedded in a photonic architecture. The patterning is implemented with in operando micro-laser crystallization of a thin HfO$_2$ film "sheath" on the surface of a GaAs waveguide. Using this approach, we tune InAs quantum dot emission energies over the full inhomogeneous distribution with a step size down to the homogeneous linewidth and a spatial resolution better than 1~$\mu$m. As a demonstration, we tune three quantum dots to the same energy within a waveguide. This approach is general, and the same principles extend to other solid-state quantum emitters and photonic structures.

The phase factor $(-1)^{2s}$ that features in the exchange symmetry for identical spin-$s$ fermions or bosons contributes to but is not identical to the phase factor one may observe when one actually exchanges two such particles. The observable phase contains {\em single-particle} geometric and dynamical phases as well, induced by both spin and spatial exchange transformations. Extending the analysis to (abelian) anyons by incorporating the Aharonov-Bohm effect shows that the topological phase two anyons pick up under an interchange is at bottom a single-particle effect, in contrast to the fermion/boson exchange phase factor $(-1)^{2s}$, which is a two-particle effect.

A central tenet of theoretical cryptography is the study of the minimal assumptions required to implement a given cryptographic primitive. One such primitive is the one-time memory (OTM), introduced by Goldwasser, Kalai, and Rothblum [CRYPTO 2008], which is a classical functionality modeled after a non-interactive 1-out-of-2 oblivious transfer, and which is complete for one-time classical and quantum programs. It is known that secure OTMs do not exist in the standard model in both the classical and quantum settings.

Here, we propose a scheme for using quantum information, together with the assumption of stateless (i.e., reusable) hardware tokens, to build statistically secure OTMs. Via the semidefinite programming-based quantum games framework of Gutoski and Watrous [STOC 2007], we prove security for a malicious receiver, against a linear number of adaptive queries to the token, in the quantum universal composability framework. We prove stand-alone security against a malicious sender, but leave open the question of composable security against a malicious sender, as well as security against a malicious receiver making a polynomial number of adaptive queries. Compared to alternative schemes derived from the literature on quantum money, our scheme is technologically simple since it is of the "prepare-and-measure" type. We also show that our scheme is "tight" according to two scenarios.

In this paper, we construct a new scheme for delegating a large circuit family, which we call "C+P" circuits: circuits composed of Toffoli gates and diagonal gates. Our scheme is non-interactive, requires only small quantum resources on the client side, and can be proved secure in the quantum random oracle model without relying on additional assumptions, such as the existence of fully homomorphic encryption. In practice, the random oracle can be replaced by an appropriate hash function, for example SHA-3 or AES.

This protocol allows a client to delegate the most expensive part of some quantum algorithms, for example Shor's algorithm. Previous protocols powerful enough to delegate Shor's algorithm require either many rounds of interaction or the existence of fully homomorphic encryption (FHE). The quantum resources required by the client are fewer than those needed to run Shor's algorithm locally.

Unlike many previous protocols, our scheme is based not on the quantum one-time pad but on a new encoding called "entanglement encoding". We then generalize the garbled circuit to a reversible garbled circuit to allow computation on this encoding.

To prove the security of this protocol, we study key-dependent-message (KDM) security in the quantum random oracle model. As a natural generalization, we then define and study quantum KDM security, which had not previously been studied in the quantum setting.

Color centers in solids are the fundamental constituents of a plethora of applications such as lasers, light-emitting diodes and sensors, as well as the foundation of advanced quantum information and communication technologies. Their photoluminescence properties are usually studied under Stokes excitation, in which the emitted photons are at a lower energy than the excitation ones. In this work, we explore the opposite anti-Stokes process, where excitation is performed with lower energy photons. We report that the process is sufficiently efficient to excite even a single quantum system, namely the germanium-vacancy center in diamond. Consequently, we leverage the temperature-dependent, phonon-assisted mechanism to realize an all-optical nanoscale thermometry scheme that outperforms any homologous optical method employed to date. Our results frame a promising approach for exploring fundamental light-matter interactions in isolated quantum systems, and for harnessing them towards practical nanoscale thermometry and sensing.

Simulation of fermionic many-body systems on a quantum computer requires a suitable encoding of the fermionic degrees of freedom into qubits. Here we revisit the Superfast Encoding introduced by Kitaev and one of the authors. This encoding maps a target fermionic Hamiltonian with two-body interactions on a graph of degree $d$ to a qubit simulator Hamiltonian composed of Pauli operators of weight $O(d)$. A system of $m$ fermionic modes gets mapped to $n=O(md)$ qubits. We propose Generalized Superfast Encodings (GSE), which require the same number of qubits as the original encoding but have more favorable properties. First, we describe a GSE such that the corresponding quantum code corrects any single-qubit error, provided that the interaction graph has degree $d\ge 6$. In contrast, we prove that the original Superfast Encoding lacks the error correction property for $d\le 6$. Second, we describe a GSE that reduces the Pauli weight of the simulator Hamiltonian from $O(d)$ to $O(\log{d})$. The robustness against errors and the simplified structure of the simulator Hamiltonian offered by GSEs can bring the simulation of fermionic systems within reach of near-term quantum devices. As an example, we apply the new encoding to the fermionic Hubbard model on a 2D lattice.

We speculate whether the second law of thermodynamics has more to do with Turing machines than steam pipes. It states the logical reversibility of reality as a computation, i.e., the fact that no information is forgotten: nature computes with Toffoli gates, not NAND gates. On the way there, we correct Landauer's erasure principle by linking it directly to lossless data compression, and we further develop this into a lower bound on the energy consumption and heat dissipation of a general computation.
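The contrast between Toffoli and NAND gates drawn above can be made concrete: the Toffoli gate is a bijection on three bits (and hence logically reversible, in fact its own inverse), while NAND maps four input pairs onto only two outputs and so must discard information. A minimal sketch:

```python
from itertools import product

def toffoli(a, b, c):
    # Flip the target bit c iff both control bits a, b are 1.
    return (a, b, c ^ (a & b))

def nand(a, b):
    return 1 - (a & b)

inputs = list(product([0, 1], repeat=3))
outputs = [toffoli(*x) for x in inputs]
assert sorted(outputs) == sorted(inputs)          # bijection: nothing is erased
assert all(toffoli(*toffoli(*x)) == x for x in inputs)  # self-inverse

# NAND compresses 4 inputs onto 2 outputs: information is necessarily lost.
assert len({nand(a, b) for a, b in product([0, 1], repeat=2)}) == 2
```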

Within ordinary ---unitary--- quantum mechanics there exist global protocols that allow one to verify that no definite event ---an outcome to which a probability can be associated--- occurs. Instead, states that start in a coherent superposition over possible outcomes always remain in a superposition. We show that, when one takes into account the fundamental errors in measuring length and time intervals that have been put forward as a consequence of combined quantum mechanical and general relativistic arguments, there are instances in which such global protocols no longer allow one to distinguish whether the state is in a superposition or not. All predictions become identical to those obtained if one of the outcomes occurs, with a probability determined by the state. We use this as a criterion to define events, as put forward in the Montevideo Interpretation of Quantum Mechanics. We analyze in detail the occurrence of events in the paradigmatic case of a particle in a superposition of two different locations. We argue that our approach provides a consistent (C) single-world (S) picture of the universe, thus allowing an economical way out of the limitations imposed by a recent theorem of Frauchiger and Renner showing that having a self-consistent single-world description of the universe is incompatible with quantum theory. In fact, the main observation of this paper may be stated as follows: if quantum mechanics is extended to a theory QG that includes gravitational effects, then QG, S, and C are satisfied.

We introduce here a new axiomatisation of the rational fragment of the ZX-calculus, a diagrammatic language for quantum mechanics. Compared to the previous axiomatisation introduced in [8], ours does not use any metarule, but relies instead on a more natural rule, called the cyclotomic supplementarity rule, which was introduced previously in the literature. Our axiomatisation is complete only for diagrams using rational angles, and is not complete in the general case. Using results from Diophantine geometry, we characterize precisely which diagram equalities involving arbitrary angles are provable in our framework without any new axioms, and we show that our axiomatisation is continuous, in the sense that a diagram equality involving arbitrary angles is provable iff it is a limit of diagram equalities involving rational angles. We use this result to give a complete characterization of all Euler equations that are provable in this axiomatisation.

We theoretically propose and experimentally implement a method to measure a qubit by driving it close to the frequency of a dispersively coupled bosonic mode. The separation of the bosonic states corresponding to different qubit states begins essentially immediately and at the maximum rate, leading to a speedup of the measurement protocol. Moreover, the bosonic mode can be simultaneously driven to optimize measurement speed and fidelity. We test this measurement protocol experimentally using a superconducting qubit coupled to a resonator mode. For a given measurement time, we observe that the conventional dispersive readout yields a measurement error more than 100% higher than that of our protocol. Finally, we use an additional resonator drive to return the resonator state to vacuum during the measurement protocol if the qubit is in the excited state. This suggests that the proposed measurement technique may become useful for unconditionally resetting the resonator to the vacuum state after the measurement pulse.

We present a pedagogical account of the tridiagonal representation approach, an algebraic method for solving the Schr\"{o}dinger equation of nonrelativistic quantum mechanics for conventional potential functions. As an example, we use it to solve a recently encountered three-parameter potential function.

For multipartite states we consider a notion of D-symmetry. For a system of $N$ qubits it coincides with the usual permutational symmetry. In the case of $N$ qudits ($d\geq 3$), D-symmetry is stronger than permutational symmetry. For the space of all D-symmetric vectors in $(\mathbb{C}^d)^{\otimes N}$ we define a basis composed of vectors $\{|R_{N,d;k}\rangle: \,0\leq k\leq N(d-1)\}$. The aim of this paper is to discuss the separability problem for D-symmetric states that are diagonal in the basis $\{|R_{N,d;k}\rangle\}$. We show that if $N$ is even and $d\geq 2$ is arbitrary, then the PPT property is a necessary and sufficient condition for separability of D-invariant diagonal states. In this way we generalize results obtained by Yu for qubits. Our strategy is to use classical mathematical results on the moment problem.

Topological phases of matter have attracted much attention over the years. Motivated by analogy with photonic lattices, here we examine the edge states of a one-dimensional trimer lattice in the phases with and without inversion symmetry protection. In contrast to the Su-Schrieffer-Heeger model, we show that the edge states in the inversion-symmetry broken phase of the trimer model turn out to be chiral, i.e., instead of appearing in pairs localized at opposite edges they can appear at a $\textit{single}$ edge. Interestingly, these chiral edge states remain robust to large amounts of disorder. In addition, we use the Zak phase to characterize the emergence of degenerate edge states in the inversion-symmetric phase of the trimer model. Furthermore, we capture the essentials of the whole family of trimers through a mapping onto the commensurate off-diagonal Aubry-Andr\'{e}-Harper model, which allows us to establish a direct connection between chiral edge modes in the two models, including the calculation of Chern numbers. We thus suggest that the chiral edge modes of the trimer lattice have a topological origin inherited from this effective mapping. Also, we find a nontrivial connection between the topological phase transition point in the trimer lattice and the one in its associated two-dimensional parent system, in agreement with results in the context of Thouless pumping in photonic lattices.

A complete characterization of the set of states that can be achieved through Thermal Processes (TPs) is given by describing all vertices, edges and facets of the allowed set of states in the language of thermomajorization curves. TPs are linked to transportation matrices, which leads to the existence, for every dimension $d\geq 4$ of the state space, of extremal TPs that are not required for the implementation of any transition allowed by TPs. A property of the associated graphs, biplanarity, which differentiates these extremal TPs from the necessary ones, is identified.
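For reference, a thermomajorization curve is built by Gibbs-rescaled sorting: order the outcomes by the ratio $p_i/\gamma_i$ in non-increasing order ($\gamma$ being the Gibbs distribution) and accumulate the pairs $(\sum\gamma_i, \sum p_i)$. A minimal sketch of this standard construction (the example state and bath are illustrative, not taken from the paper):

```python
import numpy as np

def thermomajorization_curve(p, gamma):
    """Vertices of the thermomajorization (Gibbs-rescaled Lorenz) curve of
    distribution p relative to the Gibbs distribution gamma."""
    p, gamma = np.asarray(p, float), np.asarray(gamma, float)
    order = np.argsort(-p / gamma)          # non-increasing p_i / gamma_i
    x = np.concatenate([[0.0], np.cumsum(gamma[order])])
    y = np.concatenate([[0.0], np.cumsum(p[order])])
    return x, y

# Example: a qutrit distribution against an illustrative Gibbs distribution.
gamma = [0.5, 0.3, 0.2]
p = [0.2, 0.3, 0.5]
x, y = thermomajorization_curve(p, gamma)
assert np.isclose(x[-1], 1.0) and np.isclose(y[-1], 1.0)

# The curve is concave: the segment slopes p_i/gamma_i are non-increasing.
slopes = np.diff(y) / np.diff(x)
assert np.all(np.diff(slopes) <= 1e-12)
```

A state $p$ can be reached from $q$ by a TP iff the curve of $q$ lies nowhere below the curve of $p$, which is why the geometry of these curves (vertices, edges, facets) characterizes the reachable set.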

Quantum algorithms can deliver asymptotic speedups over their classical counterparts. However, there are few cases where a substantial quantum speedup has been worked out in detail for reasonably-sized problems, when compared with the best classical algorithms and taking into account realistic hardware parameters and overheads for fault-tolerance. All known examples of such speedups correspond to problems related to simulation of quantum systems and cryptography. Here we apply general-purpose quantum algorithms for solving constraint satisfaction problems to two families of prototypical NP-complete problems: boolean satisfiability and graph colouring. We consider two quantum approaches: Grover's algorithm and a quantum algorithm for accelerating backtracking algorithms. We compare the performance of optimised versions of these algorithms, when applied to random problem instances, against leading classical algorithms. Even when considering only problem instances that can be solved within one day, we find that there are potentially large quantum speedups available. In the most optimistic parameter regime we consider, this could be a factor of over $10^5$ relative to a classical desktop computer; in the least optimistic regime, the speedup is reduced to a factor of over $10^3$. However, the number of physical qubits used is extremely large, and improved fault-tolerance methods will likely be needed to make these results practical. In particular, the quantum advantage disappears if one includes the cost of the classical processing power required to perform decoding of the surface code using current techniques.
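The source of the quadratic speedup in the Grover-based approach above is the query count: searching $N = 2^n$ candidate assignments with $M$ solutions takes about $\frac{\pi}{4}\sqrt{N/M}$ oracle calls instead of $\Theta(N/M)$ classical trials. A minimal sketch of this standard counting argument (oracle queries only; the fault-tolerance and decoding overheads stressed in the text are deliberately ignored here):

```python
import math

def grover_queries(n_bits, n_solutions=1):
    """Optimal number of Grover iterations for searching N = 2**n_bits
    items with M = n_solutions marked items: about (pi/4) * sqrt(N/M)."""
    N = 2 ** n_bits
    return max(1, round(math.pi / 4 * math.sqrt(N / n_solutions)))

# Quadratic scaling: ~2^(n/2) quantum queries vs ~2^n classical trials,
# so the raw query-count ratio grows exponentially with n.
for n in (20, 40, 60):
    q = grover_queries(n)
    print(f"n = {n}: {q} quantum queries, ratio ~ 2^{n} / {q}")
```

Whether this raw advantage survives as a wall-clock speedup is exactly the question the abstract addresses: per-query costs of fault-tolerant hardware and surface-code decoding can consume much of the factor.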

Optimal finite-time thermodynamic processes can be constructed by introducing a metric on the thermodynamic state space, known as the thermodynamic length. Here, we show how to construct a thermodynamic metric for processes involving a small quantum system, whose Hamiltonian can be externally controlled, in contact with a thermal bath. First, we consider the case in which the Hamiltonian of the system is modified by a sequence of quenches, each followed by equilibration with the bath; a linear-response approach shows that the metric is induced by the Kubo-Mori-Bogoliubov (KMB) inner product. Second, we consider the case in which the Hamiltonian of the system is continuously modified and the action of the bath is captured by a Lindbladian master equation, and find the corresponding thermodynamic metric. The latter approach contains the KMB metric as a particular case and, in general, enables the design of better finite-time protocols, since it takes into account the nature of the dissipative dynamics. These results are illustrated in a paradigmatic open quantum two-level system.
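For reference, the KMB inner product mentioned above has the following standard form (stated here in its usual normalization, which may differ from the paper's conventions), where $\rho$ is the thermal state of the controlled Hamiltonian $H$ at inverse temperature $\beta$:

```latex
\langle A, B \rangle_{\mathrm{KMB}}
  \;=\; \int_0^1 \operatorname{Tr}\!\left[ \rho^{s}\, A^{\dagger}\, \rho^{1-s}\, B \right] \mathrm{d}s,
\qquad
\rho \;=\; \frac{e^{-\beta H}}{\operatorname{Tr} e^{-\beta H}} .
```

When $A$ and $B$ commute with $\rho$ this reduces to the ordinary covariance $\operatorname{Tr}[\rho\, A^{\dagger} B]$, which is why the quench-and-equilibrate limit recovers classical thermodynamic length.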

It is well known that the violation of a local uncertainty relation can be used as an indicator of the presence of entanglement. Unfortunately, the practical use of these non-linear witnesses has in the past been limited to a few special cases. Recently, however, new methods for computing uncertainty bounds have become available. Here we report on an experimental implementation of uncertainty-based entanglement witnesses, benchmarked in a regime dominated by strong local noise. We combine the new computational method with a local noise tomography in order to design noise-adapted entanglement witnesses. This proof-of-principle experiment shows that quantum noise can be successfully handled by a fully quantum model in order to enhance entanglement detection efficiencies.