We prove a characterization of $t$-query quantum algorithms in terms of the unit ball of a space of degree-$2t$ polynomials. Based on this, we obtain a refined notion of approximate polynomial degree that equals the quantum query complexity, answering a question of Aaronson et al. (CCC'16). Our proof is based on a fundamental result of Christensen and Sinclair (J. Funct. Anal., 1987) that generalizes the well-known Stinespring representation for quantum channels to multilinear forms. Using our characterization, we show that many polynomials of degree four are far from those coming from two-query quantum algorithms. We also give a simple and short proof of one of the results of Aaronson et al. showing an equivalence between one-query quantum algorithms and bounded quadratic polynomials.

The traditional method for measuring continuous-variable quantum entanglement relies on balanced homodyne detection, which is sensitive to vacuum quantum noise coupled in through losses resulting from factors such as the detector's quantum efficiency and mode mismatch between the detected field and the local oscillator. In this paper, we propose and analyze a new measurement method, realized by assisting the balanced homodyne detections with a high-gain phase-sensitive parametric amplifier. The high-gain parametric amplifier helps to tackle the vacuum quantum noise originating from detection losses. Moreover, because the high-gain parametric amplifier can couple two fields of different types in a phase-sensitive manner, the proposed scheme can reveal quantum entanglement between two fields of different types using only one balanced homodyne detection. Furthermore, detailed analysis shows that the proposed scheme is also advantageous over the traditional method in the multi-mode case. Such a measurement method should find wide applications in quantum information and quantum metrology involving measurement of continuous variables.

We realize Surface Code quantum memories for nearest-neighbor qubits with always-on Ising interactions. This is done by utilizing multi-qubit gates that mimic the functionality of several gates. Previously proposed Surface Code memories rely on error syndrome detection circuits based on CNOT gates. In a two-dimensional planar architecture, to realize a two-qubit CNOT gate in the presence of couplings to other neighboring qubits, the interaction of the target qubit with its three other neighbors must cancel out. Here we present a new error syndrome detection circuit utilizing multi-qubit parity gates. In addition to speeding up the error correction cycles, in our approach the depth of the error syndrome detection circuit does not grow with the number of qubits in the logical qubit layout. We analytically design the system parameters to realize new five-qubit gates suitable for error syndrome detection in a nearest-neighbor two-dimensional array of qubits. The five-qubit gates are designed such that the middle qubit is the target qubit and all four coupled neighbors are the control qubits. In our scheme, only one control parameter of the target qubit must be adjusted to realize controlled-unitary operations. The gate operations are confirmed with a fidelity of >99.9% in a simulated system consisting of nine nearest-neighbor qubits.

Wigner's-friend-type thought experiments manifest the conceptual challenge of how different observers can have consistent descriptions of a quantum measurement event. In this paper, we analyze the extended version of the Wigner's friend thought experiment (Frauchiger and Renner, Nature Commun. 9, 3711 (2018)) in detail and show that the reasoning process of each agent that leads to the no-go theorem is inconsistent. The inconsistency is with respect to the requirement that an agent should make use of updated information instead of outdated information. We then apply the relational formulation of quantum measurement to resolve the inconsistent descriptions from different agents. In the relational formulation of quantum mechanics, a measurement is described relative to an observer. Synchronization of measurement results is a necessary requirement for achieving consistent descriptions of a quantum system from different observers. Thought experiments, including EPR, Wigner's friend, and its extended version, confirm the necessity of the relational formulation of quantum measurement when applying quantum mechanics to composite systems with entangled but space-like separated subsystems.

Output probability distributions of several sub-universal quantum computing models cannot be classically efficiently sampled unless some unlikely consequences occur in classical complexity theory, such as the collapse of the polynomial-time hierarchy. These results, so-called quantum supremacy, do not, however, rule out the possibility of super-polynomial-time classical simulations. In this paper, we study a "fine-grained" version of quantum supremacy that excludes some exponential-time classical simulations. First, we focus on two sub-universal models, namely the one-clean-qubit model (or the DQC1 model) and the HC1Q model. Assuming certain conjectures in fine-grained complexity theory, we show that for any $a>0$ the output probability distributions of these models cannot be classically sampled within a constant multiplicative error in $2^{(1-a)N+o(N)}$ time, where $N$ is the number of qubits. Next, we consider universal quantum computing. For example, we consider quantum computing over Clifford and $T$ gates and show that, under another fine-grained complexity conjecture, output probability distributions of Clifford-$T$ quantum computing cannot be classically sampled in $2^{o(t)}$ time within a constant multiplicative error, where $t$ is the number of $T$ gates.

We report generation and measurement of a squeezed vacuum from a semi-monolithic Fabry-Perot optical parametric oscillator (OPO) up to 100 MHz at 1550 nm. The output coupler of the OPO is a flat surface of a nonlinear crystal with a partially reflecting coating, which enables direct coupling with waveguide modules. Using the OPO, we observed 6.2 dB of squeezing at 2 MHz and 3.0 dB of squeezing at 100 MHz. The OPO operates at the optimal wavelength for minimizing propagation losses in silica waveguides and points towards solving a bottleneck in downsizing these experiments: the coupling between a squeezer and a waveguide.
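The squeezing levels quoted above relate the measured quadrature noise variance to the vacuum level through a standard logarithmic formula, $S\,[\mathrm{dB}] = -10\log_{10}(V/V_{\rm vac})$. A minimal sketch of the conversion, with illustrative function names not taken from the paper:

```python
import math

def squeezing_db(variance_ratio):
    """Squeezing level in dB for a quadrature variance V given relative to vacuum (V/V_vac)."""
    return -10.0 * math.log10(variance_ratio)

def variance_ratio(squeezing_level_db):
    """Inverse: quadrature variance relative to vacuum for a given squeezing level in dB."""
    return 10.0 ** (-squeezing_level_db / 10.0)

# 6.2 dB of squeezing corresponds to a noise variance of about 24% of vacuum noise
print(round(variance_ratio(6.2), 3))  # 0.24
```

By this convention, the 3.0 dB observed at 100 MHz corresponds to a quadrature variance of roughly half the vacuum level.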

We show that a class of random all-to-all spin models, realizable in systems of atoms coupled to an optical cavity, gives rise to a rich dynamical phase diagram due to the pairwise separable nature of the couplings. By controlling the experimental parameters, one can tune between integrable and chaotic dynamics on the one hand, and between classical and quantum regimes on the other hand. For two special values of a spin-anisotropy parameter, the model exhibits rational-Gaudin type integrability and it is characterized by an extensive set of spin-bilinear integrals of motion, independent of the spin size. More generically, we find a novel integrable structure with conserved charges that are not purely bilinear. Instead, they develop `dressing tails' of higher-body terms, reminiscent of the dressed local integrals of motion found in Many-Body Localized phases. Surprisingly, this new type of integrable dynamics found in finite-size spin-1/2 systems disappears in the large-$S$ limit, giving way to classical chaos. We identify parameter regimes for characterizing these different dynamical behaviors in realistic experiments, in light of the limitations set by cavity dissipation.

We discuss the procedure for gauging on-site $\mathbb{Z}_2$ global symmetries of three-dimensional lattice Hamiltonians that permute quasi-particles and provide general arguments demonstrating the non-Abelian character of the resultant gauged theories. We then apply this general procedure to lattice models of several well known fracton phases: two copies of the X-Cube model, two copies of Haah's cubic code, and the checkerboard model. Where the former two models possess an on-site $\mathbb{Z}_2$ layer exchange symmetry, that of the latter is generated by the Hadamard gate. For each of these models, upon gauging, we find non-Abelian subdimensional excitations, including non-Abelian fractons, as well as non-Abelian looplike excitations and Abelian fully mobile pointlike excitations. By showing that the looplike excitations braid non-trivially with the subdimensional excitations, we thus discover a novel gapped quantum order in 3D, which we term a "panoptic" fracton order. This points to the existence of parent states in 3D from which both topological quantum field theories and fracton states may descend via quasi-particle condensation. The gauged cubic code model represents the first example of a gapped 3D phase supporting (inextricably) non-Abelian fractons that are created at the corners of fractal operators.

We investigate the optimal distance between the two components of an entangled coherent state for quantum phase estimation in lossy interferometry. The optimal distance is obtained at an economical point, representing the quantum Fisher information that can be extracted per unit input energy. Maximizing the quantum Fisher information over the input mean photon number, we show that the more loss there is in the interferometry, the less entanglement of the entangled coherent state we need to prepare initially. Equivalently, the optimal distance between the two-mode components decreases with increasing loss in the interferometry. Under the constraint of the input mean photon number, we find that the optimal entangled coherent state is more robust than a separable coherent state, even at a high photon-loss rate. The optimal entangled coherent state preserves a quantum advantage over the standard quantum limit of the separable coherent state. We also show that the corresponding optimal measurement is not a simple detection scheme; rather, correlated measurement bases are necessary.

Although highly successful, the truncated Wigner approximation (TWA) leaves out many-body quantum interference between mean-field Gross-Pitaevskii solutions as well as other quantum effects, and is therefore essentially classical. Turned around, this implies that if a system's quantum properties deviate from TWA, it must be exhibiting some quantum phenomenon, such as localization, diffraction, or tunneling. Here, we consider in detail a particular interference effect arising from discrete symmetries, which can lead to a significant enhancement of quantum observables with respect to the TWA prediction, and derive an augmented version of the TWA in order to incorporate it. Using the Bose-Hubbard model for illustration, we further show strong evidence for the presence of dynamical localization from the remaining differences between the TWA predictions and quantum results.

Electromagnetic fields possess zero point fluctuations (ZPF) which lead to observable effects such as the Lamb shift and the Casimir effect. In the traditional quantum optics domain, these corrections remain perturbative due to the smallness of the fine structure constant. To provide a direct observation of non-perturbative effects driven by ZPF in an open quantum system we wire a highly non-linear Josephson junction to a high impedance transmission line, allowing large phase fluctuations across the junction. Consequently, the resonance of the former acquires a relative frequency shift that is orders of magnitude larger than for natural atoms. Detailed modelling confirms that this renormalization is non-linear and quantum. Remarkably, the junction transfers its non-linearity to about 30 environmental modes, a striking back-action effect that transcends the standard Caldeira-Leggett paradigm. This work opens many exciting prospects for longstanding quests such as the tailoring of many-body Hamiltonians in the strongly non-linear regime, the observation of Bloch oscillations, or the development of high-impedance qubits.

An author (arXiv:1709.09262 [quant-ph] (2017); Nanoscale Research Letters (2017) 12:552) has recently questioned the security of two-way quantum key distribution schemes by referring to attack strategies that leave no errors in the (raw) key shared by the legitimate parties. We argue that the article is based on a flawed understanding of the actual workings of a two-way protocol, and hence that the conclusions drawn therefrom are erroneous.

The minimum parameterization of the wave function is derived for the time-independent many-body problem of identical fermions. It is shown that the exponential scaling with the number of particles plaguing all other correlation methods stems from the expansion of the wave function in one-particle basis sets. It is demonstrated that, using a geminal basis that fulfills a Lie algebra, the parameterization of the exact wave function becomes independent of the number of particles and scales only quadratically with the number of basis functions in the optimized basis. The resulting antisymmetrized geminal power wave function is shown to fulfill the necessary and sufficient conditions for the exact wave function, treat all electrons and electron pairs equally, be invariant to all orbital rotations and to virtual-virtual and occupied-occupied geminal rotations, be the most compact representation of the exact wave function possible, and contain exactly the same amount of information as the two-particle reduced density matrix. These findings may have significant consequences for quantum computing with identical fermions, since the amount of information stored in a state is very small. A discussion of how the most compact wave function can be derived in general is also presented. Due to the breaking of the scaling wall for the exact wave function, it is expected that even systems of biological relevance can be treated exactly in the near future.

Over 50 years ago, Lov\'{a}sz proved that two graphs are isomorphic if and only if they admit the same number of homomorphisms from any graph [Acta Math. Hungar. 18 (1967), pp. 321--328]. In this work we prove that two graphs are quantum isomorphic (in the commuting operator framework) if and only if they admit the same number of homomorphisms from any planar graph. As there exist pairs of non-isomorphic graphs that are quantum isomorphic, this implies that homomorphism counts from planar graphs do not determine a graph up to isomorphism. Another immediate consequence is that determining whether there exists some planar graph that has a different number of homomorphisms to two given graphs is an undecidable problem, since quantum isomorphism is known to be undecidable. Our characterization of quantum isomorphism is proven via a combinatorial characterization of the intertwiner spaces of the quantum automorphism group of a graph based on counting homomorphisms from planar graphs. This result inspires the definition of "graph categories" which are analogous to, and a generalization of, partition categories that are the basis of the definition of easy quantum groups. Thus we introduce a new class of "graph-theoretic quantum groups" whose intertwiner spaces are spanned by maps associated to (bi-labeled) graphs. Finally, we use our result on quantum isomorphism to prove an interesting reformulation of the Four Color Theorem: that any planar graph is 4-colorable if and only if it has a homomorphism to a specific Cayley graph on the symmetric group $S_4$ which contains a complete subgraph on four vertices but is not 4-colorable.
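Lovász's classical theorem referenced above can be illustrated with a brute-force homomorphism counter: a homomorphism from $F$ to $G$ is a vertex map sending every edge of $F$ to an edge of $G$. The sketch below is illustrative code (not from the paper), enumerating all vertex maps:

```python
from itertools import product

def count_homs(F_edges, F_n, G_edges, G_n):
    """Count homomorphisms from graph F to graph G by brute force.

    Graphs are given as undirected edge lists over vertices 0..n-1; a
    homomorphism is a vertex map sending every edge of F to an edge of G.
    """
    G_adj = set()
    for u, v in G_edges:
        G_adj.add((u, v))
        G_adj.add((v, u))
    count = 0
    for phi in product(range(G_n), repeat=F_n):
        if all((phi[u], phi[v]) in G_adj for u, v in F_edges):
            count += 1
    return count

K3 = [(0, 1), (1, 2), (0, 2)]  # triangle
# A single edge K2 maps into K3 in 6 ways: 3 edges, 2 orientations each
print(count_homs([(0, 1)], 2, K3, 3))  # 6
```

Lovász's theorem says that collecting such counts over all graphs $F$ determines $G$ up to isomorphism; the result above shows that restricting $F$ to planar graphs determines $G$ only up to quantum isomorphism.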

Bell inequalities are an important tool in device-independent quantum information processing because their violation can serve as a certificate of relevant quantum properties. Probably the best-known example of a Bell inequality is due to Clauser, Horne, Shimony and Holt (CHSH); it is defined in the simplest scenario involving two dichotomic measurements, and all of its key properties are well understood. There have been many attempts to generalise the CHSH Bell inequality to higher-dimensional quantum systems; however, for most of them the maximal quantum violation---the key quantity for most device-independent applications---remains unknown. On the other hand, the constructions for which the maximal quantum violation can be computed do not preserve the natural property of the CHSH inequality that the maximal quantum violation is achieved by the maximally entangled state and measurements corresponding to mutually unbiased bases. In this work we propose a novel family of Bell inequalities which exhibit precisely these properties and whose maximal quantum violation can be computed analytically. In the simplest scenario it recovers the CHSH Bell inequality. These inequalities involve $d$ measurement settings, each having $d$ outcomes, for an arbitrary prime number $d\geq 3$. We then show that in the three-outcome case our Bell inequality can be used to self-test the maximally entangled state of two qutrits and three mutually unbiased bases at each site. Yet, we demonstrate that in the case of more outcomes, the maximal violation does not allow for self-testing in the standard sense, which motivates the definition of a new, weaker form of self-testing. The ability to certify high-dimensional MUBs makes these inequalities attractive from the point of view of device-independent cryptography.
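For the CHSH case, the property discussed above is easy to check numerically: the maximal quantum violation $2\sqrt{2}$ is attained by the maximally entangled two-qubit state with measurements drawn from mutually unbiased directions. A minimal sketch, with illustrative variable names:

```python
import numpy as np

# Pauli observables and the maximally entangled two-qubit (Bell) state
Z = np.array([[1, 0], [0, -1]], dtype=float)
X = np.array([[0, 1], [1, 0]], dtype=float)
phi = np.array([1, 0, 0, 1], dtype=float) / np.sqrt(2)  # (|00> + |11>)/sqrt(2)

def corr(A, B):
    """Correlator <A (x) B> in the Bell state."""
    return phi @ np.kron(A, B) @ phi

# Alice measures mutually unbiased bases Z, X; Bob measures rotated combinations
A0, A1 = Z, X
B0, B1 = (Z + X) / np.sqrt(2), (Z - X) / np.sqrt(2)

S = corr(A0, B0) + corr(A0, B1) + corr(A1, B0) - corr(A1, B1)
print(round(S, 6))  # 2.828427, the Tsirelson bound 2*sqrt(2)
```

The classical bound is $|S|\le 2$; the computed value saturates the quantum maximum, which is the behaviour the proposed family generalises to prime $d\geq 3$.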

We introduce a new quantum optimization algorithm for dense Linear Programming problems, which can be seen as the quantization of the Interior Point Predictor-Corrector algorithm \cite{Predictor-Corrector} using a Quantum Linear System Algorithm \cite{DenseHHL}. The (worst case) work complexity of our method is, up to polylogarithmic factors, $O(L\sqrt{n}(n+m)\overline{||M||_F}\bar{\kappa}^2\epsilon^{-2})$ for $n$ the number of variables in the cost function, $m$ the number of constraints, $\epsilon^{-1}$ the target precision, $L$ the bit length of the input data, $\overline{||M||_F}$ an upper bound to the Frobenius norm of the linear systems of equations that appear, $||M||_F$, and $\bar{\kappa}$ an upper bound to the condition number $\kappa$ of those systems of equations. This represents a quantum speed-up in the number $n$ of variables in the cost function with respect to the comparable classical Interior Point algorithms when the initial matrix of the problem $A$ is dense: if we substitute the quantum part of the algorithm by classical algorithms such as Conjugate Gradient Descent, that would mean the whole algorithm has complexity $O(L\sqrt{n}(n+m)^2\bar{\kappa} \log(\epsilon^{-1}))$, or with exact methods, at least $O(L\sqrt{n}(n+m)^{2.373})$. Also, in contrast with any Quantum Linear System Algorithm, the algorithm described in this article outputs a classical description of the solution vector, and the value of the optimal solution.

The momentum spectrum and number density of created bosons in two types of arbitrarily polarized electric fields are calculated and compared with those of created fermions, employing the equal-time Feshbach-Villars-Heisenberg-Wigner formalism, which we confirm to be completely equivalent to the quantum Vlasov equation in scalar QED for a uniform, time-varying electric field. For an elliptically polarized field, it is found that, in the case of a circularly polarized multicycle field, the number density of created bosons is the square root of the product of the number densities of spin-up and spin-down electrons. Moreover, the degree of spin polarization roughly grows as the Keldysh adiabaticity parameter increases for arbitrarily polarized multicycle fields. For a field composed of two circularly polarized fields with a time delay, it is shown that momentum vortices also exist in boson pair creation and are induced only by the orbital angular momentum of the particles. However, the vortices can reproduce the quantum-statistics effect due to the spin of the particles. These results further deepen the understanding of some significant signatures in pair production.

Constrained quantum annealing (CQA) is a quantum annealing approach that is designed so that constraints are satisfied without penalty terms. There is an analogy between the model for the CQA of graph coloring and a set of disordered spin chains. In the model for the CQA of graph coloring, disorder corresponds to the fluctuation of effective local fields that increase in a CQA process. Numerical simulations of effective fields and entanglement demonstrate how localization appears in the CQA. Some notable features appear in the concurrence, which is a measure of entanglement, plotted as a function of the fluctuation of effective fields.

We demonstrate high-fidelity two-qubit Rydberg blockade and entanglement in a two-dimensional qubit array. The qubit array is defined by a grid of blue-detuned lines of light with 121 sites for trapping atomic qubits. Improved experimental methods have increased the observed Bell-state fidelity to $F_{\rm Bell}=0.86(2)$. Accounting for errors in state preparation and measurement (SPAM), we infer a fidelity of $F_{\rm Bell}^{\rm -SPAM}=0.88$. Accounting for errors in single-qubit operations, we infer that a Bell state created with the Rydberg-mediated $C_Z$ gate has a fidelity of $F_{\rm Bell}^{C_Z}=0.89$. Comparison with a detailed error model based on quantum process matrices indicates that finite atom temperature and laser noise are the dominant error sources contributing to the observed gate infidelity.
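The Bell-state fidelity quoted above is the overlap $F=\langle\Phi|\rho|\Phi\rangle$ between the prepared state $\rho$ and the ideal Bell state. As a minimal illustration, using a depolarizing-noise toy model rather than the paper's process-matrix error model:

```python
import numpy as np

# Ideal Bell state |Phi> = (|00> + |11>)/sqrt(2) and its density matrix
phi = np.array([1, 0, 0, 1], dtype=float) / np.sqrt(2)
rho_ideal = np.outer(phi, phi)

def depolarized(rho, p):
    """Two-qubit depolarizing channel: mix with the maximally mixed state."""
    return (1 - p) * rho + p * np.eye(4) / 4

def bell_fidelity(rho):
    """Overlap F = <Phi| rho |Phi> with the ideal Bell state."""
    return float(phi @ rho @ phi)

# For this toy channel F = (1 - p) + p/4, so p = 0.16 gives F = 0.88,
# numerically matching the SPAM-corrected value quoted above.
print(round(bell_fidelity(depolarized(rho_ideal, 0.16)), 2))  # 0.88
```

The actual error budget in the experiment is dominated by atom temperature and laser noise rather than uniform depolarization; the toy channel only illustrates how a fidelity number maps onto an error rate.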

We investigate the quantum violation of macrorealism for multilevel spin systems under coarsening of measurement times, i.e., when measurement times have experimental indeterminacy. This is studied together with the effect of coarsening of measurement outcomes, in which individual outcomes cannot be unambiguously discriminated. In our treatment, different measurement outcomes are grouped into two classes to model the coarsening of measurement outcomes and, importantly, varying degrees of coarsening of the measurement time intervals are also considered. This reveals that, while for a given dimension the magnitude of the quantum violation of macrorealism decreases with an increasing degree of coarsening of measurement times, this effect can be annulled by increasing the dimension of the spin system, so that in the limit of large spin the quantum violation of macrorealism persists. Thus, the result demonstrates that classicality for large spins does not emerge from quantum mechanics in spite of the coarsening of measurement times.