Quantum Physics (quant-ph) updates on the arXiv.org e-print archive

We present an in-depth study of the non-equilibrium statistics of the irreversible work produced during sudden quenches near the structural linear-zigzag transition of ion Coulomb crystals in 1+1 dimensions. By employing both an analytical approach based on a harmonic expansion and numerical simulations, we show that the average irreversible work diverges near the transition. We show that the non-analytic behaviour of the work fluctuations can be characterized in terms of the critical exponents of the quantum Ising chain. Thanks to technological advancements in trapped-ion experiments, our results can be readily verified.

We analyze correlations between pairs of particle detectors quadratically coupled to a real scalar field. We find that, while a single quadratically coupled detector presents no divergences, for pairs of detectors there emerge unanticipated persistent divergences (not regularizable via smooth switching or smearing) in the entanglement they acquire from the field. We characterize such divergences, discuss whether a suitable regularization can allow for a fair comparison of the entanglement-harvesting ability of the quadratic and linear couplings, and finally present a UV-safe quantifier of harvested correlations. Our results are relevant to future studies of the entanglement structure of the fermionic vacuum.

An exact reduced dynamical map, along with its operator-sum representation, is derived for a central spin interacting with a thermal spin environment. The dynamics of the central spin shows high sustainability of quantum traits such as coherence and entanglement in the low-temperature regime. However, for sufficiently high temperature, and when the number of bath particles approaches the thermodynamic limit, this feature vanishes and the dynamics closely mimics Markovian evolution. The properties of the long-time averaged state and the trapped information of the initial state for the central qubit are also investigated in detail, confirming that the non-ergodicity of the dynamics can be attributed to the finite temperature and finite size of the bath. It is shown that if a certain stringent resonance condition is satisfied, the long-time averaged state retains quantum coherence, which can have far-reaching technological implications for engineering quantum devices. An exact time-local master equation of the canonical form is derived. With the help of this master equation, the non-equilibrium properties of the central spin system are studied by investigating the detailed balance condition and the irreversible entropy production rate. The results reveal that the central qubit thermalizes only in the limit of very high temperature and a large number of bath spins.

We describe the hardware, gateware, and software developed at Raytheon BBN Technologies for dynamic quantum information processing experiments on superconducting qubits. In dynamic experiments, real-time qubit state information is fed back or fed forward within a fraction of the qubits' coherence time to dynamically change the implemented sequence. The hardware presented here covers both control and readout of superconducting qubits. For readout, we created a custom signal-processing gateware and software stack on commercial hardware to convert pulses in a heterodyne receiver into qubit state assignments with minimal latency, alongside data-taking capability. For control, we developed custom hardware with gateware and software for pulse sequencing and steering-information distribution that is capable of arbitrary control flow within a fraction of superconducting qubit coherence times. Both readout and control platforms make extensive use of FPGAs to enable tailored qubit control systems in a reconfigurable fabric suitable for iterative development.

Here we study the possibility of creating a bad-cavity laser on a forbidden transition in cold ions forming a large Coulomb crystal in a linear Paul trap. We consider micromotion-induced shifts and coupling strengths, and perform quantitative estimates of the attainable laser power for lasing on the ${^3D_2} \rightarrow {^1S_0}$ transition in ${\rm ^{176}Lu^+}$ ions in a spherically symmetric trap.

We study out-of-time-order correlators (OTOCs) of the form $\langle\hat A(t)\hat B(0)\hat C(t)\hat D(0)\rangle$ for a quantum system weakly coupled to a dissipative environment. Such an open system may serve as a model of, e.g., a small region in a disordered interacting medium coupled to the rest of this medium considered as an environment. We demonstrate that for a system with discrete energy levels the OTOC saturates exponentially, $\propto \sum_i a_i e^{-t/\tau_i}+{\rm const}$, to a constant value at $t\rightarrow\infty$, in contrast with quantum-chaotic systems, which exhibit exponential growth of OTOCs. Focusing on the case of a two-level system, we calculate microscopically the decay times $\tau_i$ and the value of the saturation constant. Because some OTOCs are immune to dephasing processes and some are not, such correlators may decay on two sets of parametrically different time scales, related to inelastic transitions between the system levels and to pure dephasing processes, respectively. In the case of a classical environment, the evolution of the OTOC can be mapped onto the evolution of the density matrix of two systems coupled to the same dissipative environment.
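The saturation behaviour described in this abstract can be illustrated with a toy calculation (ours, not the paper's): a two-level system with relaxation and pure dephasing, where operators evolve under the adjoint Lindbladian and the OTOC is evaluated in the steady state. The rates, the choice $A=C=\sigma_x$, $B=D=\sigma_z$, and this particular open-system OTOC prescription are all illustrative assumptions.

```python
import numpy as np
from scipy.linalg import expm

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
sm = np.array([[0, 1], [0, 0]], dtype=complex)   # lowering operator |0><1|

def adjoint_lindblad(H, Ls):
    """Matrix of the adjoint Lindbladian acting on column-stacked operators."""
    d = H.shape[0]
    Id = np.eye(d, dtype=complex)
    S = 1j * (np.kron(Id, H) - np.kron(H.T, Id))           # i[H, .]
    for L in Ls:
        LdL = L.conj().T @ L
        S += np.kron(L.T, L.conj().T)                      # L^dag . L
        S -= 0.5 * (np.kron(Id, LdL) + np.kron(LdL.T, Id))  # -1/2 {L^dag L, .}
    return S

# illustrative rates: relaxation gamma_1 = 1, pure dephasing gamma_phi = 0.5
S = adjoint_lindblad(0.5 * sz, [1.0 * sm, np.sqrt(0.5) * sz])
rho_ss = np.diag([1.0, 0.0]).astype(complex)               # steady state |0><0|

def otoc(t, A=sx, B=sz):
    """F(t) = Tr[rho_ss A(t) B A(t) B] with A(t) = exp(L^dag t)[A]."""
    At = (expm(S * t) @ A.reshape(-1, order='F')).reshape(2, 2, order='F')
    return np.trace(rho_ss @ At @ B @ At @ B).real

print(otoc(0.0))    # -1.0 at equal times
print(otoc(10.0))   # ~0: exponential saturation to the constant value
```

Here the saturation constant is zero because $\sigma_x$ has no overlap with the identity; the decay times combine the relaxation and dephasing rates, in line with the two parametrically different time scales discussed above.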

We introduce a biparametric Fisher-R\'enyi complexity measure for general probability distributions and discuss its properties. This notion, which is composed of two entropy-like components (the R\'enyi entropy and the biparametric Fisher information), generalizes the basic Fisher-Shannon measure and the previous complexity quantifiers of Fisher-R\'enyi type. We then illustrate the usefulness of this notion by carrying out an information-theoretic analysis of the spectral energy density of a $d$-dimensional blackbody at temperature $T$. It is shown that the biparametric Fisher-R\'enyi measure of this quantum system has a universal character in the sense that it depends neither on the temperature nor on any physical constant (e.g., the Planck constant, the speed of light, the Boltzmann constant), but only on the space dimensionality $d$. Moreover, it decreases with increasing $d$, but exhibits a nontrivial behavior for a fixed $d$ and a varying parameter, which points to a nonstandard structure of the $d$-dimensional blackbody density distribution.
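For orientation, the basic Fisher-Shannon measure that this notion generalizes can be checked numerically. The snippet below (our illustration, not the paper's computation) verifies a textbook fact: for a 1D Gaussian, the product of the entropy power $J=e^{2S}/(2\pi e)$ and the translation Fisher information $F$ equals 1, independent of the width.

```python
import numpy as np

def fisher_shannon(p, x):
    """Fisher-Shannon complexity C = J * F for a sampled 1D density p(x)."""
    dx = x[1] - x[0]
    S = -np.sum(p * np.log(p)) * dx                  # Shannon entropy
    J = np.exp(2 * S) / (2 * np.pi * np.e)           # entropy power
    F = np.sum(np.gradient(p, dx) ** 2 / p) * dx     # Fisher information
    return J * F

x = np.linspace(-12, 12, 20001)
for sigma in (0.5, 1.0, 3.0):
    p = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    print(sigma, fisher_shannon(p, x))               # ~1.0 for any Gaussian width
```

The Gaussian saturates the lower bound $C\geq 1$ of this measure, which is what makes such products useful as scale-invariant complexity quantifiers.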

Suppose a large scale quantum computer becomes available over the Internet. Could we delegate universal quantum computations to this server, using only classical communication between client and server, in a way that is information-theoretically blind (i.e., the server learns nothing about the input apart from its size, with no cryptographic assumptions required)? In this paper we give strong indications that the answer is no. This contrasts with the situation where quantum communication between client and server is allowed --- where we know that such information-theoretically blind quantum computation is possible. It also contrasts with the case where cryptographic assumptions are allowed: there again, it is now known that there are quantum analogues of fully homomorphic encryption. In more detail, we observe that, if there exist information-theoretically secure classical schemes for performing universal quantum computations on encrypted data, then we get unlikely containments between complexity classes, such as ${\sf BQP} \subset {\sf NP/poly}$. Moreover, we prove that having such schemes for delegating quantum sampling problems, such as Boson Sampling, would lead to a collapse of the polynomial hierarchy. We then consider encryption schemes which allow one round of quantum communication and polynomially many rounds of classical communication, yielding a generalization of blind quantum computation. We give a complexity theoretic upper bound, namely ${\sf QCMA/qpoly} \cap {\sf coQCMA/qpoly}$, on the types of functions that admit such a scheme. This upper bound then lets us show that, under plausible complexity assumptions, such a protocol is no more useful than classical schemes for delegating ${\sf NP}$-hard problems to the server. Lastly, we comment on the implications of these results for the prospect of verifying a quantum computation through classical interaction with the server.

Mutually unbiased bases (MUBs) and symmetric informationally complete (SIC) positive operator-valued measures (POVMs) are two related topics in quantum information theory. They are generalized to mutually unbiased measurements (MUMs) and general symmetric informationally complete (GSIC) measurements, respectively, neither of which is necessarily rank one. We study the quantum separability problem by using these measurements and present separability criteria for bipartite systems of arbitrary dimensions and for multipartite systems of multi-level subsystems. These criteria are proved to be more effective than previous ones, especially when the dimensions of the subsystems differ. Furthermore, full quantum state tomography is not needed when these criteria are implemented experimentally.
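As a concrete reminder of what mutual unbiasedness means (our illustration, for the qubit case only), the eigenbases of the three Pauli operators form the $d+1=3$ MUBs for $d=2$: every cross-basis overlap satisfies $|\langle e_i|f_j\rangle|^2 = 1/d$.

```python
import numpy as np

s = 1 / np.sqrt(2)
bases = [
    np.array([[1, 0], [0, 1]], dtype=complex),             # Z eigenbasis (columns)
    np.array([[s, s], [s, -s]], dtype=complex),            # X eigenbasis
    np.array([[s, s], [1j * s, -1j * s]], dtype=complex),  # Y eigenbasis
]

for a in range(3):
    for b in range(3):
        G = np.abs(bases[a].conj().T @ bases[b]) ** 2      # overlap matrix
        expected = np.eye(2) if a == b else np.full((2, 2), 0.5)
        assert np.allclose(G, expected)
print("all pairs mutually unbiased")
```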

We present an argument which purports to show that the use of the standard Legendre transform in non-additive Statistical Mechanics is not appropriate. For concreteness, we use as a paradigm the case of systems which are conjecturally described by the (non-additive) Tsallis entropy. We point out the form of the modified Legendre transform that should be used instead in the non-additive thermodynamics induced by the Tsallis entropy. We comment on more general implications of this proposal for the thermodynamics of "complex systems".

There is a long history of representing a quantum state by a quasi-probability distribution: a distribution allowing negative values. In this paper we extend such representations to deal with quantum channels. The result is a convex, strongly monoidal, functorial embedding of the category of trace-preserving completely positive maps into the category of quasi-stochastic matrices. This establishes quantum theory as a subcategory of quasi-stochastic processes. Such an embedding is induced by a choice of minimal informationally complete POVMs. We show that any two such embeddings are naturally isomorphic. The embedding preserves the dagger structure of the categories if and only if the POVMs are symmetric, giving a new use of SIC-POVMs. We also study general convex embeddings of quantum theory and prove a dichotomy: such an embedding is either trivial or faithful. The results of this paper allow a clear explanation of the characteristic features of quantum mechanics that come from being epistemically restricted (no-cloning, teleportation) and from having negative probabilities (Bell inequalities, computational speed-up).
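The flavour of such an embedding can be seen in a small sketch (ours, under the assumption that the minimal IC-POVM is the qubit tetrahedral SIC-POVM $E_i = \Pi_i/2$ with dual frame $F_j = 3\Pi_j - I$): a channel $\Phi$ becomes the matrix $S_{ij} = {\rm Tr}[E_i\,\Phi(F_j)]$, whose columns sum to one (trace preservation) but whose entries may be negative.

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2)

# tetrahedral Bloch vectors -> SIC projectors Pi_i, POVM E_i, dual frame F_j
vecs = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]) / np.sqrt(3)
Pi = [(I2 + r[0] * sx + r[1] * sy + r[2] * sz) / 2 for r in vecs]
E = [P / 2 for P in Pi]        # POVM elements, sum to the identity
F = [3 * P - I2 for P in Pi]   # dual frame: rho = sum_j Tr[E_j rho] F_j

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # example channel: Hadamard unitary
phi = lambda X: H @ X @ H.conj().T

S = np.array([[np.trace(E[i] @ phi(F[j])).real for j in range(4)]
              for i in range(4)])
print(S.sum(axis=0))           # each column sums to 1
print(S.min())                 # negative entries: genuinely quasi-stochastic

# consistency: S maps SIC outcome probabilities of rho to those of phi(rho)
rho = np.array([[0.7, 0.2 - 0.1j], [0.2 + 0.1j, 0.3]])
p = np.array([np.trace(E[i] @ rho).real for i in range(4)])
q = np.array([np.trace(E[i] @ phi(rho)).real for i in range(4)])
print(np.allclose(S @ p, q))
```

The negative entries of $S$ for this unitary channel are exactly the "negative probabilities" the abstract associates with genuinely quantum behaviour.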

We investigate the initial-boundary value problem for the integrable spin-1 Gross-Pitaevskii (GP) equations with a 4x4 Lax pair on the half-line. The solution of this system can be obtained in terms of the solution of a 4x4 matrix Riemann-Hilbert (RH) problem formulated in the complex k-plane. The relevant jump matrices of the RH problem can be explicitly found using the two spectral functions s(k) and S(k), defined by the initial data and by the Dirichlet-Neumann boundary data at x=0, respectively. A global relation between the two dependent spectral functions is established. The general mappings between the Dirichlet and Neumann boundary values are analyzed in terms of the global relation.

We investigate the initial-boundary value problem for the general three-component nonlinear Schr\"odinger (gtc-NLS) equation with a 4x4 Lax pair on a finite interval by extending the Fokas unified approach. The solutions of the gtc-NLS equation can be expressed in terms of the solutions of a 4x4 matrix Riemann-Hilbert (RH) problem formulated in the complex k-plane. Moreover, the relevant jump matrices of the RH problem can be explicitly found via the three spectral functions arising from the initial data and the Dirichlet-Neumann boundary data. The global relation is also established to deduce two distinct but equivalent types of representations (one using the large-k asymptotics of the eigenfunctions, and another in terms of the Gelfand-Levitan-Marchenko (GLM) method) for the Dirichlet and Neumann boundary value problems. Moreover, the relevant formulae for boundary value problems on the finite interval reduce to those on the half-line as the length of the interval approaches infinity. Finally, we also give the linearizable boundary conditions for the GLM representation.

Let $V=\bigotimes_{k=1}^{N} V_{k}$ be the Hilbert space of $N$ spin-$j$ particles, with $d=(2j+1)$-dimensional single-particle spaces $V_k$. We fix an orthonormal basis $\{|m_i\rangle\}$ for each $V_{k}$, with weight $m_i\in \{-j,\ldots,j\}$. Let $V_{(w)}$ be the subspace of $V$ of constant weight $w$, with an orthonormal basis $\{|m_1,\ldots,m_N\rangle\}$ subject to $\sum_k m_k=w$. We show that the combinatorial properties of the constant-weight condition impose strong constraints on the reduced density matrices of any vector $|\psi\rangle$ in the constant-weight subspace, which limit the possible entanglement structures of $|\psi\rangle$. Our results find applications in overlapping quantum marginal problems, quantum error-correcting codes, and spin-network structures in quantum gravity.
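One of the simplest such constraints can be checked directly (our toy example, with the assumed values $N=4$, $j=1$, $w=0$): for any state supported on $V_{(w)}$, the single-site reduced density matrix is diagonal in the weight basis, because two basis states agreeing on all other sites must also agree on the weight of the chosen site.

```python
import itertools
import numpy as np

N, j, w = 4, 1, 0                       # four spin-1 particles, total weight 0
ms = [-1, 0, 1]                         # single-particle weights m in {-j,...,j}
basis = [c for c in itertools.product(ms, repeat=N) if sum(c) == w]
print(len(basis))                       # dim V_(w) = 19 for N=4, j=1, w=0

rng = np.random.default_rng(0)
amp = rng.normal(size=len(basis)) + 1j * rng.normal(size=len(basis))
amp /= np.linalg.norm(amp)
psi = dict(zip(basis, amp))             # random normalized state in V_(w)

# single-site RDM on site 0: rho[m, m'] = sum over configurations of other sites
rho = np.zeros((3, 3), dtype=complex)
for c, a in psi.items():
    for cp, ap in psi.items():
        if c[1:] == cp[1:]:
            rho[ms.index(c[0]), ms.index(cp[0])] += a * np.conj(ap)
print(np.allclose(rho, np.diag(np.diag(rho))))   # off-diagonals vanish
```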

Planar photonic nanostructures have recently attracted a great deal of attention for quantum optics applications. In this article, we carry out full 3D numerical simulations to account for all radiation channels and thereby quantify the coupling efficiency of a quantum emitter embedded in a photonic-crystal waveguide. We utilize mixed boundary conditions, combining active Dirichlet boundary conditions for the guided mode with perfectly matched layers for the radiation modes. In this way, the leakage from the quantum emitter to the surrounding environment can be determined, and the spectral and spatial dependence of the coupling to the radiation modes can be quantified. The spatial maps of the coupling efficiency, the $\beta$-factor, reveal that, even for moderately slow light, a near-unity $\beta$-factor is achievable that is remarkably robust to the position of the emitter in the waveguide. Our results show that photonic-crystal waveguides constitute a suitable platform for deterministic interfacing of a single photon and a single quantum emitter, which has a range of applications in photonic quantum technology.

We consider the radiative properties of a system of two identical correlated atoms interacting with the electromagnetic field in its vacuum state in the presence of a generic dielectric environment. We suppose that the two emitters are prepared in a symmetric or antisymmetric superposition of one ground state and one excited state, and we evaluate the transition rate to the collective ground state, showing distinctive cooperative radiative features. Using a macroscopic quantum electrodynamics approach to describe the electromagnetic field, we first obtain an analytical expression for the decay rate of the two entangled two-level atoms in terms of the Green's tensor of the generic external environment. We then investigate the emission process when both atoms are in free space, and subsequently when a perfectly reflecting mirror is present, showing how the boundary affects the physical features of the superradiant and subradiant emission by the two coupled emitters. The possibility of controlling and tailoring radiative processes is also discussed.

We construct $d\times d$ dimensional bound entangled states which, for any $d>2$, violate a bipartite Bell inequality introduced in this paper. We conjecture that the proposed Bell inequalities act as dimension witnesses for bound entangled states: for any $d>2$, there exists a Bell inequality from this class which can be violated with bound entangled states only if their Hilbert space dimension is at least $d\times d$. Numerical evidence supports this conjecture up to $d=8$.

Tower-of-states analysis is a powerful tool for investigating phase transitions in condensed matter systems. Spontaneous symmetry breaking implies a specific structure of the energy eigenvalues and of their corresponding quantum numbers on finite systems. In these lecture notes we explain the group representation theory used to derive the spectral structure for several scenarios of symmetry breaking. We give numerous examples of computing the quantum numbers of the degenerate ground states, including translational and spin-rotational symmetry breaking in Heisenberg antiferromagnets. These results are then compared to actual numerical data from exact diagonalization.

We provide a new way to bound the security of quantum key distribution, using only the diagrammatic behavior of complementary observables and the essential uniqueness of purification for quantum channels. We begin by demonstrating a proof in the simplest case, where the eavesdropper does not noticeably disturb the channel at all and has no quantum memory. We then show how this case extends, with almost no effort, to account for quantum memory and noise.

Quantum computing is moving rapidly towards the point of technology deployment. Functional quantum devices will require the ability to correct errors in order to be scalable and effective. A leading choice of error correction, in particular for modular or distributed architectures, is the surface code with logical two-qubit operations realised via "lattice surgery". These operations consist of "merges" and "splits" acting non-unitarily on the logical states, and are not easily captured by standard circuit notation. This raises the question of how best to reason about lattice surgery in order to make efficient use of quantum states and operations in architectures with complex resource-management issues. In this paper we demonstrate that the operations of the ZX calculus, a form of quantum diagrammatic reasoning designed using category theory, match exactly the operations of lattice surgery. Red and green "spider" nodes match rough and smooth merges and splits, and follow the axioms of a dagger special associative Frobenius algebra. Some lattice surgery operations require non-trivial correction operations, which are captured natively in the use of the ZX calculus in the form of ensembles of diagrams. We give a first taste of the power of the calculus as a language for surgery by considering two operations (magic state use and producing a CNOT) and show how ZX diagram rewrite rules give lattice surgery procedures for these operations that are novel, efficient, and highly configurable.
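Two of the Frobenius-algebra axioms mentioned in this abstract can be verified with a few lines of linear algebra (our illustration, using the standard matrices of the green-spider copy and multiplication maps; the lattice-surgery reading of "split then merge = identity" is the "special" axiom).

```python
import numpy as np

delta = np.zeros((4, 2))   # copy map (split): |0> -> |00>, |1> -> |11>
delta[0, 0] = delta[3, 1] = 1
mu = delta.T               # multiplication (merge): |00> -> |0>, |11> -> |1>, else 0

print(np.allclose(mu @ delta, np.eye(2)))        # "special": merge after split = id

# associativity: mu (mu x id) = mu (id x mu)
lhs = mu @ np.kron(mu, np.eye(2))
rhs = mu @ np.kron(np.eye(2), mu)
print(np.allclose(lhs, rhs))
```

The red-spider generators satisfy the same axioms in the conjugate basis, which is the algebraic content of the rough/smooth distinction above.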