Inflation: a Python library for classical and quantum causal compatibility

We introduce Inflation, a Python library for assessing whether an observed probability distribution is compatible with a causal explanation. This is a central problem in both theoretical and applied sciences, which has recently witnessed significant advances from the area of quantum nonlocality, namely in the development of inflation techniques. Inflation is an extensible toolkit that is capable of solving pure causal compatibility problems and of optimizing over (relaxations of) sets of compatible correlations, in both the classical and quantum paradigms. The library is designed to be modular and ready to use, while retaining easy access to low-level objects for custom modifications.


Motivation
One of the main challenges in any scientific discipline is identifying the causes behind observed correlations. Is a vaccine effective against a disease? Does raising salaries encourage spending? Is the increase of carbon dioxide in the atmosphere responsible for the increase in the average temperature of the Earth? These questions and their like can all be phrased and analyzed using the tools of causal inference (CI) [1]. However, despite the wide relevance of causal inference, current CI algorithms involving latent variables are typically incapable of analyzing structures with more than a small number of nodes [2][3][4][5][6].
The field of quantum nonlocality [7] has recently turned its attention to causality, in light of the fact that Bell's theorem [8] can be understood in terms of the compatibility of probability distributions with a given causal structure [9,10]. This view has propelled the study of quantum correlations beyond the traditional bipartite scenario (see, e.g., [11][12][13][14][15], and the review [16]), and the development of techniques for characterizing the quantum and classical probability distributions that can be generated in such causal scenarios [17][18][19][20][21]. A particularly successful tool is the inflation method [22][23][24], which consists of a series of increasingly strict necessary conditions that can be tested via linear or semidefinite programming. Despite its broad applicability within and outside the field of quantum nonlocality, available implementations of the inflation technique are typically limited in the types of causal structures they apply to, or in the types of inflations considered (see, e.g., [25,26]). This means that researchers must write their own programs every time they seek to analyze a different structure or try a different variant of the technique, adding an extra level of difficulty to its application.
Here we present Inflation [27], an open-source library, written in Python, that implements the inflation framework for causal compatibility. It allows both for solving feasibility problems (i.e., answering the question "can I generate this distribution in this causal structure?") and for bounding optimal values of functionals over distributions compatible with a causal structure. The supported causal structures include all network scenarios considered in the field of quantum nonlocality [16], as well as structures that have recently gained attention in that field, such as the so-called instrumental scenario [15].
Currently, Inflation implements the quantum inflation hierarchy of Ref. [23]. This means that the main focus is the characterization of distributions generated by measuring quantum systems. However, by setting the corresponding flags, the library can also be used to assess compatibility in the case where all the latent nodes represent sources of classical shared randomness (thereby extending the ideas in [28,29]), and in the case where all the latent nodes represent sources of quantum entanglement while, additionally, all parties are correlated through a global source of classical shared randomness.
This paper presents the package and illustrates simple use cases, which are extended in the library documentation. It is structured as follows: in Sec. 2 we provide a brief description of the theoretical ideas behind inflation and point to the relevant literature. In Sec. 3 we show how to get started with the library and describe its main components and features. In Secs. 4 and 5 we demonstrate, with code snippets, the different types of problems that can be addressed with Inflation. We discuss further software details and library information in Sec. 6, including future development, contribution guidelines, and planned maintenance and support, and we provide some concluding remarks in Sec. 7.

The inflation framework
The aim of this section is to provide a brief description of the general ideas behind inflation. For the reader interested in the details of the different variants, we refer to the original publications [22,23,30].
Inflation is a general framework for analyzing correlations that can be generated in causal scenarios. These scenarios feature latent nodes, corresponding to physical systems (sources of shared classical randomness, quantum states, or states of more general physical theories), and visible nodes, which represent random variables that describe the outcomes of measurements performed on such systems. The nodes are connected by arrows that denote which systems are sent to which parties. In order to avoid causal paradoxes, like the outcome of a measurement determining which measurement is performed in the first place, the paths created by following the arrows must not contain closed loops. These graphs with directed arrows and no loops are known as directed acyclic graphs, or DAGs. Examples of such DAGs can be found in Figs. 1 and 2.
In order to characterize the distributions that can be generated in a particular causal scenario, inflation considers the gedankenexperiment in which one has access to multiple copies of the physical systems (recall, the latent nodes in the corresponding DAG) and operations (the visible nodes) used in it. By connecting these elements in such a way that the parents of a copy of a visible node are copies of the parents of that node in the original scenario, one constructs inflation scenarios in which the characterization of compatible distributions is simpler than in the original scenario. Furthermore, constraints satisfied by distributions compatible with the inflated scenarios can be translated into necessary constraints on distributions compatible with the original scenario. The distributions compatible with inflated scenarios satisfy two main properties. First, they are highly symmetric, due to the fact that the elements of the inflation scenarios are copies of the elements of the original scenario. Namely, distributions compatible with inflations are invariant under permutations of copies of the same latent node. Second, when marginalized over sets of nodes that reproduce parts of the original scenario, they coincide with the corresponding marginals of the distribution in the original scenario. These two properties can be encoded by means of linear equalities and inequalities. Therefore, probability distributions that can be generated in inflations can be characterized by means of linear programming in the case where the physical systems are classical [22] or described by a generalized physical theory [24], or via hierarchies of semidefinite programs in the case where the systems measured by the parties are quantum mechanical [23]. In all these cases, efficient algorithms exist that allow solving the corresponding problems with standard computing resources.
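The two properties above can be checked directly in a toy example that does not use the library: for a single classical source feeding two parties, and a two-copy inflation in which each pair of copied parties reads its own copy of the source, the inflated distribution is invariant under swapping copies and reproduces the original distribution as a marginal. (All variable names below are ours, for illustration.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Original scenario: one classical source L with 3 states feeding A and B.
q = rng.dirichlet(np.ones(3))            # source distribution q(l)
pA = rng.dirichlet(np.ones(2), size=3)   # response functions p(a|l)
pB = rng.dirichlet(np.ones(2), size=3)   # response functions p(b|l)
p = np.einsum('l,la,lb->ab', q, pA, pB)  # observed distribution p(a,b)

# Two-copy inflation: independent copies L1, L2; (A_i, B_i) read copy L_i.
# P[a1, b1, a2, b2] is the joint distribution over both copies.
P = np.einsum('l,m,la,mc,lb,md->abcd', q, q, pA, pA, pB, pB)

# Property 1: invariance under permuting the copies of the source.
symmetric = np.allclose(P, P.transpose((2, 3, 0, 1)))
# Property 2: each copy reproduces the original marginal p(a,b).
marginal_ok = np.allclose(P.sum(axis=(2, 3)), p)
```

Both checks succeed by construction here; inflation exploits the converse direction, imposing these (linear) conditions on an unknown inflated distribution to constrain the original one.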
Inflation is typically used for two families of problems: optimizing quantities over distributions compatible with some causal scenario, and determining whether a particular distribution admits a realization in a given scenario. For the former, inflation provides bounds on the actual optima (which can nevertheless be arranged in monotonic sequences; see, e.g., Refs. [23,30]). For the latter, finding an inflation where the implied symmetry and marginal constraints cannot all be satisfied is a proof that the original premise, namely that the distribution under scrutiny admits a realization in the causal structure, is false. Proving the opposite, namely that the distribution admits a realization in the original causal scenario, requires finding an explicit realization of the distribution (in terms of classical shared randomness, quantum systems, or elements of a generalized probabilistic theory, as appropriate). Exploring such constructions is a task outside the scope of inflation, and thus of this library.
Inflation currently implements the quantum inflation hierarchy of Ref. [23]. That is, it considers that all sources in the scenario distribute quantum systems of unbounded dimension. In this regime, whenever a correlation can be achieved using positive operator-valued measures (POVMs), the same correlation can also be realized with projection-valued measures (PVMs), since any POVM can be dilated to a PVM through Naimark's dilation theorem. As such, Inflation models all measurements as projective without loss of generality. This leads to a hierarchy that is monotonic (i.e., each level is at least as constraining as the previous one), but that has so far been proven to converge only in concrete scenarios [31]. There also exist alternative hierarchies that have been proven to converge [32], although in a non-monotonic way.
The library

Requirements and installation
Inflation is a Python library that can be installed on Mac, Windows, and Linux operating systems via pip by executing the instruction below at a command line.

pip install inflation
The core requirements of Inflation are NumPy [33] (used for general numerical procedures), SymPy [34] (used to make the input format more user-friendly), and SciPy [35] (used for handling sparse matrices). It can also use Numba [36] as a just-in-time compiler to speed up core calculations. For solving the generated linear and semidefinite programming problems, the library uses the MOSEK Fusion API [37]. It also allows for writing the problem in a human-readable form as a comma-separated values file, to a MATLAB-compatible .mat file, or to the SDPA data format for further manipulation in other interfaces such as YALMIP [38].
To test the installation, one can run the test suite included in the source repository. The source code for Inflation is hosted on GitHub at https://github.com/ecboghiu/inflation and is distributed under an open-source software license, GNU GPL version 3.0. More details about the software, packaging information, and guidelines for contributing to Inflation are included in Sec. 6.

Components
There are two main layers in Inflation. First, the basic characterization of the causal scenario and its desired inflation are stored in an InflationProblem. This includes the DAG describing the causal structure, the number of inputs and outputs of each of its visible nodes, and the number of copies of each latent node in the desired inflation. If the causal structure furthermore contains visible-to-visible connections, then the procedures in, e.g., [30, Sec. V], [23, Sec. V] and [32, Sec. IV.C] are executed in order to find a network and suitable constraints that generate the same distributions as the original causal structure.
The second layer takes the characterization provided by an InflationProblem object and sets up and solves the compatibility or optimization problem of interest. Currently, the library only supports the quantum inflation hierarchy described in Ref. [23] via the InflationSDP object. However, the implementation also supports the analysis of distributions generated from classical latent nodes by imposing suitable commutation constraints (see Ref. [28] and [23, Sec. VI]). Sec. 5.1 showcases how this can be achieved in Inflation.
In the quantum inflation hierarchy, the characterization of the set of probability distributions is given by a list of operators, in the spirit of the Navascués-Pironio-Acín (NPA) hierarchy [39][40][41]. This is input to the InflationSDP object via the function generate_relaxation(), which admits either a generic list-of-lists notation for arbitrary lists of operators, or string-based notations for operator sets that are routinely used in the literature: npa# for the sets describing the NPA hierarchy (denoted as T_n in Ref. [39] and as S_n in Ref. [40]), local# for the so-called local levels (denoted as L_n in [23, App. C]; see also Ref. [42]), and physical# for the sets of operators of bounded length whose expectation value is non-negative for any quantum state.
The function generate_relaxation() automatically imposes the equality constraints that are derived from the invariance of the inflation under permutations of copies of the same original element (that is, it imposes the constraints described in [23, Eq. (10)]). Furthermore, it identifies which marginals of distributions in the inflation must coincide with marginals of distributions in the original scenario. After this, the user can specify either a probability distribution over the visible nodes for a feasibility problem (i.e., to determine whether the distribution can be identified as incompatible with the causal structure), using the function set_values(), or a combination of operators whose expectation value will be optimized, using the function set_objective(). When using set_values(), the user can also choose to set the so-called linearized polynomial constraints, which constrain the set of compatible distributions further at the expense of obtaining certificates with more limited applicability [26,43].

Reductions of the feasible region
In general, the fact that one must consider multiple copies of the elements in the original causal structure leads to a large computational load when storing and solving the relevant problems. Inflation implements a number of additional constraints not included in the original definitions of the hierarchies, either automatically or at the user's choice, that give a tighter relaxation for the same level of computational resources. These constraints are:

Non-negativity of physical moments. This is a feature only relevant for the implementations of inflation that characterize distributions generated by measuring quantum systems. Implementations of quantum inflation require the use of the NPA hierarchy [39] in order to assess whether a compatible inflation exists. The main objects in the NPA hierarchy are the so-called moment matrices, whose rows and columns are indexed by products of (a priori unknown) projection operators. Each cell of the moment matrix contains the expectation value of the product of the operators in the row and the operators in the column under an also unknown quantum state. Despite all elements being unknown a priori, one can derive constraints for some of them in certain situations. For instance, it is known that idempotent operators have a non-negative expectation value under any possible quantum state, and thus these non-negativity constraints are always imposed in Inflation. In fact, using operator products which have a non-negative expectation value under any quantum state as the generating set for InflationSDP leads to drastic reductions in problem size for certain problems, as we explicitly show in Sec. 4.1.
Sandwich-nonnegativity. The operator products described above also have a non-negative expectation value when they appear sandwiched between another product of operators and its conjugate. Indeed, if O_2 is a product of operators that has a non-negative expectation value on any state, then ⟨ψ|O_1† O_2 O_1|ψ⟩ ≥ 0 for any |ψ⟩ and any product of operators O_1, since this quantity is simply the expectation value of O_2 on the (unnormalized) state O_1|ψ⟩. An example of such products are those corresponding to subsequent measurements on the same state, whose expectation values represent the probabilities of sequences of outcomes. This feature is also automatically imposed when generating an instance of InflationSDP.

Linearized polynomial constraints. Both in the inflation methods for compatibility with quantum models (based on semidefinite programming) and in those for classical and generalized physical models (based on linear programming), it is possible to transform certain non-linear relations between the unknowns of the problem into linear ones when assessing the compatibility of a given distribution with the scenario. When a subgraph of the inflation contains more than one connected component, and at least one of these components can be assigned a numerical value from the distribution under scrutiny (these are known as injectable components in the terminology of Ref. [22]), the variables associated to the subgraph can be related to those corresponding to its non-injectable connected components via linear relations, reducing the feasible region. Linearized polynomial constraints are used, for instance, in Ref. [26] in the case of inflation for compatibility with classical models. For compatibility with quantum models, [43, App. D.2] contains a demonstration of their advantage in several scenarios. It should be noted that, when using linearized polynomial constraints, in case of infeasibility the dual of the semidefinite (or linear) program only serves as a certificate of infeasibility for the tested distribution, given that for other distributions the feasible-region reduction under linearized polynomial constraints can be different [26]. Inflation allows imposing linearized polynomial constraints by setting the flag use_lpi_constraints=True in set_values().

Main functionality
We showcase here the user experience of using Inflation to solve a series of problems that routinely appear in causal inference. All these examples can be downloaded as ready-to-run Jupyter notebooks from the repository of the library (see Sec. 6.3).

Feasibility problems and extraction of certificates
The first example we consider is demonstrating that a particular distribution cannot be generated in a specific network arrangement when the sources (recall, the latent nodes in the scenario characterized by the DAG) distribute quantum systems, and the parties (the visible nodes) perform quantum measurements on the systems received.
We illustrate this type of problem by showing that the so-called W distribution, defined as

P_W(a, b, c) = 1/3 if a + b + c = 1, and 0 otherwise,     (1)

where a, b, c ∈ {0, 1}, is incompatible with the quantum triangle causal scenario of Fig. 1(b). The first step is to specify the original scenario and its desired inflation by creating an instance of InflationProblem. This requires specifying (i) the DAG of the original scenario as a dictionary where the keys are parent nodes and the values are lists of the corresponding children (for the triangle scenario, e.g., {"rho_AB": ["A", "B"], "rho_BC": ["B", "C"], "rho_AC": ["A", "C"]}), (ii) the number of possible outcomes and the number of possible measurement settings of each of the visible nodes in the scenario, and (iii) the inflation levels, which represent the number of copies of each latent node that will be considered in the inflated scenario. The order of the parties is specified via the order optional argument, and the order of the sources is taken to be the same as the insertion order in the dag dictionary. The generated instance of InflationProblem is fed to an InflationSDP instance. This object controls all the features related to the semidefinite relaxation of the problem considered. For instance, one can easily specify the relaxation obtained when using as generating monomials all products of operators of length at most 2 (that is, what is known as the second level in the NPA hierarchy [39,40]). The last step is setting the constraints corresponding to observing the target probability distribution in marginals of the inflation distribution that can be identified with marginals of the original scenario. If one wants to do this for all marginals, this can be achieved with the function set_distribution(), but if more granularity is needed one must use the function set_values() instead (see Sec. 5.3 for an explicit example using this function). The distribution must be input as a multidimensional NumPy array P[out_1, ..., out_n, in_1, ..., in_n], where each cell contains the corresponding probability, and where the i-th output and input, out_i and in_i, correspond to the party in the i-th position of the order keyword argument.
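Following this input convention, the W distribution of Eq. (1) can be written as a NumPy array (the variable name P_W is our choice):

```python
import numpy as np

# W distribution of Eq. (1): uniform over the outcomes with exactly one 1.
# Shape (2, 2, 2, 1, 1, 1): three binary outcomes and a single (trivial)
# measurement setting per party, following the P[out_1,...,in_n] convention.
P_W = np.zeros((2, 2, 2, 1, 1, 1))
for a, b, c in [(0, 0, 1), (0, 1, 0), (1, 0, 0)]:
    P_W[a, b, c, 0, 0, 0] = 1 / 3
```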

sdp.set_distribution(P_W)
Then, running sdp.solve() executes the semidefinite program and stores its status in sdp.status. For the problem at hand, this is infeasible, meaning that the W distribution of Eq. (1) does not admit a second-order (as per our specified inflation levels) quantum inflation of the triangle scenario, and thus, by the inflation arguments (see Sec. 2 and Ref. [23]), it cannot be generated in the triangle causal scenario when the latent nodes represent sources of bipartite quantum states.
Once the problem is determined to be infeasible, semidefinite programming provides a certificate of such infeasibility. These certificates of infeasibility, which witness incompatibility with a given inflation, can be transformed into polynomial Bell-like inequalities that witness distributions incompatible with the original causal scenario (see, e.g., [23, Sec. VII.B]). Inflation automatically computes these certificates and provides them in a variety of useful forms. For instance, running sdp.certificate_as_probs(clean=True) produces as output a SymPy object of the form −0.476 p_A(0|0)² + 0.059 p_A(0|0) p_AB(00|00) + 0.476 p_A(0|0) p_ABC(000|000) + ⋯ + 0.563, which signals distributions that produce a negative value as incompatible with the triangle scenario. This object can be further manipulated easily with SymPy's built-in functions.

Feasibility as optimization.
In terms of numerical stability, it is often advisable to frame feasibility problems as optimization problems. This is especially relevant when the distribution whose compatibility is being tested is close to the boundary of the feasible set, where analytically feasible problems can be erroneously reported as infeasible.
Inflation allows for this framing by maximizing the smallest eigenvalue of the problem's moment matrix, which is achieved by passing a single argument at solving time. The optimal (largest) value of the smallest eigenvalue of the moment matrix is stored in sdp.objective_value, and by inspecting it one can determine whether the original problem is feasible (if sdp.objective_value ≥ 0) or not (if sdp.objective_value < 0). Furthermore, the quantity in sdp.objective_value provides a rudimentary notion of distance to the feasible set, and helps in estimating how the size of the feasible set changes when adding extra constraints to the problem or when changing the generating set. Importantly, the extraction of certificates is unaffected, and can still be performed even when treating feasibility problems as optimization problems.
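The diagnostic behind this reframing can be sketched independently of the library: a moment matrix is admissible only if it is positive semidefinite, i.e., only if its smallest eigenvalue is non-negative. The matrices below are made up purely for illustration:

```python
import numpy as np

def min_eigenvalue(moment_matrix):
    """Smallest eigenvalue of a Hermitian matrix; non-negative iff PSD."""
    return float(np.linalg.eigvalsh(moment_matrix)[0])

# A valid moment matrix: a Gram matrix V V^T is always positive semidefinite
# (here even positive definite, since V has full rank).
V = np.array([[1.0, 0.0, 0.0],
              [0.2, 1.0, 0.0],
              [0.5, 0.5, 1.0]])
feasible = V @ V.T

# A candidate whose correlations are too strong to come from any state:
# the negative smallest eigenvalue plays the role of sdp.objective_value < 0.
infeasible = np.array([[1.0, 0.99, -0.99],
                       [0.99, 1.0, 0.99],
                       [-0.99, 0.99, 1.0]])
```

In the library, the solver maximizes this smallest eigenvalue over all admissible completions of the moment matrix, so a negative optimum certifies infeasibility.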

Physical moments as generating set.
It is known that, when the distribution in Eq. (1) is subject to white noise, quantum inflation of order 2 can certify its incompatibility with the triangle with quantum latent nodes at least down to visibility ν = 0.8038. This result was obtained in Ref. [23] by using a moment matrix of size 1175×1175. By indexing the rows and columns of the moment matrix only with the monomials that have non-negative expectation values under any quantum state, as mentioned in Sec. 3, one can recover the same ν_crit with a much smaller moment matrix, of size 287×287. Due to its notable gains in memory (and consequent gains in speed), using the non-negative monomials as the generating set is made very easy in Inflation: one just needs to pass the corresponding physical# set described in Sec. 3 to generate_relaxation(). This improvement is most notable when dealing with problems that involve distributions without settings. In our experience, in the case where the parties in the problem have a choice of different measurements to perform on the states received from the sources, the gains are more moderate.

Optimization of Bell operators
The second large class of problems that can be solved with Inflation is obtaining bounds on expectation values of Bell operators. For example, let us consider Mermin's operator [44],

Mermin = ⟨A_1 B_0 C_0⟩ + ⟨A_0 B_1 C_0⟩ + ⟨A_0 B_0 C_1⟩ − ⟨A_1 B_1 C_1⟩,

where ⟨A_x B_y C_z⟩ = Σ_{a,b,c∈{0,1}} (−1)^{a+b+c} p(a, b, c|x, y, z). Ref. [23] bounds this quantity in the quantum triangle scenario by using its second-order inflation and a generating set composed of the union of the second level of the associated NPA hierarchy and the first local level, obtaining that its maximum value cannot be larger than 3.085.
In order to reproduce these results in Inflation, one needs to pass the corresponding generating set to generate_relaxation() and set the objective function. The former can be easily done, since generate_relaxation() also admits an explicit list of operators as its argument. Note that, as mentioned in [23, Sec. VII.C.2], one can also optimize polynomial expressions of distributions compatible with the original scenario, as long as they can be written as linear combinations of products of the operators in the inflation. For example, one can optimize the mean squared distance to a target distribution in a second-order inflation by using operators corresponding to non-overlapping inflation copies. In Inflation, these operators are stored in the second dimension of sdp.measurements.
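For intuition on the quantity being bounded, the following library-independent sketch evaluates the Mermin combination on the distribution obtained by measuring a GHZ state, in the standard tripartite Bell scenario where the algebraic value 4 is attainable (in contrast with the 3.085 bound above for the triangle). The assignment of observables to settings is our choice, and relabeling outcomes only flips the sign of the combination:

```python
import numpy as np

# GHZ state (|000> + |111>)/sqrt(2) and the observables sigma_x, sigma_y.
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
ghz = np.zeros(8, dtype=complex)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

obs = {0: Y, 1: X}  # our choice: setting 0 -> sigma_y, setting 1 -> sigma_x

def proj(O, a):
    """Projector onto the eigenvalue (-1)**a of the observable O."""
    return (I2 + (-1) ** a * O) / 2

def E(x, y, z):
    """Correlator <A_x B_y C_z> from the Born-rule probabilities."""
    val = 0.0
    for a in (0, 1):
        for b in (0, 1):
            for c in (0, 1):
                P = np.kron(np.kron(proj(obs[x], a), proj(obs[y], b)),
                            proj(obs[z], c))
                val += (-1) ** (a + b + c) * np.real(ghz.conj() @ P @ ghz)
    return val

mermin = E(1, 0, 0) + E(0, 1, 0) + E(0, 0, 1) - E(1, 1, 1)
```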

Bounds on critical parameter values
A third large class of problems of interest in quantum information theory that can be solved with Inflation is the calculation of critical values of a parameter characterizing a family of distributions under scrutiny. Examples of this are the estimation of the maximum amount of noise [17], the maximum angle between the measurements of a party [25], or the maximum probability of detection failure [45] beyond which nonlocality cannot be certified. This type of problem can be handled within Inflation by using the function max_within_feasible. This function takes as input an instance of an InflationSDP that characterizes the set of feasible distributions, and a mapping (in the form of a Python dictionary) from cells in the corresponding moment matrix to arbitrary symbolic expressions depending on the variable to be optimized.
Currently, Inflation features two methods for obtaining critical parameter values, which are specified by setting the corresponding flag in max_within_feasible. The first one, method="bisection", is a bisection algorithm, which takes increasingly small steps in the direction of the critical value of the parameter, taking n = log₂ Δ − log₂ ε iterations to reach a solution within accuracy ε (where Δ is the width of the interval that the variable is constrained to lie in). The second one, method="dual", exploits the certificates of infeasibility in order to reduce the number of iterations required. The certificates of infeasibility are surfaces that always leave the feasible region in the half-space where they take positive values. This second method, instead of modifying the parameter by a fixed amount, chooses as the next candidate the value that lies on the boundary of the certificate, typically leading to fewer evaluations of semidefinite programs. Furthermore, it is possible to use the functions set_values() and set_distribution() as shortcuts to obtain complete dictionaries of assignments, which are stored in InflationSDP.known_moments.
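The bisection strategy can be sketched independently of the library. Here the SDP feasibility oracle is replaced by a hypothetical threshold test (the critical visibility 0.8038 from Sec. 4.1 is used only as a stand-in), and the number of oracle calls matches the iteration count n = log₂ Δ − log₂ ε quoted above:

```python
def max_within_feasible_bisect(is_feasible, lo, hi, eps=1e-4):
    """Largest value in [lo, hi] at which is_feasible holds, assuming the
    feasible region is an interval [lo, v_crit] (monotone feasibility)."""
    while hi - lo > eps:
        mid = (lo + hi) / 2
        if is_feasible(mid):
            lo = mid  # still compatible: the critical value is above mid
        else:
            hi = mid  # infeasible: the critical value is below mid
    return lo

# Stand-in oracle; in practice each call would solve an SDP with Inflation.
crit = max_within_feasible_bisect(lambda v: v <= 0.8038, 0.0, 1.0)
```

The method="dual" variant differs only in how the next candidate is chosen: instead of the midpoint, it uses the point where the current infeasibility certificate crosses zero.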

Further features
In this section we collect additional functionality of Inflation for specific kinds of problems and situations. All the functionality described in this section can be combined seamlessly, both among themselves and with the features described earlier.

Characterization of classical causal models
In quantum mechanics, operators corresponding to different parties commute, but this is not necessarily the case for operators corresponding to different measurements performed by the same party. If the different measurement operators of a given party all commute, one can define a basis in which all operators are diagonal, hence obtaining results that can be recovered by measuring classical systems. This means that, in analogy with work done in the multipartite Bell scenario [28], imposing that all operators in a quantum inflation problem commute gives a relaxation of the set of correlations compatible with the given causal scenario when the latent nodes represent classical physical systems instead. For a fixed inflation, the resulting NPA-like hierarchy converges to the linear program posed by the inflation technique for causal compatibility with classical latent nodes [22], enabling the study of such problems with lower memory requirements.
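The algebraic fact underlying this restriction can be checked directly in a library-independent sketch: two commuting Hermitian observables admit a common eigenbasis, in which both are diagonal and hence behave like classical random variables:

```python
import numpy as np

# Two commuting Hermitian observables: X (x) X and Z (x) Z.
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
A = np.kron(X, X)
B = np.kron(Z, Z)

# A generic combination of commuting observables has a non-degenerate
# spectrum here, so its eigenbasis diagonalizes A and B simultaneously.
_, U = np.linalg.eigh(A + np.pi * B)
resid = max(
    float(np.abs(U.T @ O @ U - np.diag(np.diag(U.T @ O @ U))).max())
    for O in (A, B)
)
# resid is the largest off-diagonal entry of either observable in the
# common eigenbasis; it vanishes up to numerical precision.
```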
In order to restrict the characterizations generated by Inflation to classical distributions, one only needs to set the commuting flag to True when creating an instance of the InflationSDP object. The rest of the functions, such as those for setting distributions and objective functions, or for extracting certificates, remain exactly the same.
Given that, within quantum inflation, it is possible to consider compatibility with both quantum and classical causal scenarios, a natural question is whether it is possible to assess compatibility with hybrid scenarios where some of the sources are classical and the rest are quantum. It is indeed possible to consider such problems within the formalism of quantum inflation by enforcing commutation over certain subsets of operators. Analyzing hybrid scenarios is a feature not currently implemented in Inflation, but it is expected to be supported in the future. We refer the reader to Sec. 6.2 for further details.

Standard NPA hierarchy
Since multipartite Bell scenarios (namely, those where all parties receive states distributed by a single common source) are particular instances of causal scenarios, Inflation can also be used for analyzing standard multipartite quantum correlations via the NPA hierarchy [39,40]. In order to do so, one can call InflationProblem without specifying its dag argument. For instance, one can optimize the CHSH inequality in the bipartite Bell scenario in just a few lines of code.

Scenarios with partial information
Inflation can also handle scenarios where not all the information about a particular distribution in the original scenario is known. An important example is the analysis of cryptographic scenarios, where the honest parties may know their joint distribution but cannot know their joint distribution with a potential adversary. One simple such scenario is considered in Ref. [23, Sec. VIII]. Specifying particular elements of a distribution in an InflationSDP object is achieved via the function set_values(), which admits as input a dictionary where the keys are the variables to be assigned numerical quantities, and the corresponding values are the quantities themselves. The problem in Ref. [23, Sec. VIII] can be addressed in Inflation in exactly this manner.

Scenarios beyond networks
So far, all the examples described have involved causal scenarios known as networks. These are bipartite DAGs with a layer of visible variables denoting the parties' outcomes, and a layer of both latent and visible variables that denote, respectively, the sources of physical systems and the measurements performed by the parties on them. Importantly, there are no connections between the nodes within each of the layers. Not all DAGs fall in this category, and some non-network DAGs have recently received considerable attention in the literature. The most important example is the so-called instrumental scenario [1], which has been extensively studied in the quantum information literature [15,[46][47][48] (see Fig. 2(a)). Inflation is capable of handling causal inference problems in these scenarios by internally considering equivalent network-type scenarios, such as that depicted in Fig. 2(b) (see [23,30] for the details of the equivalence). The user experience in these problems is no different from that of causal inference over network-type DAGs. As an illustration, the bounds of Bonet's inequality in [15, Eq. (23)] can be recovered with a snippet analogous to those shown earlier.

Feasibility based on distribution supports
The violation of Bell-type inequalities is the main method for the certification of nonclassical behavior. However, there exist even simpler certifications of non-classicality that rely instead on possibilistic arguments: rather than setting bounds on combinations of the elements of compatible probability distributions, they only use whether certain events are possible (positive probability) or impossible (zero probability) in order to reach a contradiction. Such certificates are known as Hardy-type paradoxes [49].
Inflation can handle proofs of non-classicality and non-quantumness in arbitrary DAGs based on possibilistic arguments. It does so by assessing whether a quantum inflation (with commuting or non-commuting operators, respectively) exists where the probability elements inside the support are constrained to lie in the interval [1, ∞) while those outside the support are given the value 0. This represents a re-scaled version of a standard quantum inflation moment matrix, Γ* = Γ/λ (with λ the smallest in-support probability), that does not suffer from floating-point instabilities when determining whether a probability element is outside the support or has merely been assigned a very small value.
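The rescaling idea can be illustrated in plain Python (a sketch of the idea only, not the library's internal code): dividing by the smallest in-support probability pushes every in-support entry to at least 1, far away from the exact zeros that encode impossible events.

```python
def rescale_support(probabilities, tol=1e-12):
    """Rescale a list of probabilities so that every in-support entry is >= 1
    while out-of-support entries remain exactly 0, mirroring the [1, oo)
    constraint used in supports problems."""
    smallest = min(p for p in probabilities if p > tol)
    return [p / smallest if p > tol else 0.0 for p in probabilities]

scaled = rescale_support([0.5, 0.25, 0.0, 0.25])
# Every nonzero entry is now >= 1, so deciding support membership no longer
# requires distinguishing a tiny probability from a numerical zero.
```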
In order to deal with possibilistic feasibility problems in Inflation, one sets the argument supports_problem=True when instantiating InflationSDP. As an example, in order to recover the result that the distribution from [15, Eq. (19)] is incompatible with the scenario of Fig. 2(a), one would run the following code.

sdp = InflationSDP(inst, supports_problem=True)
sdp.generate_relaxation("local1")
sdp.set_distribution(P_Eq19)
sdp.solve()  # "infeasible"

Note that, in contrast with other functionalities discussed in this section, optimization of objective functions is not possible when assessing the feasibility of distribution supports.

Additional library information
Here we provide further information concerning the development of the Inflation library.

Computational considerations
Inflation has been developed with speed and efficiency in mind. It uses just-in-time compilation through Numba [36] in order to speed up core calculations, and dictionary caching to avoid needless function calls. As a result, all examples in the documentation can be executed on a standard laptop with 8 GB of RAM. Moment matrices of around 200 columns and 1500 free scalar variables can be generated in 5 seconds and the SDP solved in 3 seconds; those of around 2000 columns and around 2500 variables can be generated in 8 minutes and the SDP solved in 10 minutes; and those of around 20000 columns can be generated in 17 hours. In this last case, however, the SDP is too large to be solvable on a traditional desktop computer. In order to solve these larger problems, a promising avenue to pursue is using symmetries to block-diagonalize the moment matrix. In the documentation we provide an example using the MATLAB software RepLAB [50].
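The dictionary-caching strategy mentioned above can be emulated with Python's standard functools.lru_cache; the function below is a hypothetical stand-in for an expensive canonicalization step, not the library's actual code:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def canonical_form(monomial):
    """Hypothetical expensive step: bring a monomial (a tuple of operator
    labels) to a canonical sorted form. Repeated calls with the same
    argument are served from the cache instead of being recomputed."""
    return tuple(sorted(monomial))

canonical_form((3, 1, 2))   # computed on the first call
canonical_form((3, 1, 2))   # served from the cache on the second
```

The same pattern, applied to symmetry canonicalization and monomial bookkeeping, is what keeps repeated moment-matrix lookups cheap.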

Future extensions
Inflation is a general technique that comprises a collection of routines specific to different types of physical systems. Moreover, since it is a young technique, it is not unreasonable to expect that refinements and alternative hierarchies will be developed in the future. For these reasons, Inflation is built with modularity in mind, so that new functionalities are easy to implement.
The main feature of the current implementation of Inflation is the characterization of sets of quantum correlations in networks and certain non-network causal structures. Moreover, it is also capable of handling the characterization of sets of classical correlations, and it contains the necessary equipment to handle simple non-network scenarios. Subsequent releases will focus on consolidating these capabilities, as well as on adding functionalities that improve the user experience and increase the range of problems and scenarios the library can handle. Planned additions to the library include:

Arbitrary causal scenarios. One of the most exciting features of inflation methods is their ability to handle scenarios with latent-to-latent, visible-to-visible and visible-to-latent connections. By the application of unpacking and exogenization algorithms (see Refs. [23,30] for their descriptions), probability distributions compatible with any causal structure encoded in a DAG can be transformed into, and analyzed as, distributions compatible with an equivalent network-form DAG that satisfy additional equality constraints. While the library already allows handling scenarios with visible-to-visible connections, in future versions we plan to add support for scenarios with visible-to-latent connections. The automatic handling of this type of scenario will mostly be integrated in the InflationProblem object, although it is expected that, due to the need to handle classical and quantum latent nodes differently (see [23, Fig. 8]), processing will also be needed further down the pipeline.
Inflation based on linear programming. As mentioned in Sec. 5.1, the current implementation of Inflation allows for the characterization of sets of classical probability distributions in network scenarios by means of semidefinite relaxations involving commuting operators, in the spirit of Ref. [28]. A more direct way of performing this characterization is implementing the classical inflation hierarchy described in Refs. [22,30], based on linear programming. Fulfilling this task involves the creation of a new fundamental object in the library, InflationLP, which will moreover be used for the characterization of distributions generated by measurements on systems described by generalized probabilistic theories (note that these distributions are a superset of the quantum distributions, and cannot be analyzed using quantum inflation).

Hybrid scenarios. Most research in the field investigating correlations in networks has thus far focused on networks in which all sources and operations are described by the same physical theory, be it classical mechanics, quantum mechanics or a generalized probabilistic theory. Studying the correlations that can arise in networks where different sources (and corresponding measurements) are described by different physical models is therefore an interesting question, with consequences at both the theoretical [25] and the practical levels. Subsequent versions of the library will admit specifying whether each source in the scenario is classical or quantum, and will implement this using the fact that measurements on classical systems only reveal pre-existing properties, and the order in which these are revealed is unimportant. The commutation relations between the operators in a scenario with classical and quantum sources are thus as follows: when a party receives systems only from classical sources, all operators associated with that party commute with each other, including operators with different settings or acting on partially-overlapping sets of systems. For a more generic party, connected to some number (including zero) of classical sources as well as some number of quantum sources, the commutation rules are the following: if two operators act on non-overlapping sets of states, they commute as usual, since they act on different Hilbert spaces; if they act on exactly the same set of states, they commute only if they are identical or orthogonal (i.e., referring to the same setting); if they act on sets of states with partial overlap, they commute only if all the states in the overlap come from classical sources, since this represents the situation where the party measures different quantum systems and the same classical systems; otherwise (i.e., if the operators act on sets of states with partial overlap and any state in the overlap is quantum), the operators do not commute, even if they refer to the same settings and/or outcomes.
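The case analysis above can be condensed into a small predicate. This is a sketch under the assumption that each operator is described by the set of source copies it acts on plus a setting label; it is not the library's internal representation:

```python
def do_commute(op1, op2, classical_sources):
    """Decide commutation for two operators of the same party in a hybrid
    scenario. Each operator is (sources, setting), with `sources` a frozenset
    of the source copies it acts on; `classical_sources` collects the
    classical ones."""
    s1, setting1 = op1
    s2, setting2 = op2
    overlap = s1 & s2
    if not overlap:                  # disjoint supports: different Hilbert spaces
        return True
    if s1 == s2:                     # identical supports
        if s1 <= classical_sources:  # party fed only by classical sources
            return True
        return setting1 == setting2  # identical or orthogonal projectors
    # partial overlap: commute iff every shared state comes from a classical source
    return overlap <= classical_sources

# A party measuring quantum sources q1, q2 together with classical source c1:
classical = frozenset({"c1"})
a = (frozenset({"q1", "c1"}), 0)
b = (frozenset({"q2", "c1"}), 1)   # overlap {"c1"} is classical -> commute
c = (frozenset({"q1", "c1"}), 1)   # same support, different setting -> do not
```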
Interfacing. Future plans include adding support for other widely available optimizers, such as SDPT3 [51], CVXPY [52], SCS [53] and Gurobi [54] (the last of which is especially interesting, since it is not restricted to linear and semidefinite programming problems), and translating the problems generated to forms compatible with other optimization libraries such as PICOS [55] and CVXOPT [56]. Interfacing with other tools, such as SDPSymmetryReduction.jl [57] for reducing the memory and computational load of the problems via exploitation of symmetries, and scalar extension [17] for imposing the independence of variables in the inflation scenarios, will also prove useful. Currently, it is possible to export the problem in a MATLAB-compatible form that can be directly read by RepLAB [50] in order to block-diagonalize it.

Documentation for Inflation
The documentation of the library, which includes a user's guide and an API glossary, can be found online at https://ecboghiu.github.io/inflation. The user's guide contains more information on the installation and on the topics covered in this manuscript, as well as subjects not covered here, such as further applications, tips on improving performance, and in-depth tutorials. The API glossary is automatically generated from the documentation comments written in the code and contains information about the public functions and classes defined.

Contribution guidelines
We welcome contributions to Inflation from the larger community interested in causality, quantum nonlocality, and software for quantum information theory. Contributions can come in the form of feedback about the library, feature requests, bug fixes, or code contributions (pull requests). Feedback and feature requests can be submitted by opening an issue on the Inflation GitHub repository. Bug fixes and other pull requests can be made by forking the Inflation source code, making changes, and then opening a pull request to the Inflation GitHub repository. Pull requests are peer-reviewed by Inflation's core developers in order to provide feedback and/or request changes.
Contributors are expected to adhere to Inflation development practices, including style guidelines and unit tests. Tests are written with the unittest Python framework and are implemented outside the module. To test an installation or changes, one can download the source code from the repository and use standard unittest functions. For example, executing the following in a Unix terminal in the test folder runs all the tests:

python -m unittest -v

More details can be found in the Contribution guidelines documentation.
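A minimal test in the same unittest style (a generic illustration of the framework, not one of the library's actual tests) looks as follows:

```python
import unittest

class TestToyExample(unittest.TestCase):
    """Illustrative test case in the style used by the library's test suite."""
    def test_addition(self):
        self.assertEqual(1 + 1, 2)

# Programmatic equivalent of `python -m unittest -v` for this single case:
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestToyExample)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

When run from the command line instead, unittest discovers any method whose name starts with test_ inside TestCase subclasses and reports each result individually.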

Concluding remarks
We have presented the first open-source implementation of the inflation framework for causal compatibility. Its focus is on user experience and modularity, with the goal of being easy to use off the shelf while allowing for the modifications needed by expert users.
While the current core implements the quantum inflation technique of Ref. [23], the library can be used off the shelf to characterize the sets of classical and quantum correlations in any network-type DAG, as well as in non-network scenarios with visible-to-visible connections. After briefly recalling the principles behind inflation, we described the main components of the library and techniques to achieve tighter characterizations, and we illustrated its use in multiple problems of interest. Finally, we outlined different ways in which the library can be extended to accommodate problems of interest for the broad community interested in quantum nonlocality and causality, and provided additional software information, including support and contribution guidelines. We strongly encourage any willing user to contribute to the development of the library via its repository.

Figure 1 :
Figure 1: (a) The bipartite Bell scenario written as a DAG. The yellow circles denote visible nodes representing random variables, while the blue circle denotes a latent node representing a physical system. This DAG represents the scenario where two parties perform measurements denoted by x ∼ X and y ∼ Y, respectively, on shares of a bipartite physical system distributed to them. The results of the measurements are a ∼ A and b ∼ B. (b) The triangle scenario, where three parties perform (in this case, fixed) measurements on shares of bipartite physical systems. (c) The second-order quantum inflation of the triangle scenario. Each of the sources is duplicated, and each party has a choice of the shares on which to perform their measurements. Whereas NPA-based convex relaxations of scenario (b) are indistinguishable from those of single-common-cause scenarios, the underlying symmetries in (c) allow one to constrain correlations beyond the common-cause scenario.

Figure 2 :
Figure 2: (a) The instrumental scenario written as a DAG. Because of the arrow originating at A and pointing to B, the scenario is not a network. (b) The interruption of (a). This is a network scenario, where the input to B and the output of A are restricted to coincide. The similarity of this scenario with the Bell scenario (recall Fig. 1(a)) has motivated its investigation within quantum information theory.