A new quantum machine learning algorithm: split hidden quantum Markov model inspired by quantum conditional master equation

The Hidden Quantum Markov Model (HQMM) has significant potential for analyzing time-series data and studying stochastic processes in the quantum domain, offering potential advantages over classical Markov models. In this paper, we introduce the split HQMM (SHQMM) for implementing the hidden quantum Markov process, utilizing the conditional master equation with a detailed balance condition to demonstrate the interconnections among the internal states of the quantum system. The experimental results suggest that our model outperforms previous models in terms of scope of application and robustness. Additionally, we establish a new learning algorithm to solve for the parameters of the HQMM by relating the quantum conditional master equation to the HQMM. Finally, our study provides clear evidence that the quantum transport system can be considered a physical representation of the HQMM. The SHQMM and its accompanying algorithms present a novel method to analyze quantum systems and time series grounded in physical implementation.


Introduction
The significant increase of data and information has made clear that classical algorithms have difficulties in meeting computational demands for efficiency and speed. Therefore, quantum computing, capable of efficient computation, has become a viable solution. Unlike classical computing, quantum computing employs the superposition of quantum bits for data storage, reading, and efficient computation. As a result, quantum computing promises to solve problems that are too complex for classical computers.

† Qin-Sheng Zhu, corresponding author: zhuqinsheng@uestc.edu.cn
Quantum computing has made great improvements in hardware implementations [1,2] and algorithmic approaches [3,4], propelling the field into the "noisy intermediate-scale quantum" (NISQ) era in the last decade. During that era, many hybrid-framework [5] algorithms emerged that combined quantum and classical approaches to cope with suboptimal hardware conditions. Initial applications of quantum computing in chemistry [6], Hamiltonian simulation [7], biology [8], pharmaceuticals [9], finance [10], materials [11], and various other fields demonstrate its potential advantages over classical computing.
In the field of machine learning, the Hidden Markov Model (HMM) algorithm plays a vital role and has been extensively and effectively utilized in domains such as stock market forecasting [12,13], natural language processing [14,15], and protein sequencing [16,17].
The classical HMM [18] involves three main problems: evaluation, decoding, and learning. When the dimensionality of the hidden state is not large, the Baum-Welch algorithm [19], the Viterbi algorithm [20], and the EM algorithm [21] can solve these problems efficiently. However, when the dimensions of the hidden state and observation spaces increase simultaneously, the speed and accuracy of the classical algorithms degrade.
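As a concrete reference point, the evaluation problem of the classical HMM can be sketched with the standard forward algorithm (a minimal NumPy sketch; the function name and data layout are illustrative, not taken from the paper):

```python
import numpy as np

def forward_likelihood(T, C, x0, obs):
    """Forward-algorithm log-likelihood log P(y_1..y_L) for a
    state-emitting (Moore) HMM.

    T   : (n, n) transition matrix, columns sum to 1 (x' = T x)
    C   : (s, n) emission matrix, C[y, i] = P(y | state i)
    x0  : (n,) initial state distribution
    obs : sequence of observed symbol indices
    """
    alpha = x0.copy()
    log_p = 0.0
    for y in obs:
        alpha = C[y] * (T @ alpha)   # propagate, then weight by emission
        norm = alpha.sum()           # P(y_t | y_1..y_{t-1})
        log_p += np.log(norm)
        alpha /= norm                # normalize to avoid underflow
    return log_p
```

Normalizing at every step keeps the filter numerically stable for long sequences while still accumulating the exact log-likelihood.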
Quantum algorithms were introduced to address such problems and find more effective solutions. Monras et al. [22] gave the mathematical definition of the hidden quantum Markov process in terms of Kraus operators. This key work offers insight into the HQMM and a method for studying this model from the perspective of open quantum systems, as well as for creating heuristic quantum computing algorithms. Like the classical HMM, the quantum version also functions as a stochastic probabilistic graphical model, and it involves the same three main problems. Crucially, the advantages of the HQMM are gradually becoming apparent through the study of solving for the model parameters.
Srinivasan et al. [23] proposed an algorithm based on the learning algorithm of the Norm Observable Operator Model, originally presented by Jaeger et al. [24]. This study demonstrates that the HQMM offers advantages over traditional algorithms with regard to model complexity and accuracy. Nevertheless, the model is only suitable for situations where the hidden state dimension is relatively small, and it is prone to local optima. Liu et al. [25] analytically demonstrated the superiority of the quantum stochastic model over the classical model, providing qualitative evidence of the quantum advantage that the HQMM possesses. In further work [26], they also highlighted that a quantum implementation of the HMM can both mitigate thermal dissipation and achieve an advantage in memory compression. In 2020, Adhikary et al. [27] proposed a learning algorithm grounded in optimization theory on manifolds [28] with the goal of solving for the Kraus operators, which provides an algorithmic basis for practical applications of the HQMM. The work of Markov et al. [29] in 2022 shows that the HQMM has a significant advantage in state-space complexity for stochastic process languages. In 2023, Li et al. [30] presented a new algorithm for modeling the dynamics of Markovian open quantum systems that is cleaner and more efficient than previous algorithms.
Taken together, the above works show that the HQMM has potential advantages that remain to be explored. The primary motivation of this paper is to further explore the HQMM from a physical perspective, providing new tools for a nuanced understanding of the intricate states of quantum systems while ensuring algorithmic performance.
We developed a new HQMM learning algorithm, namely the SHQMM, using the quantum conditional master equation on an open quantum system described by the quantum master equation [31,32,33], building upon the work of Clark et al. [34]. As a result, the SHQMM can operate on open systems and non-unitary quantum algorithms, rendering it applicable to the noise-related issues that arise in quantum computers during the NISQ era. Moreover, the SHQMM provides an understanding of the quantum state of the system under the detailed balance condition while guaranteeing algorithmic performance, giving the HQMM an interpretability that corresponds to quantum transport systems [35]. Additionally, our work is expected to provide new ideas for the physical implementation of quantum neural networks (QNNs).
The contributions of this paper are as follows:
1. A new SHQMM learning algorithm is constructed on open quantum systems by using the quantum conditional master equation; it is able to handle open systems and non-unitary quantum algorithms.
2. The SHQMM, with the introduction of periodic boundary conditions, guarantees the performance of the algorithm under the detailed balance condition and can be physically related to quantum transport systems. The model possesses a clear physical representation, well-defined dimensional relationships, and robust interpretability.
3. Numerical experiments show that the SHQMM achieves better results on both quantum and classical data sets and is robust to random initialization. Probing the quantum system does not degrade the performance of the model.
The paper is organized as follows: Sec. 2 provides an introduction to related work, including the basic concepts of the HMM, the HQMM, and the quantum conditional master equation. In Sec. 3, we formally introduce the SHQMM, a novel stochastic probabilistic graphical model; its implementation is demonstrated using a quantum transport system as an illustrative example. Sec. 4 presents the results of numerical experiments on the SHQMM, demonstrating its adaptability to both quantum and classical data, its robustness to random initialization, and its performance on DA. In Sec. 5, we compare the SHQMM to previous work, analyze its complexity, and discuss its benefits. Finally, in Sec. 6, we summarize our work.
2 Related work

Hidden quantum Markov model
The Hidden Markov Model (HMM) is a type of probabilistic graphical model that describes the evolutionary properties of Markov dynamics. It consists of two important parameters: the transition matrix T and the observation matrix C, which are constant matrices. An HMM can be defined as λ = (T, C, x_0), where x_0 is the initial state vector. The update of the hidden state and the observable results can be obtained from Eq. 1, which represents a state-emitting (Moore) hidden Markov model, where the variable y represents an output symbol in the observable space O.
A Hidden Quantum Markov Model (HQMM) can be defined by a set of parameters λ_Q = (ρ_0, K_y), similar to the classical Markov process. Here, ρ_0 corresponds to the initial state vector x_0 of the classical Markov model, and the Kraus operators K_y correspond to the matrices T and C. In comparison to the classical Markov model, the Kraus operator K_y plays a dual role, governing both the state evolution and the observable output, and satisfies the completeness condition Σ_{y,ω_y} K†_{y,ω_y} K_{y,ω_y} = I. When the system is measured (assuming the measurement or read-out result is y), the density matrix can be expressed as in Eq. 2 [27], where ω_y denotes the auxiliary dimension of the Kraus operators.
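The measurement update of Eq. 2 can be sketched as follows (an illustrative NumPy sketch; the function name and the dictionary layout of the operators are our assumptions):

```python
import numpy as np

def hqmm_update(rho, kraus_ops, y):
    """One HQMM step: apply the Kraus operators for observed symbol y.

    kraus_ops[y] is a list of w_y operators K_{y,w}; the post-measurement
    state is rho' = sum_w K rho K^dagger / p(y), where
    p(y) = Tr(sum_w K rho K^dagger) is the probability of observing y.
    """
    new_rho = sum(K @ rho @ K.conj().T for K in kraus_ops[y])
    p_y = np.trace(new_rho).real
    return new_rho / p_y, p_y
```

The completeness condition on the Kraus operators guarantees that the probabilities p(y) sum to one over all read-out symbols.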
The differences between the HMM and the HQMM are shown in Table 1.
To calculate the parameters {K} of the HQMM, Adhikary et al. [27] proposed a maximum likelihood estimation algorithm. This algorithm assumes that a set of observation sequences y_1, y_2, y_3, ..., y_T is known, and constructs the maximum likelihood function (Eq. 3) based on this data for the particular case ω = 1. The parameter-solving problem of the HQMM is then transformed into a constrained optimization problem (Eq. 4). Stacking the K_m by column to form a new matrix κ = [K_1, K_2, ..., K_m]^T with dimension nm × n, the constraint condition in Eq. 4 can be rewritten as Eq. 5. According to Ref. [27], κ in Eq. 5 lies on the Stiefel manifold, and a gradient descent method (Eq. 6, where τ is a positive real number) can be used to solve Eq. 4.
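The gradient step on the Stiefel manifold can be illustrated with a Cayley-type retraction in the style of manifold optimization [28] (the exact update form used in [27] is not reproduced here, so this sketch is an assumption; what it does guarantee is that the constraint κ†κ = I is preserved exactly):

```python
import numpy as np

def stiefel_step(kappa, grad, tau):
    """One Cayley-retraction gradient step that keeps kappa on the
    (complex) Stiefel manifold, i.e. kappa^dagger kappa = I.

    A is skew-Hermitian by construction, so the Cayley transform
    (I + tau/2 A)^{-1} (I - tau/2 A) is unitary and the constraint
    is preserved for any step size tau.
    """
    A = grad @ kappa.conj().T - kappa @ grad.conj().T  # skew-Hermitian
    n = A.shape[0]
    I = np.eye(n, dtype=complex)
    return np.linalg.solve(I + (tau / 2) * A, (I - (tau / 2) * A) @ kappa)
```

Because the update is a unitary acting on κ, no re-orthogonalization step is needed between iterations.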

The quantum conditional master equation
As a general type of equation describing the evolution of open quantum systems, the quantum conditional master equation adopts the specific form of the Lindblad equation [36], with particular consideration given to the interaction with the environment.
For an open quantum system, the Hamiltonian can be written as H = H_S + H_E + H′, where H_S and H_E represent the Hamiltonians of the quantum system and the environment, respectively, and H′ describes the coupling between the quantum system and the environment. In the case of weak coupling, H′ can be treated as a perturbation. Using a second-order cumulant expansion, we obtain a description of the evolution of the reduced density matrix (Eq. 8).
Here, the Liouvillian superoperator is defined via the Green's function related to H_S, and the reduced density matrix is obtained by partially tracing out the environment from the density matrix of the composite system. In experiments, measurement results are typically linked to changes in the internal state of a system. Therefore, unlike the derivation of Eq. 8, Li et al. [35] introduced the "detailed balance" condition to describe the relationship among different system states when studying the current (the measurement result) of a quantum transport system. This allowed them to derive the quantum conditional master equation (QCME) in Eq. 12 and to obtain some interesting results. Adopting the approach of the QCME described in Ref. [35] and considering the detailed balance among different system states, the general QCME (Eq. 9) can be obtained when the environment space is divided into different subspaces, as shown in Figure 1.
Here, the proposed initial condition for the quantum conditional master equation is ρ^(M_q)(0), and ρ^(M_q) denotes the conditional density matrix of the quantum system corresponding to the environment state ρ_E^(M_q) associated with the subspace M_q. Note that ρ^(M_q) is also positive semi-definite, with Tr(ρ^(M_q)) ≤ 1 and Σ_{M_q} Tr(ρ^(M_q)) = 1. The quantum conditional master equation enables the representation of the relationships between the different subspaces M_q, which provides a better understanding of the open quantum system being studied.
3 The SHQMM based on QCME

The quantum master equation of quantum transport system
Since any quantum computation requires an actual physical system for its implementation, we need to search for an open quantum system that can be described by the conditional master equation in order to establish the HQMM. Based on previous work [27,35], we found that the quantum transport system is well suited to implementing our HQMM.
As a result, in this section we present the quantum conditional master equation for the quantum transport system. The Hamiltonian of this quantum system is expressed as in Ref. [35], where H_S is the Hamiltonian of the quantum dot system, L and R denote the left and right electrodes respectively, d†_{αμk} and d_{αμk} are the creation and annihilation operators of electrons in the electrodes, and t_{αμk} is the coupling strength between the electrodes and the quantum dot system. The master equation for the quantum transport system can be derived through some calculation [35] based on Eq. 8. If the state space of the electrodes without any electrons passing through the quantum dot system is denoted E^(0), it is formed by the wave functions of the two isolated electrodes on the left and right, E^(0) = span{|ψ_L⟩ ⊗ |ψ_R⟩}. If n electrons have passed from the right electrode through the quantum dot to the left electrode, the corresponding state space is denoted E^(n). The electrode state space E can then be decomposed as E = ⊕_n E^(n), which leads to the quantum conditional master equation [35] (Eq. 12) with initial condition ρ_T(0), where ρ^(n) is the conditional density matrix of the quantum dot system [35] given that n electrons have passed through it within time t.
Here, the number of electrons n corresponds to the subspace M_q in Eq. 9.

The relationship between the QCME equation and HQMM
Based on the contents of Secs. 2.2 and 3.1, we derive the hidden quantum Markov model from a quantum master equation and propose a novel stochastic graphical model from a quantum conditional master equation. After some calculation (the detailed proofs are given in the supplemental material), we obtain: (1) for the quantum master equation of Eq. 8, the evolution of the density matrix of the quantum dot system is given by Eq. 13; (2) for the quantum conditional master equation of Eq. 12, the evolution of the conditional density matrices of the quantum dot system is given by Eq. 14, where i = 0, 1, 2.
Comparing Eq. 2 (with ω_y = 1) and Eq. 13, we conclude that there is a close relationship between a quantum Markov model and a quantum master equation. However, the Kraus operators K_{i,μ} of Eq. 14 involve the related ρ^(n), ρ^(n−1), and ρ^(n+1). This difference arises from the division of the Hilbert space of the environment, and it gives rise to a new HQMM that we call the split hidden quantum Markov model.

Split hidden quantum Markov model
In this section, we present the SHQMM inspired by quantum transport systems, which is the main theoretical contribution of this work. Similar to the HQMM, the SHQMM is defined by a set of parameters λ_SQ = (ρ^(0), ρ^(1), ρ^(2), ..., {K_{i,μ}}). Firstly, the evolution of the conditional density matrix of the quantum system H_S is written as Eq. 15, where q indexes the subspaces of the environment M and Σ_{i,μ} K†_{i,μ} K_{i,μ} = I. The parameter y represents the read-out of information symbols from the open quantum system. Eq. 15 is a comprehensive expression that encodes the relationships between the ρ^(M_q).
Secondly, when we read out or measure a certain value y′ of ρ^(M_q)(t), the conditional density matrix ρ^(M_q)(t + Δt) is rewritten as in Eq. 16. Thirdly, the probability of obtaining the measurement result y′ is given by Eq. 17. Eq. 17 describes the contribution of the different ρ′^(M_q) to the probability P(y′), and this process reflects the concept of "detailed balance" in physics, as described in Eq. 15.
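The three steps above can be sketched for the 1-local case with periodic boundary conditions (an illustrative NumPy sketch; the names and the single-operator-per-class layout are our simplifying assumptions):

```python
import numpy as np

def shqmm_update(rhos, K, R, A, y):
    """One 1-local SHQMM step for observed symbol y (hedged sketch).

    rhos : list of conditional density matrices rho^(0..N-1)
    K[y], R[y], A[y] : Kraus operators coupling rho^(n) to itself,
    to rho^(n-1), and to rho^(n+1); periodic boundary conditions
    wrap the index, as in the paper's crystal analogy.
    """
    N = len(rhos)
    new = []
    for n in range(N):
        r = (K[y] @ rhos[n] @ K[y].conj().T
             + R[y] @ rhos[(n - 1) % N] @ R[y].conj().T
             + A[y] @ rhos[(n + 1) % N] @ A[y].conj().T)
        new.append(r)
    # P(y) sums the trace contributions of every conditional matrix
    p_y = sum(np.trace(r).real for r in new)
    return [r / p_y for r in new], p_y
```

The per-matrix traces Tr(ρ^(n)) before normalization expose how each subspace contributes to P(y), which is exactly the "detailed balance" bookkeeping the text describes.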
A concrete implementation example of our SHQMM To calculate the parameters of the SHQMM, assume that a set of sequences y_0, y_1, ..., y_T is known; the conditional density matrix evolution under the measurement result y_i, based on the transport system, is shown in Fig. 2. Fig. 2 resembles a neural network and shows the process of forward propagation through time t. This suggests that the SHQMM is promising as a physical realization pathway for QNNs, which will be the subject of our further research. It demonstrates the connection of quantum state evolution among the different subspaces n in the QCME and the conversion relationships among the probabilities Tr(ρ^(n)). Compared to previous work on the HQMM (where the probability of the measurement value y_i depends on ρ = Σ_n ρ^(n)), the property illustrated in Fig. 2 also displays the differing contributions of the different ρ^(n) to the probability of obtaining the measurement value y_i at time t_i. As a result, our model has a more stable and robust structure (as seen in the experimental results).
Here, based on the QCME (Eq.14), we can write a probability function using the following equations.
where K_{i,μ}, K_{3,μ} and K_{4,μ} of Eq. 14 are denoted K_{y_i}, R_{y_i} and A_{y_i}, respectively, and N_max denotes the maximum value of n. The probability of y_i is given by Eq. 18, from which the probability of the sequence y_0, y_1, ..., y_T is easily obtained (Eq. 19). To compute the parameters of the SHQMM, we propose a maximum likelihood estimation method based on the results in [23,27]. Firstly, we use the probability function to parameterize all possible Kraus operators, and then use a gradient descent algorithm to find the matrix form of the Kraus operators that minimizes the negative log-likelihood of the given sequence. This turns parameter solving into an optimization problem.
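The sequence likelihood that the gradient descent minimizes can be sketched for the simplest ω = 1 case (illustrative; one Kraus operator per symbol is a simplifying assumption, and the function name is ours):

```python
import numpy as np

def neg_log_likelihood(rho0, kraus, seq):
    """-log P(y_0..y_T), accumulated step by step.

    kraus[y] is the single Kraus operator for symbol y (omega = 1 case);
    each step's trace gives P(y_t | past), and renormalizing keeps the
    running state a valid density matrix.
    """
    rho = rho0.copy()
    nll = 0.0
    for y in seq:
        rho = kraus[y] @ rho @ kraus[y].conj().T
        p = np.trace(rho).real   # P(y_t | y_0..y_{t-1})
        nll -= np.log(p)
        rho /= p
    return nll
```

Accumulating per-step log-probabilities rather than multiplying raw probabilities avoids underflow on long sequences.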
Here, m is the dimension of the Kraus operators, so the constraint condition in Eq. 20 can be rewritten accordingly. We summarize all the above steps into an algorithm for solving the Kraus operators; the specific steps are shown in Algorithm 1.
Algorithm 1 Learning SHQMM using gradient descent on the Stiefel manifold
Require: training data D ∈ N^{W×l}, where W is the number of sequences and l is the length of each sequence.
Ensure: a complex orthogonal matrix on the Stiefel manifold, κ ∈ C^{3·dimO·m×m}, together with ρ^(0), ρ^(1), ρ^(2), each required to be positive semi-definite.
For each epoch: for each batch, compute the gradient and the likelihood function; at the end of the epoch, update the learning rate τ = ατ. Finally, compute the DA function using the value of the probability function.
In Algorithm 1, τ (learning rate), α (decay factor) and β (momentum parameter) are the hyperparameters, and DA is a function that describes the quality of the model.
As one of the core evaluation metrics for HQMMs, DA [37] measures performance across models with different structures and functions in a relatively fair way [38]. It is defined as DA = f(1 + log_ι P(D|M)/l), where D is the data, M is the model, l is the length of the sequence, and ι is the number of output symbols in the sequence. The function f(·) is a non-linear piecewise function that maps any argument in (−∞, 1] to (−1, 1]. The model predicts the Markov sequence perfectly if DA = 1, and it is better than a random model when DA > 0.
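A DA computation consistent with this description can be sketched as follows. Since the exact piecewise form of f in [37] is not reproduced here, the variant below (identity for x ≥ 0, x/(1 − x) otherwise) is our assumption; it has the required range, mapping (−∞, 1] to (−1, 1]:

```python
import numpy as np

def f(x):
    """Assumed piecewise map (-inf, 1] -> (-1, 1]: identity for x >= 0,
    x / (1 - x) for x < 0 (which squeezes (-inf, 0) into (-1, 0))."""
    return x if x >= 0 else x / (1 - x)

def descriptive_accuracy(log_p, l, iota):
    """DA = f(1 + log_iota P(D|M) / l) for a sequence of length l over
    iota output symbols; log_p is the natural-log likelihood."""
    return f(1 + log_p / (l * np.log(iota)))
```

With this definition a perfect predictor (P = 1) gives DA = 1, and a uniform random model (P = ι^(−l)) gives DA = 0, matching the interpretation in the text.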
In the SHQMM, different models can produce different prediction results for the same sequences, depending on the number M_q of initialized conditional density matrices and the connections between them. Eq. 18 describes the closest connection between the conditional density matrices, with the number M_q equal to n.
The optimal solution of HQMM is one of the optimal solutions of Eq.20.Therefore, periodic boundary conditions are applied to the first and last conditional density matrices, similar to the arrangement of atoms in a crystal.
Unlike Eq. 18, Eq. 24 includes periodic boundary conditions. This empirical improvement addresses the fact that the Kraus operators (R, A) often converge to zero during the optimization process, which means that the model essentially degenerates into an HQMM. The improvement ensures the stability of the learning process.
More detailed cases are presented in supplemental material.
Extending the expressivity of the SHQMM If we need to further increase the complexity of the model, we can set N_max = 4, 5, 6, ... and apply a more complicated connection scheme, defined as k-local. Thus, the general SHQMM can be defined as a tuple with the following conditions: (2) every Kraus operator satisfies K^j_y : C^m → C^m and Σ_{y,j} K^{j†}_y K^j_y = I; (3) the periodic boundary conditions are applied, and conditional density matrices beyond the index range are set to zero; (4) the probability of the observation symbol y is given by Eq. 26, where k-local describes the relationship between the different conditional density matrices and j indexes the Kraus operator classes.
In summary, enhancing state correlations and moving beyond simple nearest-neighbor connections broadens the SHQMM's applicability to diverse time series problems.
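The k-local connection with periodic boundary conditions amounts to a cyclic index set, which can be sketched as (the helper name is ours):

```python
def klocal_neighbors(n, k, N):
    """Indices of the conditional density matrices coupled to rho^(n)
    under a k-local connection with periodic boundary conditions:
    the 2k + 1 matrices rho^(n-k..n+k), with indices wrapped mod N."""
    return [(n + d) % N for d in range(-k, k + 1)]
```

Each ρ^(n) then receives 2k + 1 Kraus channels, which is also where the 2k + 1 factor in the complexity analysis of Sec. 5 comes from.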

Comparison of properties of SHQMM and HQMM
We use a simple case to illustrate the correlations and differences between the SHQMM and the HQMM. For the 1-local model, by summing the conditional density matrices ρ^(n) over the index n in Eq. 18, we obtain Eq. 27. The second equality in Eq. 27 shows a formal relationship between the SHQMM and the HQMM, but our model has a clear physical interpretation compared to the auxiliary dimension ω_y in Eq. 2 (an HQMM with ω_y = 3). This indicates that the SHQMM is at the same time a valid HQMM.
The differences between the SHQMM and the HQMM lie in the following aspects: • The SHQMM gives the density matrix a hierarchical structure, as shown in Fig. 2, and the density matrix evolves through multiple channels.
• SHQMM can be derived from actual physical systems, such as quantum transport systems, as shown in Fig. 2.
The SHQMM can reflect the relationships between hidden states and is better suited to handling complex data than the HQMM. From the point of view of physical implementation, the quantum conditional master equation for other open quantum systems [31,32,33] may differ from that of the quantum transport system, resulting in different split hidden quantum Markov models. The Bayesian rule for the SHQMM is given by Eq. 28, which expresses, under the rules of conditional probability, the quantum state of the system when it is observed as x at time t and as y at time t + Δt. Table 2 summarizes the properties of the SHQMM and the HQMM.

Experiment and results
In this section, we apply quantum and classical data to train and test our SHQMM. All experiments were performed on the same experimental platform.

Quantum data Firstly, we used the quantum data generated by a quantum mechanical process in Ref. [23]. The quantum data have six hidden states and six observational values, and their size is 40 × 3000.

Training and validation
We use 20 × 3000 data to train the model, generating a total of 20 models. Simultaneously, we use 10 × 3000 data to validate the models, and the remaining data to test them. The results are shown in Fig. 3 with hyperparameters τ = 0.95, α = 0.95, β = 0.90 (more results can be found in the supplemental material). Fig. 3 shows several models λ_SQ for single-, two- and three-qubit quantum systems, which are used to construct the SHQMM under different parameters N and k. We find: (1) as the number of qubits increases, the value of DA also increases and reaches a stable value at about 20 epochs (three qubits require more than 20 epochs); (2) apart from the single-qubit case, the connection mode (different k-local) and N have little effect on the DA value; (3) while a higher number of qubits results in a relatively larger value of DA, the standard deviation (STD) decreases as the number of qubits increases during evaluation on alternative data, suggesting potential overfitting.
To further test the reliability of our model, we will evaluate it from several perspectives.
Initializing the Kraus operators In Ref. [27], it was noted that the training outcome of the HQMM is susceptible to the initial Kraus operators in smaller models. Thus, this study investigates the effect of the initial position of the Kraus operators on the Stiefel manifold for the SHQMM. Eq. 29 is used to evaluate the distance between different initial positions.
When κ_1 = κ_2, D = 0. The initialization method for the Kraus operators is presented in Algorithm 2. The behavior of DA for different random initialization seeds (RS) is depicted in Figure 4. The results clearly show that our proposed SHQMM remains stable regardless of the initial Kraus values, confirming its validity.
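A distance with the stated property (D = 0 if and only if κ_1 = κ_2) can be sketched as follows; since the exact form of Eq. 29 is not reproduced here, the Frobenius norm is a stand-in assumption:

```python
import numpy as np

def kraus_distance(kappa1, kappa2):
    """Distance between two stacked-Kraus matrices on the Stiefel
    manifold. The Frobenius norm used here is an assumed stand-in for
    Eq. 29; it shares the property D = 0 iff kappa1 == kappa2."""
    return np.linalg.norm(kappa1 - kappa2)
```

Any such distance lets one check whether two training runs converged to the same point on the manifold or to distinct solutions with comparable DA, as reported in the hyperparameter study.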

Algorithm 2 Initial Kraus operator on Stiefel manifolds
Require: the dimension m of the Kraus operators, the number of Kraus operator classes j, the dimension of the observable space dimO, and a random seed RS.
Ensure: Kraus operators {K^j_y}.
Initialize κ and the vectors v_s; if all column vectors in κ and v_s are orthogonal, proceed; otherwise apply Schmidt orthogonalization to κ and v_s. Finally, construct the new matrix κ′ from the selected columns of κ.

Selection of effective models Given the various methods available to construct models, selecting the most effective one for a given dataset is a critical challenge. Typically, the expressiveness of a model is directly linked to the number of its parameters, and the number of parameters for the SHQMM is counted accordingly. The corresponding results are shown in Fig. 5: to obtain the best training results, we should first change the dimension m of the Kraus operators and then adjust the parameter j for a given sequence.

Hyperparameters selection for the model
To obtain the optimal DA, Algorithm 1 employs three hyperparameters: τ, α, and β. Fig. 6 shows the impact of varying hyperparameters on DA, indicating that DA is more sensitive to changes in α than to changes in τ. Moreover, the SHQMM clearly exhibits multiple local optima. To identify the global optimum, we investigated the effect of the momentum parameter β on DA for the best case (τ = 0.95, α = 0.95) and the worst case (τ = 0.65, α = 0.65), as presented in Fig. 7. We observed that DA may reach the global optimum at τ = 0.95, α = 0.95, β = 0.90, and that DA can be further enhanced by selecting a different β. However, after computing the distance between the Kraus solutions for different hyperparameters (τ = 0.95, α = 0.95, β = 0.90 and τ = 0.65, α = 0.65, β = 0.60) using Eq. 29, we found that their DA values were comparable despite being located at different positions on the Stiefel manifold.

Classical data
To conduct a thorough evaluation of the model, classical data generated by a hidden Markov process with transition matrix T and emission matrix C were utilized to compute the Kraus operator and determine DA.The results obtained from the classical data are presented in Fig. 8, where hyperparameters were set to τ = 0.95, α = 0.95, and β = 0.90.
Similar to the quantum case, the value of DA increases and reaches a stable state after approximately 20 epochs (for three qubits, it took about 20 epochs to stabilize). The different values of k-local have little impact on the DA value, and the standard deviation (STD) continues to decrease as the qubit number increases on the testing data, possibly due to model overfitting. Additional test results can be found in the supplemental material.
Result The numerical experimental results of the SHQMM on quantum and classical data are displayed in the supplemental material. The current results are likely overfitted, given the mismatch between a complex model and simple data with a low number and dimensionality of hidden states [26]. Keeping the hyperparameters constant, the training step converges more and more slowly as the number of qubits increases. The SHQMM exhibits good robustness in the numerical experiments and maintains stable performance across multiple task scenarios: for different sequences, the model maintains STD(DA) < 0.01. Different initial values of the Kraus operators have a negligible effect on the convergence of the model, owing to the hierarchical structure of the conditional density matrices.

Analysis and discussion
In this section, the performance of SHQMM is analyzed from both experimental and theoretical perspectives.
Performance comparisons It is important to acknowledge the structural differences between models. The model presented here is, to our knowledge, unique in articulating a detailed representation of the internal state of a quantum system, so identifying models aligned to exactly the same baseline for performance evaluation is difficult. We therefore compare against the models that are closest in both structure and functionality. Table 3 compares the DA on quantum data obtained by the SHQMM, the HMM, and the HQMM of [23]. As can be seen from Table 3, the SHQMM improves DA by about 23.35% over the HQMM algorithm, while the quantum versions of the algorithm all outperform the classical algorithm in model quality.

Complexity analysis
Complexity is a theoretical measure encompassing both the computational and time costs of an algorithm. To remain consistent with the original literature of the various models, we retain their respective notation in this section. The complexity of the HMM, HQMM, and SHQMM is analyzed as follows: • Building upon previous research [23], a classical HMM can be succinctly characterized by (n, s), where n denotes the number of hidden states and s the number of observations. For an HMM with a prediction sequence of length T, each step involves n states, each with n possible transitions, giving a complexity of O(Tn²) [39].
• Similarly, the HQMM is denoted by the triple (n, s, w), where the parameter w, alongside n and s, accounts for the number of Kraus operators per observable. As for the HMM, for a prediction sequence of length T there are n states undergoing transitions, with each state producing w observations; consequently, the complexity of the HQMM is O(Tnw). At comparable scales, the complexity of the HQMM is notably influenced by the parameter w, which can be set independently based on the application scenario. Generally, in smaller-scale tasks n < w may occur, while in longer-sequence tasks n > w is more likely.
• The model structure of the SHQMM is denoted (N_max, s, k), where N_max represents the number of hidden states (equivalent to n in the other models) and k represents the connectivity, which is related to the number of Kraus operators. Due to structural constraints, 2k + 1 ≤ N_max. The complexity analysis parallels that of the HMM: during the transitions among N_max states, each state yields 2k + 1 observational outputs, so the complexity is O(T N_max (2k + 1)). Notably, the constraint on the connection scheme imposes an upper limit on the complexity of the SHQMM of O(T N_max²).

Discussion of quantum advantage Exploring quantum advantages is a shared pursuit among all quantum machine learning algorithms.
In prior research, a substantial body of work has focused on the benefits of quantum versions of the HMM. Noteworthy contributions, exemplified by [22] and [23], have highlighted the advantages observed in numerical experiments with the HQMM, particularly in DA. Certain studies have also undertaken theoretical analyses of these quantum advantages: [25] and [26] provide insights into the advantages of the HQMM, emphasizing its physical performance and memory efficiency, and [29] establishes the quantum approach's superiority in terms of state-space complexity. The experiments in this work align with the aforementioned research, further substantiating the merits of quantum-based approaches.
According to the complexity analysis, HQMM and SHQMM demonstrate advantages when n is large, implying that quantum solutions are better suited to long-sequence tasks. It should be acknowledged that the numerical experiments on classical computers did not fully reflect this advantage, owing to the additional computational cost incurred by simulating the quantum process on a classical computer.
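The scaling comparison above can be made concrete with a small sketch. The functions below (hypothetical illustrations, not from the paper) count the forward-pass operations implied by the complexity expressions O(T nw) for HQMM and O(T N_max(2k + 1)) for SHQMM, and check the upper bound that follows from the structural constraint 2k + 1 ≤ N_max.

```python
# Hypothetical operation counts implied by the stated complexity expressions.

def hqmm_ops(T, n, w):
    """O(T*n*w): n hidden states, w Kraus operators per observable."""
    return T * n * w

def shqmm_ops(T, N_max, k):
    """O(T*N_max*(2k+1)): N_max hidden states, connectivity k."""
    assert 2 * k + 1 <= N_max, "structural constraint: 2k + 1 <= N_max"
    return T * N_max * (2 * k + 1)

# Because 2k + 1 <= N_max, SHQMM's cost is bounded above by T * N_max**2:
T, N_max, k = 100, 7, 2
assert shqmm_ops(T, N_max, k) <= T * N_max ** 2
```

Under this accounting, the SHQMM's connectivity constraint caps its per-step cost regardless of how k is chosen, whereas the HQMM's cost grows freely with w.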
Nevertheless, the experimental results of this work, [23], and [27] still showcase a potential performance advantage of the quantum approach: the ability to convey more information with fewer hidden states. This high expressivity has been proven mathematically in [29] and represents a potential advantage of quantum approaches. In terms of encoding efficiency, a substantial amount of information can be encoded into a smaller number of quantum states, improving the utilization of informational resources. This suggests that SHQMM can be applied to large-scale tasks such as weather forecasting.
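One simple way to see this encoding advantage is by counting free parameters; the sketch below is an assumption-level illustration, not a calculation from the paper. A classical probability vector over n hidden states carries n − 1 free real parameters, while an n-dimensional density matrix (Hermitian, unit trace) carries n² − 1.

```python
# Illustrative parameter counting for classical vs. quantum hidden states.

def classical_params(n):
    # n probabilities constrained to sum to 1
    return n - 1

def quantum_params(n):
    # n x n Hermitian matrix with unit trace: n**2 - 1 free real parameters
    return n * n - 1

# A 2-state quantum system already carries as many free parameters (3)
# as a 4-state classical distribution:
assert quantum_params(2) == classical_params(4)
```

This counting argument is only heuristic, but it matches the qualitative claim that fewer quantum hidden states can carry the information of a larger classical state space.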
Furthermore, the superiority of SHQMM is underscored by its correspondence to a tangible physical system, namely the quantum transport system. This correspondence gives SHQMM clear interpretability, enabling continuous tracking of the internal state of the quantum system throughout its evolution, beyond what HQMM offers. Moreover, owing to its association with an authentic quantum physical system, SHQMM demonstrates an increased affinity for quantum data, a trait supported by the numerical experiments. Together with other quantum versions of the HMM, SHQMM enriches the selection of models available for the study of quantum Markov processes.
It is noteworthy, however, that SHQMM's performance on some classical datasets appears suboptimal. Further analysis of this limitation, informed by the insights of Michael et al.'s work [36], constitutes a focal point for our forthcoming research.

Conclusion
In this paper, the novel stochastic probabilistic graphical model SHQMM is introduced as our main theoretical contribution, derived from the quantum conditional master equation. An empirical improvement has also been implemented by introducing periodic boundary conditions to ensure the stability of the learning process.
Numerical experiments underscore that the model's DA performance remains robust after the introduction of the new structure, displaying reduced sensitivity to the initial state and increased overall robustness.
SHQMM emerges as a valuable tool for elucidating the intricate relationships among hidden states within quantum systems, providing additional choices for addressing HMM problems through quantum methods. The model's correspondence with quantum transport systems further enhances its appeal, offering promising prospects for physical implementation. Additionally, given the structural similarities, SHQMM can serve as a theoretical foundation for the physical realization of QNNs.

Algorithm 1 (fragment): 1: initialize the density matrix ρ_total; 2: for epoch = 1 : E do 3: split the data D into B batches D_B; 4: for batch = 1 : B do 5: …
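The surviving fragment of the training algorithm describes an epoch/batch loop over the data. The sketch below is a minimal reconstruction of that loop shape only; the initial density matrix and the batch update rule (`update_kraus`) are placeholders, not the paper's actual initialization or learning rule.

```python
# Minimal sketch of the batched training loop implied by the algorithm
# fragment: initialize rho_total, then for each of E epochs split the data D
# into B batches and update on each batch. `update_kraus` is a placeholder.
import random

def train(D, E, B, update_kraus):
    rho_total = [[1.0, 0.0], [0.0, 0.0]]  # placeholder initial density matrix
    for epoch in range(E):
        random.shuffle(D)                  # split the data D into B batches D_B
        batch_size = max(1, len(D) // B)
        batches = [D[i:i + batch_size] for i in range(0, len(D), batch_size)]
        for batch in batches:              # for batch = 1 : B
            rho_total = update_kraus(rho_total, batch)
    return rho_total
```

With an identity update, the loop simply returns the initial state, which makes the control flow easy to verify in isolation.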

Figure 2: The expanded calculation diagram of ρ_n(t) for a sequence y_0, y_1, …, y_T. The red lines represent the Kraus operators K_y in {K}, the black lines represent the Kraus operators R_y in {R}, and the blue lines represent the Kraus operators A_y in {A}.

Figure 3: The training results of different SHQMMs for quantum data under different parameters N and j = 2k + 1. Subfigures (a), (b), and (c) show the training results for single-, double-, and three-qubit quantum systems, respectively.

Figure 5: The relationship between the training outcome of SHQMM and the number of parameters. (a) The variation of DA with the dimension m of the Kraus operators. (b) The variation of DA with the number j of conditional density matrices. The training results are more sensitive to m than to j.

Figure 6:

Table 1 :
The difference between the HMM and the HQMM

Table 2 :
The properties of HQMM and SHQMM

Table 3 :
Comparison of HMM, HQMM and SHQMM on DA. Complete data under different parameters are shown in [23] and the supplemental material. 2,2-HMM(L) stands for the classical hidden Markov model with two hidden states and two observables, and L indicates that the value is obtained by learning. Similarly, 2,2,1-HQMM(L) denotes the hidden quantum Markov model with two hidden states, two observables, and one Kraus operator per observable. For SHQMM, 3,AR means that the model has 3 hidden states and AR means that the connectivity is adjacent. ≥ indicates that the value did not converge within the set number of iterations.