ARDS Mathematical Equations
ARDS Markov equations
2024-04-29
By Samuel Y Huang and Alexander Huang
Modeling Acute Respiratory Distress Syndrome (ARDS) as a stochastic process using transition diagrams and first-step analysis.
A Markov chain is a sequence $\{X_0, X_1, X_2, \dots\}$, where $X_t$ is the state at time $t$.
In terms of ARDS this means $\{X_1, X_2, X_3, \dots\}$, where $X_t$ is the patient's ARDS stage at time $t$.
ARDS severity is graded by the PaO2/FiO2 ratio (Berlin definition):

$$\text{ARDS stage}=\begin{cases}\text{Mild} & \text{if } 200<\text{PaO}_2/\text{FiO}_2\le 300,\\ \text{Moderate} & \text{if } 100<\text{PaO}_2/\text{FiO}_2\le 200,\\ \text{Severe} & \text{if } \text{PaO}_2/\text{FiO}_2\le 100.\end{cases}$$

Markov chains have a crucial property: $X_{t+1}$ depends only on $X_t$.
Questions of Interest:
- Starting from Mild ARDS, what is the probability of decompensating to Moderate, Severe, or Death? What is the probability of downgrading to the floors?
- Starting from Moderate ARDS, what is the probability of decompensating to Severe or Death? What is the probability of downgrading to the floors or improving to Mild ARDS?
- Starting from Severe ARDS, what is the probability of decompensating to Death? What is the probability of downgrading to the floors or improving to Mild or Moderate ARDS?
- For patients on the floor, what is the probability of decompensating to Mild, Moderate, or Severe ARDS, or to Death?
For patients that are dead, Death is an absorbing state: once entered, the chain never leaves it (a closed loop).
Definition: The state of a Markov chain at time $t$ is the value of $X_t$. For example, if $X_6 = \text{Mild}$, we say that at time 6 the patient is in Mild ARDS.
Definition: The state space $S$ of a Markov chain is the set of values that each $X_t$ can take. For example, $S = \{\text{Mild}, \text{Moderate}, \text{Severe}, \text{Floors}, \text{Death}\}$; here $S$ has size $N = 5$.
Definition: A trajectory of a Markov chain is a particular set of values for $\{X_0, X_1, X_2, \dots\}$. For example, if in our ARDS case $X_1=\text{Floors}$, $X_2=\text{Mild}$, $X_3=\text{Moderate}$, $X_4=\text{Severe}$, $X_5=\text{Death}$, then the trajectory up to time $t = 5$ is Floors -> Mild -> Moderate -> Severe -> Death.
Trajectory just means path.
Markov Property
The basic property of a Markov chain is that only the most recent point in the trajectory affects what happens next:

$$P(X_{t+1}=s \mid X_t=s_t, X_{t-1}=s_{t-1}, \dots, X_0=s_0) = P(X_{t+1}=s \mid X_t=s_t).$$

That is, $X_{t+1}$ depends on $X_t$ but does not depend on $X_{t-1}, \dots, X_2, X_1, X_0$.
The Transition Matrix
With rows and columns both ordered (Mild, Moderate, Severe, Floors, Death):

$$P = \begin{pmatrix}
p_{\text{mild,mild}} & p_{\text{mild,moderate}} & p_{\text{mild,severe}} & p_{\text{mild,floors}} & p_{\text{mild,death}} \\
p_{\text{moderate,mild}} & p_{\text{moderate,moderate}} & p_{\text{moderate,severe}} & p_{\text{moderate,floors}} & p_{\text{moderate,death}} \\
p_{\text{severe,mild}} & p_{\text{severe,moderate}} & p_{\text{severe,severe}} & p_{\text{severe,floors}} & p_{\text{severe,death}} \\
p_{\text{floors,mild}} & p_{\text{floors,moderate}} & p_{\text{floors,severe}} & p_{\text{floors,floors}} & p_{\text{floors,death}} \\
p_{\text{death,mild}} & p_{\text{death,moderate}} & p_{\text{death,severe}} & p_{\text{death,floors}} & p_{\text{death,death}}
\end{pmatrix}$$
The rows represent Now, or From ($X_t$); the columns represent Next, or To ($X_{t+1}$). Entry $(i,j)$ is the conditional probability that next = $j$ given that now = $i$: the probability of going from state $i$ to state $j$,

$$p_{ij} = P(X_{t+1}=j \mid X_t=i).$$

$$\sum_{j=1}^{N} p_{ij} = \sum_{j=1}^{N} P(X_{t+1}=j \mid X_t=i) = 1.$$
The transition matrix $P$ lists all possible states in the state space $S$. $P$ is a square ($N \times N$) matrix, because $X_{t+1}$ and $X_t$ both take values in the same state space $S$ (of size $N$). The rows of $P$ each sum to 1.
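As a concrete sketch, the transition matrix can be encoded and its row-sum property checked. The probabilities below are illustrative placeholders, not clinically estimated values:

```python
import numpy as np

# States in a fixed order; rows = "from" (X_t), columns = "to" (X_{t+1}).
STATES = ["Mild", "Moderate", "Severe", "Floors", "Death"]

# Illustrative transition probabilities (NOT fitted to clinical data).
P = np.array([
    [0.50, 0.20, 0.05, 0.20, 0.05],  # from Mild
    [0.20, 0.45, 0.15, 0.10, 0.10],  # from Moderate
    [0.05, 0.20, 0.45, 0.05, 0.25],  # from Severe
    [0.10, 0.05, 0.02, 0.80, 0.03],  # from Floors
    [0.00, 0.00, 0.00, 0.00, 1.00],  # from Death (absorbing state)
])

# Each row must sum to 1: from any state, the chain must go somewhere.
assert np.allclose(P.sum(axis=1), 1.0)
```

Note that the Death row places all its mass on Death itself, which is exactly the "closed loop" described earlier.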
Let’s define the distribution of $X$ at a certain time, where the state space is $S = \{\text{Mild}, \text{Moderate}, \text{Severe}, \text{Floors}, \text{Death}\}$. We need to determine the probability distribution at each time point. Take $X_0$: this is the chance that an individual starts off at time 0 in each state when admitted from the emergency department. Let $\pi$ be an $N \times 1$ vector denoting the probability distribution of $X_0$:

$$\pi = \begin{pmatrix} \pi_1 \\ \pi_2 \\ \vdots \\ \pi_N \end{pmatrix} = \begin{pmatrix} P(X_0=1) \\ P(X_0=2) \\ \vdots \\ P(X_0=N) \end{pmatrix}$$

We write $X_0 \sim \pi^T$ to denote that the distribution of $X_0$ is given by the row vector $\pi^T$ (the transpose of $\pi$).
How do we calculate the probability distribution at the next time point? Use the Partition Rule, conditioning on $X_0$:

$$P(X_1=j) = \sum_{i=1}^{N} P(X_1=j \mid X_0=i)\,P(X_0=i) = \sum_{i=1}^{N} p_{ij}\,\pi_i = \sum_{i=1}^{N} \pi_i\,p_{ij} = (\pi^T P)_j.$$

This shows that $P(X_1=j) = (\pi^T P)_j$ for all $j$. The row vector $\pi^T P$ is therefore the probability distribution of $X_1$:

$$X_0 \sim \pi^T, \qquad X_1 \sim \pi^T P.$$
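The partition-rule step above is just a vector-matrix product. A minimal sketch, using an illustrative transition matrix and a hypothetical admission distribution (neither is a clinical estimate):

```python
import numpy as np

# Illustrative transition matrix over (Mild, Moderate, Severe, Floors, Death);
# placeholder numbers, not clinical estimates.
P = np.array([
    [0.50, 0.20, 0.05, 0.20, 0.05],
    [0.20, 0.45, 0.15, 0.10, 0.10],
    [0.05, 0.20, 0.45, 0.05, 0.25],
    [0.10, 0.05, 0.02, 0.80, 0.03],
    [0.00, 0.00, 0.00, 0.00, 1.00],
])

# Hypothetical distribution of X0 at ED admission (the vector pi).
pi0 = np.array([0.4, 0.3, 0.2, 0.1, 0.0])

# Distribution of X1 is pi^T P: pre-multiply the matrix by the row vector.
pi1 = pi0 @ P
print(pi1)        # approx [0.28, 0.26, 0.157, 0.2, 0.103]
print(pi1.sum())  # 1.0 -- still a valid probability distribution
```

Iterating the same product ($\pi^T P^t$) gives the distribution at any later time $t$.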
Clinical Correlate: Say we are at our second timepoint, $t = 1$, in Moderate ARDS. What is the probability the patient had the specific trajectory Mild -> Moderate ARDS?
Now we want to describe the probability distribution after time 0. How do we get the distribution of states at time $t$?
Trajectory Probability
We can characterize the trajectory of ARDS by probabilities. For example, consider the trajectory Severe -> Moderate -> Mild -> Floors.
Of great interest will be the probability of recovering from ARDS within one day or within seven days.
Do most patients de-escalate in a predictable pattern, or is this more of a random disease with a roughly uniform distribution of trajectories?
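One way to probe that question is Monte Carlo simulation: sample many 7-day courses from the chain and see how concentrated the resulting trajectories are. The transition matrix here is an illustrative placeholder, not a fitted model:

```python
import numpy as np
from collections import Counter

STATES = ["Mild", "Moderate", "Severe", "Floors", "Death"]

# Illustrative transition probabilities (placeholders, not clinical estimates).
P = np.array([
    [0.50, 0.20, 0.05, 0.20, 0.05],
    [0.20, 0.45, 0.15, 0.10, 0.10],
    [0.05, 0.20, 0.45, 0.05, 0.25],
    [0.10, 0.05, 0.02, 0.80, 0.03],
    [0.00, 0.00, 0.00, 0.00, 1.00],
])

rng = np.random.default_rng(42)

def simulate(start: int, steps: int) -> tuple:
    """Sample one trajectory of `steps` transitions from state index `start`."""
    path = [start]
    for _ in range(steps):
        path.append(rng.choice(len(STATES), p=P[path[-1]]))
    return tuple(STATES[i] for i in path)

# 10,000 seven-day courses starting in Mild ARDS.
counts = Counter(simulate(start=0, steps=7) for _ in range(10_000))

# If trajectories were uniform, the top paths would each be rare; a heavily
# skewed Counter suggests a predictable de-escalation pattern.
for path, n in counts.most_common(3):
    print(f"{n:5d}  " + " -> ".join(path))
```

Comparing the counts of the most common paths against a uniform benchmark gives a first, informal answer to the predictability question.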
Recall that a trajectory is the sequence of values. Because of the Markov (memoryless) property, we can find the probability of any trajectory by multiplying together the starting probability and all subsequent single-step transition probabilities.
Let’s envision a 7-day course (trajectory) of ARDS.
What is the probability of Mild -> Severe -> Moderate -> Mild -> Floors -> Severe -> Death?
where the starting distribution is $X_0 \sim \pi^T$,
and the transition matrix is $P$ as defined above.
Let’s define the probability of each of the different states. Let $X_0 \sim$ (Mild), i.e., the chain starts in Mild with probability 1. What is the probability of the trajectory Mild -> Severe -> Moderate -> Mild -> Floors -> Severe -> Death?

$$P(\text{Mild, Severe, Moderate, Mild, Floors, Severe, Death}) = P(X_0=\text{Mild}) \times p_{\text{mild,severe}} \times p_{\text{severe,moderate}} \times p_{\text{moderate,mild}} \times p_{\text{mild,floors}} \times p_{\text{floors,severe}} \times p_{\text{severe,death}}$$
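That product is mechanical to compute. A sketch with illustrative placeholder probabilities and a chain that starts in Mild with certainty:

```python
import numpy as np

STATES = ["Mild", "Moderate", "Severe", "Floors", "Death"]
IDX = {s: i for i, s in enumerate(STATES)}

# Illustrative transition probabilities (not clinical estimates).
P = np.array([
    [0.50, 0.20, 0.05, 0.20, 0.05],
    [0.20, 0.45, 0.15, 0.10, 0.10],
    [0.05, 0.20, 0.45, 0.05, 0.25],
    [0.10, 0.05, 0.02, 0.80, 0.03],
    [0.00, 0.00, 0.00, 0.00, 1.00],
])

def trajectory_probability(pi0, path):
    """P(X0 = path[0]) times the product of one-step transition probabilities."""
    prob = pi0[IDX[path[0]]]
    for a, b in zip(path, path[1:]):
        prob *= P[IDX[a], IDX[b]]
    return prob

pi0 = np.array([1.0, 0.0, 0.0, 0.0, 0.0])  # X0 = Mild with probability 1
path = ["Mild", "Severe", "Moderate", "Mild", "Floors", "Severe", "Death"]
print(trajectory_probability(pi0, path))  # a very small number
```

Because every transition probability is below 1, any specific long trajectory has a tiny probability; comparisons between trajectories are more informative than the raw values.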
This is theoretical; a Markov chain describes how the states of the patient population change in the aggregate.