[R] Markov Decision Process

Lorenzo Isella lorenzo.isella at gmail.com
Sat Sep 14 12:56:35 CEST 2013


Dear All,
I am struggling with the conceptual aspects of a problem.
I am sure that someone on this list must be familiar with this.
Let's say that you have some cancer data for your patients.
In particular, every patient may undergo up to 6 cycles of therapy (hormonal or
chemotherapy), whose durations and starting times are known; the cycles may
stop earlier for various reasons. There are plenty of other data available,
but let us keep it simple for now.
At the end of the therapy cycles, you know whether the patient is dead or alive
(in reality there are more final states, since the patient may be dead with or
without cancer, or alive with or without cancer, but again, let's keep it
simple for now).
Of course, you want to develop a policy which maximizes the probability that
the patient is alive at the end of the therapy cycles.
Does anybody know how to tackle this with a Markov decision process approach?
There are so many R packages dealing with Markov chains that it is almost  
confusing for a beginner.
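For concreteness, would something along the following lines (using e.g. the
MDPtoolbox package) be a reasonable starting point? The states, transition
probabilities and rewards below are purely made up for illustration, not
estimated from any real data.

    ## Minimal finite-horizon MDP sketch with MDPtoolbox
    ## (all numbers are placeholders for illustration only)
    library(MDPtoolbox)

    ## States:  1 = alive, 2 = dead (absorbing)
    ## Actions: 1 = hormonal therapy, 2 = chemotherapy
    n_states  <- 2
    n_actions <- 2

    ## P[s, s', a]: probability of moving from s to s' under action a
    P <- array(0, dim = c(n_states, n_states, n_actions))
    P[, , 1] <- matrix(c(0.90, 0.10,   # alive -> (alive, dead), hormonal
                         0.00, 1.00),  # dead stays dead
                       nrow = 2, byrow = TRUE)
    P[, , 2] <- matrix(c(0.85, 0.15,   # alive -> (alive, dead), chemo
                         0.00, 1.00),
                       nrow = 2, byrow = TRUE)

    ## R[s, a]: immediate reward, here 1 for every cycle survived
    R <- matrix(c(1, 1,
                  0, 0), nrow = n_states, byrow = TRUE)

    ## Solve over a horizon of 6 therapy cycles (no discounting)
    sol <- mdp_finite_horizon(P, R, discount = 1, N = 6)

    sol$policy  # optimal action per state and cycle
    sol$V       # expected value per state and cycle

If I understand correctly, the richer set of final states (dead/alive, with or
without cancer) would just enlarge the state space and the transition array
accordingly, with the transition probabilities estimated from the patient data.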
Any suggestion is welcome.
Many thanks

Lorenzo


