[R] How to keep two jobs running in the same directory at the same time from corrupting each other's results:

Patrick Burns pburns at pburns.seanet.com
Fri Feb 9 12:58:02 CET 2007


You can save() the objects produced by each BATCH
job to a file whose name identifies the job.  With this
technique, you can run as many BATCH jobs on the same
data as you like.

Once the jobs are done, you can load() the files that
were saved.
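A minimal sketch of the idea, assuming each batch script can derive a unique tag for itself (here the process ID; a job number passed in by the submitting script would work just as well -- the object name `result` and the file-name pattern are purely illustrative):

```r
# Derive a unique tag for this job.  Sys.getpid() is unique among
# jobs running at the same time (an assumption; an explicit job
# number from the submitting script is equally good).
job.id <- Sys.getpid()

# ... the job's actual computation, producing 'result' ...
result <- summary(rnorm(100))

# Save under a job-specific name, so parallel jobs in the same
# directory never write to the same file.
outfile <- sprintf("result_%s.RData", job.id)
save(result, file = outfile)

# Later, one collection script loads every saved file.  Loading
# into a fresh environment keeps the jobs' objects from
# overwriting each other in the collector's workspace.
collected <- list()
for (f in list.files(pattern = "^result_.*\\.RData$")) {
    e <- new.env()
    load(f, envir = e)
    collected[[f]] <- e$result
}
```

Loading into a separate environment is optional but avoids the usual pitfall of load(), which otherwise clobbers any object of the same name already in the workspace.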

Patrick Burns
patrick at burns-stat.com
+44 (0)20 8525 0696
http://www.burns-stat.com
(home of S Poetry and "A Guide for the Unwilling S User")

Aldi Kraja wrote:

>Hi,
>
>I have a large group of jobs, some of them are running on the same 
>directory.  All of them in batch mode.
>What is the best way to keep two or more jobs running in the same 
>directory from corrupting each other's results?
>One, I would think can be to run each job in a separate directory, 
>collect the results and after remove the directories. But I have 
>thousands of jobs that will run in parallel and I have placed about 100 
>of them in each directory. They all do the same process, but on 
>different variables, replications etc.
>
>Is there any solution in R better than creating separate directories? 
>I am wondering whether R has any option to give each job a unique ID 
>with its own .RData file, even within the same directory?
>
>SAS, for example, assigns each batch job a different ID and a 
>separate temp space, so it does not mix with another job running in 
>parallel.
>
>Thanks,
>
>Aldi
>
>--
>
>______________________________________________
>R-help at stat.math.ethz.ch mailing list
>https://stat.ethz.ch/mailman/listinfo/r-help
>PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
>and provide commented, minimal, self-contained, reproducible code.
>


