[R] Running R under Mosix
B.Rowlingson at lancaster.ac.uk
Thu May 3 12:05:09 CEST 2001
Mosix is a cluster operating system implemented as a set of
kernel patches for Linux on i386 machines. It allows
processes to migrate transparently to other nodes in the
cluster. See www.mosix.org for details. However,
my R processes were refusing to migrate.
Using strace, and delving into the R source, I found
that the cause was a large number of calls to the system
setjmp and longjmp routines - the setjmp is in applyClosure()
in eval.c, and the longjmps can come from just about anywhere.
The problem is that setjmp and longjmp preserve the signal
handling context, and any change to signal handling has to
be done on the machine that the R process started on. If the
process has migrated, there is some overhead each time the
remote process communicates back to the original node. Because
R was making so many setjmp/longjmp calls, Mosix decided it
wasn't worth migrating the process (even to a machine with
twice the CPU power of the home node).
I solved the problem by redefining the SETJMP and LONGJMP
macros in src/include/Defn.h to call _setjmp and _longjmp
instead. These functions don't preserve the signal context.
The result was an R executable that migrated and got 99.9%
CPU time on the twice-as-powerful machine.
But, you R-wizards out there, is there any side-effect of
this change? The man page for setjmp says that the POSIX
standard doesn't define whether or not the signal context
is saved, so hopefully not. True or false?
Maths and Stats
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html