jeanhee.chung at yale.edu
Thu Jan 9 07:09:03 CET 2003
I'm fairly new to R so please excuse me if I am asking something obvious.
I have looked in the FAQ, Introduction, and help pages, and searched the
archives, but I don't know much about graphics yet.
I'm running Red Hat Linux (kernel 2.4.18) on a machine blessed with dual 1.5 GHz Xeon
processors and 3.7GB of RAM. I have a very large dataset with 27 variables,
and in exploring the data I want to take snapshots using pairs(). The lower
matrix and diagonal are filled with other graphics. (Please don't suggest
that I cut down the variable number! This is in fact the trimmed-down,
must-have set of variables.)
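For concreteness, a hedged sketch of the kind of pairs() call described above: the upper triangle keeps the default scatterplots while the lower triangle and diagonal are filled with other graphics. These particular panel functions (and the simulated stand-in data) are illustrative, not the poster's actual ones:

```r
# Diagonal panel: a small histogram drawn inside the panel region.
panel.hist <- function(x, ...) {
  usr <- par("usr"); on.exit(par(usr))
  par(usr = c(usr[1:2], 0, 1.5))
  h <- hist(x, plot = FALSE)
  rect(h$breaks[-length(h$breaks)], 0, h$breaks[-1],
       h$counts / max(h$counts), col = "grey")
}
# Lower panel: print the pairwise correlation instead of a scatterplot.
panel.cor <- function(x, y, ...) {
  usr <- par("usr"); on.exit(par(usr))
  par(usr = c(0, 1, 0, 1))
  text(0.5, 0.5, format(cor(x, y, use = "complete.obs"), digits = 2))
}
# Small stand-in data set; the real one has 27 variables.
mydata <- as.data.frame(matrix(rnorm(100 * 5), ncol = 5))
pairs(mydata, lower.panel = panel.cor, diag.panel = panel.hist)
```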
Of course, even with all that memory, I get a crash about 2/3 of the way
through. This is one of those cases where it's hard to troubleshoot since
everything works fine for small datasets. It is tantalizing because the
process takes over two hours to display most of the figure before the crash.
However, it seems to me that the crash has more to do with the kind of
graphics device I'm using and the size of the device. For instance, under
X11 it crashes later than with png, and right now I'm trying
bitmap to produce a png file (it hasn't crashed after a half hour now, but
there's always time for that later.) The plot also gets further along if I
set a small area for the device, but of course then the plots are
ridiculously tiny and hard to interpret. I have 729 little plots, and I'd
be satisfied if each were at least 0.75 inches on a side -- about 21 inches
for the whole device.
What can I do to increase the chances that I'll be able to produce a
viewable, printable image?
Suppose that bitmap works-- can I raise the resolution up from 72 dpi without
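Both the device size and the resolution are set when the device is opened. A minimal sketch of what was described above, assuming Ghostscript is installed (bitmap() renders through it); the file name and type="png256" are illustrative choices, and res can be raised above 72 at the cost of a much larger raster:

```r
# 27 variables give 27 x 27 = 729 panels; at 0.75 in per panel the
# device needs to be roughly 21 inches square.
n.vars   <- 27
panel.in <- 0.75
side <- ceiling(n.vars * panel.in)   # 21 inches

# Guard so the sketch is skipped where Ghostscript is unavailable.
if (nzchar(Sys.which("gs"))) {
  bitmap("pairs.png", type = "png256",
         width = side, height = side, res = 72, pointsize = 8)
  # mydata stands in for the real 27-variable data frame.
  mydata <- as.data.frame(matrix(rnorm(100 * 5), ncol = 5))
  pairs(mydata)
  dev.off()
}
```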