[R] Ubuntu vs. Windows

George N. White III aa056 at chebucto.ns.ca
Sat Apr 26 21:18:42 CEST 2008


On Tue, 22 Apr 2008, Doran, Harold wrote:

> Dear List:
>
> I am very much a Unix neophyte, but recently had an Ubuntu box installed
> in my office. I commonly use Windows XP with 3 GB RAM on my machine, and
> the Ubuntu machine is exactly the same as my Windows box (e.g.,
> processor and RAM) as far as I can tell.
>
> Now, I recently had to run a very large lmer analysis on my Windows
> machine, but was unable to due to memory limitations, even after
> increasing all the memory limits in R (which I think is a 2 GB max
> according to the R for Windows FAQ). So, to make this computationally
> feasible, I had to sample from my very big data set and then run the
> analysis. Even then, it took something on the order of 45 minutes to
> 1 hour to get parameter estimates. (BTW, SAS PROC NLMIXED was even worse
> and kept giving execution errors until the data set was very small, and
> then it ran for a long time.)
>
> However, I just ran the same analysis on the Ubuntu machine with the
> full, complete data set, which is very big and lmer gave me back
> parameter estimates in less than 5 minutes.
>
> Because I have so little experience with Ubuntu, I am quite pleased and
> would like to understand this a bit better. Does this occur because R is
> somehow a bit friendlier with Unix? Or is this occurring because Unix
> somehow has more efficient methods for memory allocation?

On the same hardware, the differences between Windows and Linux
performance are generally minor, but there are many things that
can cause very poor performance on either platform.
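
As a first check on the memory side, it is worth comparing what R itself
reports on the two machines.  A minimal base-R sketch (memory.limit() is
only meaningful in the Windows build, and "mydata" below is just a
placeholder for your data set):

    sessionInfo()            # R version, platform, 32- vs 64-bit build
    if (.Platform$OS.type == "windows") memory.limit()  # max MB on Windows
    gc()                     # memory currently in use and GC trigger levels
    ## object.size(mydata)   # rough size of your data set ("mydata" is a placeholder)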

> I wish I knew enough to even ask the right questions. So, I welcome any
> enlightenment members may add.

I have seen very big differences in performance on computational
benchmarks for hardware with similar basic specifications (CPU type and
clock, RAM, etc.).  Often the difference is a symptom of broken hardware
or some misconfiguration.  Do you see the same difference in performance
in other applications?  Here are some things to consider:

1. anti-virus scanning and other background tasks -- I've seen
    systems configured to scan gigabyte network drives.  Windows
    Task Manager and Linux "top", etc. can give an idea of what
    is using a lot of CPU, but they are not so helpful if the issue
    involves I/O bottlenecks.

2. incorrect hardware configuration in the system BIOS.  This happens
    far too often, even with big-name vendors.  I like to run some
    benchmarks on every new system to make sure there aren't any
    basic configuration errors, and to keep the results as a reference
    if I suspect problems after the systems have been in use (a simple
    sketch of such a check appears after this list).

3. network problems.  Where I work, some PCs (both Linux and Windows)
    get the Ethernet duplex setting wrong when booted.  This can
    result in poor performance when using networked disks, without
    other symptoms.  On Windows, the "repair network connection"
    button often clears the problem.  On Linux, ethtool can display
    and change Ethernet settings.

4. all sorts of hardware issues -- sometimes useful data appear in
    the system logs.  Use the Event Viewer on Windows; look at
    /var/log/messages and /var/log/dmesg on Linux.

5. does the slow system exhibit a lot more disk activity?  Sometimes
    this is hard to detect, but most systems do provide some statistics.
    Try running an I/O-intensive benchmark at the same time your R
    job is running (the sketch after this list includes a rough disk test).
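
As a rough example of the kind of benchmark meant in points 2 and 5,
something along these lines (plain base R, arbitrary sizes) can be pasted
into both machines and the elapsed times compared:

    ## CPU check: time a dense matrix multiply -- mostly processor/BLAS
    ## work, very little I/O.
    set.seed(1)
    n <- 1000
    A <- matrix(rnorm(n * n), n, n)
    system.time(A %*% A)

    ## Disk check: write and re-read roughly 80 MB of doubles.  tempfile()
    ## usually points at a local disk; substitute a path on the networked
    ## drive to exercise the network instead.
    x <- rnorm(1e7)
    f <- tempfile()
    system.time(writeBin(x, f))
    system.time(y <- readBin(f, what = "double", n = length(x)))
    unlink(f)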

-- 
George N. White III  <aa056 at chebucto.ns.ca>


