[Rd] Performance issues with R2.15.1 built from source on Solaris

Eberle, Anthony aeber at allstate.com
Fri Aug 10 18:50:04 CEST 2012

Yes, the T4 physical host that the zone is configured on is SPARC.  Are
you saying that this is a function of the hardware, or would updating to
a newer version of gcc possibly help?  I'm actually going to see if I
can build a newer version of gcc anyway, just to eliminate that variable.

I picked the "sd" and "R-Benchmark" tests as I am new to R and
functioning more as an administrator than a modeler or statistical
engineer.  I, too, have seen that some operations perform better on
certain systems than others.  This makes sense, as different hardware
and CPUs are better at some things than others.  However, I'm not
familiar enough with R yet to know what the functions are doing under
the hood and which would perform better on which hardware.

I will, however, check the specs for the T4 and see if anything is
mentioned about "long double" support or something along these lines.
Thanks.


-----Original Message-----
From: r-devel-bounces at r-project.org
[mailto:r-devel-bounces at r-project.org] On Behalf Of Radford Neal
Sent: Friday, August 10, 2012 9:49 AM
To: r-devel at r-project.org
Subject: [Rd] Performance issues with R2.15.1 built from source on Solaris

Is this a SPARC system?  On at least some SPARC systems, the "long
double" type in C is implemented very slowly in software, and it seems
that it is used for the sums done when calculating standard deviations
with "sd".

   Radford Neal

> Date: Wed, 8 Aug 2012 18:55:37 -0500
> From: "Eberle, Anthony" <aeber at allstate.com>
> To: <r-devel at r-project.org>
> Subject: [Rd] Performance issues with R2.15.1 built from source on
> 	Solaris?

> I have a question about building R (2.15.1) from source on a Solaris 
> 10 64-bit virtual server with 2 cores and 16GB memory that is running 
> on an Oracle T4 server.  Based on several tests I have done, this 
> configuration has been several orders of magnitude slower than any 
> other configuration I've tested.
> A simple test of some code to calculate the standard deviation 10000 
> times (simple code to consume resources) takes on average 121.498 
> seconds on the Solaris server, whereas the next-worst system (Red Hat 
> Linux) takes 1.567 seconds:
