[R] Improving data processing efficiency

Gabor Grothendieck ggrothendieck at gmail.com
Fri Jun 6 20:32:47 CEST 2008


On Fri, Jun 6, 2008 at 2:28 PM, Greg Snow <Greg.Snow at imail.org> wrote:
>> -----Original Message-----
>> From: r-help-bounces at r-project.org
>> [mailto:r-help-bounces at r-project.org] On Behalf Of Patrick Burns
>> Sent: Friday, June 06, 2008 12:04 PM
>> To: Daniel Folkinshteyn
>> Cc: r-help at r-project.org
>> Subject: Re: [R] Improving data processing efficiency
>>
>> That is going to be situation dependent, but if you have a
>> reasonable upper bound, then that will be much easier and not
>> far from optimal.
>>
>> If you pick the possibly too small route, then increasing the
>> size in largish junks is much better than adding a row at a time.
>
> Pat,
>
> I am unfamiliar with the use of the word "junk" as a unit of measure for data objects.  I figure there are a few different possibilities:
>
> 1. You are using the term intentionally, meaning that you suggest he increase the size in terms of old cars and broken pianos rather than used-up pens and broken pencils.
>
> 2. This was a Freudian slip based on your opinion of some datasets you have seen.
>
> 3. Somewhere between your mind and the final product, "jumps/chunks" became "junks" (possibly a Microsoft "correction", or just typing too fast combined with number 2).
>
> 4. "junks" is an official measure of data/object size that I need to learn more about (the history of the term possibly being related to 2 and 3 above).
>

5. Chinese sailing vessel.
http://en.wikipedia.org/wiki/Junk_(ship)
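
For anyone who lands on this thread while searching for the actual
efficiency question rather than the nautical tangent: below is a minimal
sketch of the three growth strategies Pat describes (preallocate to an
upper bound, grow in largish chunks, grow a row at a time). The sizes,
chunk width, and function names are purely illustrative, not from the
thread:

n <- 10000

## 1. Preallocate to a reasonable upper bound -- fastest.
grow_prealloc <- function(n) {
  x <- numeric(n)                 # full size known (or bounded) up front
  for (i in seq_len(n)) x[i] <- i
  x
}

## 2. When the bound might be too small, extend in largish chunks.
grow_chunks <- function(n, chunk = 1000) {
  x <- numeric(chunk)
  for (i in seq_len(n)) {
    if (i > length(x)) x <- c(x, numeric(chunk))  # grow by a chunk, not a row
    x[i] <- i
  }
  x[seq_len(n)]                   # trim any unused tail
}

## 3. Append one element at a time -- every c() copies the whole vector.
grow_rowwise <- function(n) {
  x <- numeric(0)
  for (i in seq_len(n)) x <- c(x, i)
  x
}

system.time(grow_prealloc(n))
system.time(grow_chunks(n))
system.time(grow_rowwise(n))

The row-at-a-time version copies the whole vector on every append, so the
total work grows quadratically with n; chunked growth cuts the number of
copies by a factor of the chunk size, and preallocation avoids them
entirely.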


