[Rd] Is it a good choice to increase the NCONNECTION value?

Martin Maechler maechler at stat.math.ethz.ch
Tue Aug 24 22:53:50 CEST 2021


>>>>> GILLIBERT, Andre 
>>>>>     on Tue, 24 Aug 2021 09:49:52 +0000 writes:

  > RConnection is a pointer to an Rconn structure. The Rconn
  > structure must be allocated independently (e.g. by
  > malloc() in R_new_custom_connection).  Therefore,
  > increasing NCONNECTION to 1024 should only use 8
  > kilobytes on 64-bit platforms and 4 kilobytes on 32-bit
  > platforms.

You are right indeed, and I was wrong.
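
For concreteness, here is a minimal C sketch of that memory argument
(illustrative only, not R's actual connections.c; the structure fields
are made up):

  #include <stdio.h>
  #include <stdlib.h>

  #define NCONNECTION 1024          /* proposed larger table size */

  typedef struct Rconn_ {           /* stand-in for R's struct Rconn */
      char *description;
      /* ... many more fields in the real structure ... */
  } Rconn, *Rconnection;

  static Rconnection Connections[NCONNECTION];  /* pointers only */

  int main(void)
  {
      /* 1024 * sizeof(pointer): 8 KB on 64-bit, 4 KB on 32-bit */
      printf("table size: %zu bytes\n", sizeof Connections);

      /* an Rconn is only allocated when a connection is opened */
      Connections[3] = malloc(sizeof(Rconn));
      free(Connections[3]);
      return 0;
  }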

  > Ideally, it should be dynamically allocated: either as
  > a linked list or as a dynamic array
  > (malloc/realloc). However, a simple change of
  > NCONNECTION to 1024 should be enough for most uses.
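
As an aside, the "dynamic array" variant mentioned above could be
sketched with realloc() along the following lines; the names are made
up and this is not R's internal code:

  #include <stdio.h>
  #include <stdlib.h>

  typedef struct Rconn_ Rconn, *Rconnection;

  static Rconnection *Connections = NULL;  /* growable pointer table */
  static size_t n_alloc = 0;               /* current capacity */

  /* return a free slot index, growing the table on demand; -1 on failure */
  static long get_connection_slot(void)
  {
      for (size_t i = 0; i < n_alloc; i++)
          if (Connections[i] == NULL) return (long) i;

      size_t old = n_alloc, new_alloc = n_alloc ? 2 * n_alloc : 128;
      Rconnection *tmp = realloc(Connections, new_alloc * sizeof *tmp);
      if (tmp == NULL) return -1;
      for (size_t i = old; i < new_alloc; i++) tmp[i] = NULL;
      Connections = tmp;
      n_alloc = new_alloc;
      return (long) old;               /* first slot of the new region */
  }

  int main(void)
  {
      long slot = get_connection_slot();
      printf("first free slot: %ld (capacity %zu)\n", slot, n_alloc);
      return 0;
  }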

There is one other important problem I've been made aware of
(similar to the limit on the number of open DLLs, an issue from 1-2
years ago):

The OS itself has limits on the number of open files
(yes, I know that there are connections other than files), and
these limits may differ quite a bit from platform to platform.

On my Linux laptop, in a shell, I see

  $ ulimit -n
  1024

which barely accommodates your proposed NCONNECTION of 1024.
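
For reference, the C-level equivalent of `ulimit -n' on POSIX systems
is getrlimit(RLIMIT_NOFILE, ...); a small self-contained example (not
code from R itself):

  #include <stdio.h>
  #include <sys/resource.h>

  int main(void)
  {
      struct rlimit rl;
      if (getrlimit(RLIMIT_NOFILE, &rl) != 0) {
          perror("getrlimit");
          return 1;
      }
      /* rlim_cur is the soft limit the process runs under,
         rlim_max the hard ceiling it could raise the soft limit to */
      if (rl.rlim_cur == RLIM_INFINITY)
          printf("soft limit: unlimited\n");
      else
          printf("soft limit: %llu open files\n",
                 (unsigned long long) rl.rlim_cur);
      return 0;
  }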

Now, if NCONNECTION is larger than the maximum allowed number of
open files and R opens more files than the OS allows, the
user may see quite unpleasant behaviour, e.g. R being terminated brutally
(or behaving erratically) without a good R-level warning or error message.

It's also not at all sufficient to check the open-files
limit at compile time; it would have to be checked at R process startup time.
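
A hypothetical startup-time check might clamp the effective table size
against the soft limit, for example (this is a sketch of the idea, not
anything R currently does; NCONNECTION and the headroom value are
assumptions):

  #include <stdio.h>
  #include <sys/resource.h>

  #define NCONNECTION 1024   /* compile-time upper bound (assumed) */
  #define FD_HEADROOM 64     /* reserve for stdio, DLLs, sockets, ... */

  /* effective table size = min(NCONNECTION, soft fd limit - headroom) */
  static int effective_nconnection(void)
  {
      struct rlimit rl;
      long limit = NCONNECTION;

      if (getrlimit(RLIMIT_NOFILE, &rl) == 0 && rl.rlim_cur != RLIM_INFINITY) {
          long avail = (long) rl.rlim_cur - FD_HEADROOM;
          if (avail > 0 && avail < limit) limit = avail;
      }
      return (int) limit;
  }

  int main(void)
  {
      printf("effective NCONNECTION: %d\n", effective_nconnection());
      return 0;
  }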

So this may need considerably more work than you / we had
hoped, and it's probably hard to find a safe number that is
considerably larger than 128 and still below the smallest
open-files limit among all non-exotic platforms.

  > Sincerely
  > André GILLIBERT

  [............]


