[R] can I call user-created functions without source() ?

Joerg van den Hoff j.van_den_hoff at fz-rossendorf.de
Mon Jun 19 16:19:51 CEST 2006


Duncan Murdoch wrote:
> Just a few comments below on alternative ways to do the same things:
> 
> On 6/19/2006 8:19 AM, Joerg van den Hoff wrote:
> 
>> for short-term usage of some specialized functions I have added some 
>> lines to the `.Rprofile' in my home(!) directory as follows (probably 
>> there are smarter solutions, but at least it works):
>>
>> #source some temporary useful functions:
>> fl <- dir(path='~/rfiles/current',patt='.*\\.R$',full.names=TRUE)
>> for (i in fl) {cat(paste('source("',i,'")\n',sep="")); source(i)}
>> rm(i,fl)
> 
> Another way to do this without worrying about overwriting some existing 
> variables is
> 
> local({
> fl <- ...
> for (i in fl) ...
> })

> 
> No need to remove fl and i at the end; they were created in a temporary 
> environment, which was deleted at the end.
> 
sure, that's better (just one more case where I didn't know of the 
existence of a certain function). but what is the difference (with 
regard to scope) between `i' or `fl' and the functions defined via 
sourcing? aren't both defined within `local'? why _are_ the functions 
visible in the workspace? probably I again don't understand the 
`eval'/environment intricacies.
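
(answering my own question after a quick test: the point seems to be 
that `source' with its default `local=FALSE' always evaluates the file 
in the global environment, no matter where the call to `source' sits, 
whereas `fl' and `i' are ordinary assignments inside the temporary 
environment of `local'.) a small illustration (the names and the 
textConnection() stand-in for a real file are invented):

local({
    aLocalThing <- 1                              # stays in local()'s environment
    source(textConnection("aSourcedThing <- 2"))  # source() default local=FALSE: goes to .GlobalEnv
})
exists("aLocalThing")     # FALSE
exists("aSourcedThing")   # TRUE
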
>>
>> here, I have put all the temporary stuff in a single dedicated dir 
>> `~/rfiles/current', but of course you can use several dirs in this 
>> way. all files in this dir with names ending in `.R' are sourced on 
>> startup of R. this roughly works like one of the directories on 
>> MATLAB's search path: every function definition in this directory is 
>> 'understood' by R (but everything is loaded into the workspace on 
>> startup, no matter whether you really need it in the end: no real 
>> `load on demand'). 
> 
> It's possible to have load on demand in R, and this is used in packages. 
>  It's probably not worth the trouble to use it unless you're using a 
> package.
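
for completeness: a poor man's `load on demand' outside a package seems 
possible with delayedAssign(); only a sketch, and the file name and the 
function name `myfun' are invented:

## source the file only when `myfun' is first used
delayedAssign("myfun", local({
    e <- new.env()
    source("~/rfiles/current/myfun.R", local = e)   # file is expected to define myfun
    e$myfun
}))
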
> 
>> one important difference, though: this is only sensible for function 
>> definitions, not scripts ('executable programs', which would otherwise 
>> be executed directly on R startup).
>> and, contrary to matlab/octave, this is not dynamic: everything is 
>> read in at startup; later modifications to the directories are not 
>> recognized without explicitly sourcing the files again.
> 
> There isn't really any reasonable way around this.  I suppose some hook 
> could be created to automatically read the file if the time stamp 
> changes, but that's not really the R way of doing things:  generally in 
> R active things are in the workspace, not on disk.  A good way to work 
> is to prepare things on disk, then, when they are ready, explicitly 
> import them into R.
> 
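
just to sketch what such a check might look like (a small helper, 
nothing built into R; the name `sourceIfChanged' is invented):

## re-source a file only if it changed since the last call (sketch only)
sourceIfChanged <- local({
    seen <- list()                            # file -> modification time at last source()
    function(file) {
        m <- file.info(file)$mtime
        if (is.null(seen[[file]]) || m > seen[[file]]) {
            cat("sourcing", file, "\n")
            source(file)                      # default local=FALSE: definitions go to the workspace
            seen[[file]] <<- m
        }
        invisible(NULL)
    }
})
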
>>
>> if, in addition, you want to load definitions from the startup 
>> directory where you launch R (your project dir), the above could be 
>> modified to:
>>
>> #source some temporary useful functions from startup dir:
>> fl <- dir(path=getwd(),patt='.*\\.R$',full.names=TRUE)
>> for (i in fl) {cat(paste('source("',i,'")\n',sep="")); source(i)}
>> rm(i,fl)
>>
>> in this way you at least don't need a separate `.Rprofile' in each 
>> project dir.
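
combining this with the local() idea from above, the snippet for the 
startup directory becomes (just the same loop, wrapped):

local({
    fl <- dir(path = getwd(), patt = '.*\\.R$', full.names = TRUE)
    for (i in fl) {
        cat(paste('source("', i, '")\n', sep = ""))
        source(i)     # default local=FALSE, so the functions end up in the workspace
    }
})
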
> 
> Another alternative if you want something special in the project is to 
> create a .Rprofile file there, and put source("~/.Rprofile") into it, so 
> both the local changes and the general ones get loaded.
> 
> Duncan Murdoch
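
i.e. a per-project .Rprofile could then be as small as this (the second 
source() line is just an example, the file name is invented):

source("~/.Rprofile")           # first pull in the general profile from the home directory
source("projectSpecific.R")     # then whatever is special to this project
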
>>
>>
>>
>> joerg
>>
>> ______________________________________________
>> R-help at stat.math.ethz.ch mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-help
>> PLEASE do read the posting guide! 
>> http://www.R-project.org/posting-guide.html
>


