[R] problem with package development and older defs earlier in search order

Martin J Reed mjreed at essex.ac.uk
Sat Nov 10 22:05:58 CET 2012


Rolf,

Re version control: I use SVN and Git, depending on the project I am working on and what others are using. Years ago I used RCS; as you say, it's great for a local repository (as is Git). The point I was making was not about version control, but that others like me might get caught out by saving the environment on session quit, with that saved environment holding older versions of a function definition. I appreciate that you can avoid this by never saving a session (though I suspect I am not the only one who finds saving a session useful).
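To illustrate the masking, here is a minimal sketch using base::mean purely as a stand-in for a package function (any same-named object in the workspace behaves the same way):

```r
## An object in .GlobalEnv masks a same-named function from an attached
## package, because .GlobalEnv comes first on the search path.
mean <- function(x, ...) "stale workspace copy"   # masks base::mean
mean(1:10)        # returns "stale workspace copy", not 5.5
find("mean")      # ".GlobalEnv" is listed before "package:base"
rm(mean)          # removing the stale copy restores the package version
mean(1:10)        # 5.5 again
```

This is exactly why loading a newer package build does not "fix" anything while a saved copy of the function still sits in .GlobalEnv.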

The point of my "shaganappi" (great, and very apt, word!) strategy was this: if you are developing code and debugging with fast change/recompile/test cycles, you probably do not want to rebuild the package and load it into R every time; it is much quicker to just source() the R files and dyn.load() the compiled library. It took me a few hours of testing to realise that doing this is a problem if you save the session: when you eventually produce stable code and test loading the package, loading does not overwrite those older versions, and the package sits later in the search order than .GlobalEnv. The functions I listed are simply useful to me when debugging package loading. Also, some of the people I shared my earlier (non-packaged) version of the code with need to clean the old versions out of their saved sessions; hence a utility function to do this. If you never save a session then this is clearly unnecessary, but I have a number of users with saved sessions who need this fix.
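For anyone curious, the quick edit/test loop looks roughly like this (the file and library names here are hypothetical placeholders, not the actual layout of my package):

```r
## Hedged sketch of the fast edit/test loop: reload only the changed
## pieces instead of rebuilding and reinstalling the whole package.
r_file  <- file.path("R", "graph-utils.R")
so_file <- file.path("src", paste0("reedgraph", .Platform$dynlib.ext))
if (file.exists(r_file))  source(r_file)     # re-source the edited R code
if (file.exists(so_file)) dyn.load(so_file)  # reload the compiled library
## Remember to dyn.unload(so_file) before recompiling, otherwise the old
## shared library stays mapped into the running R session.
```

The catch, as described above, is that source() puts the functions in .GlobalEnv, where a saved session will happily preserve them.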

I quite understand that doing what I suggested in .onAttach would not be acceptable in any production code, and I should have made that clearer in my email; thanks for pointing it out for other readers. If I do submit this to CRAN I would of course not include these functions. I still have quite a lot to learn before I would feel confident about doing that anyway…

Thanks again for the helpful comments and guidance on good practice.

Regards,

Martin


On 10 Nov 2012, at 02:50, Rolf Turner <rolf.turner at xtra.co.nz> wrote:

> On 10/11/12 12:08, Martin J Reed wrote:
>> Rolf and Duncan
>> 
>> Many thanks. Your answers pointed me to a refinement that is closer to what I want:
>> 
>>   rm(list=intersect(ls(".GlobalEnv"),ls("package:reedgraph")),
>>      pos=".GlobalEnv")
>> 
>> This only removes items that are "masked" by GlobalEnv from my package.
>> 
>> As this is a bit long for some of the people who need to update their workspaces, I have created a function to fix it:
>> 
>> <packagename>.update2package  <- function() {
>>   rm(list=intersect(ls(".GlobalEnv"),ls("package:<packagename>")),
>>      pos=".GlobalEnv")
>> }
>> 
>> Just for completeness (if anyone else reads this). It is possible to make this happen automatically at package load using
>> 
>> .onAttach <- function(libname, pkgname) { <packagename>.update2package() }
>> 
>> As Duncan says, this is REALLY bad practice, but it is useful to me while debugging….
> 
> I really don't see it as being at all useful.  What is the point, in terms of package
> development, of keeping copies of those functions in the global environment if
> you are going to remove them whenever you load the package?
> 
> It sounds to me like you need to implement some system of version control,
> such as subversion (svn).  Personally I use rcs --- simple enough for the simple
> minded such as my very good self to use, and amply adequate for my needs.
> 
> A version control system allows you to "backtrack" if a revision that you make
> to a bit of software turns out to be undesirable.  It will do that much more
> effectively IMHO than your current shaganappi strategy.
> 
> Finally you should note that CRAN policies expressly forbid the sort of thing that you
> propose doing with your .onAttach() function,  should you ever be inclined to submit
> your package to CRAN.
> 
>    cheers,
> 
>        Rolf
