[R] Simple parallel for loop

R. Michael Weylandt michael.weylandt at gmail.com
Tue May 15 08:30:39 CEST 2012


Perhaps mcmapply from the parallel package? It's a parallel mapply to
complement mclapply.
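
For example, something along these lines might work with the code from
your first message (an untested sketch that reuses your own GetFileList,
plotsCreate and mainFolder; note that the mc* functions rely on forking,
so this will not run in parallel on Windows):

library(parallel)

inputForFunction <- expand.grid(caseList, filterList)

## one call per (case, filter) combination, spread over up to 16 cores;
## the return value is discarded since plotsCreate only writes PDFs
mcmapply(function(case, filter) {
             FileList <- GetFileList(flag = case)
             plotsCreate(Folder = mainFolder, case = case,
                         DataList = FileList, DataFilter = filter)
             NULL
         },
         as.vector(inputForFunction$Var1),
         as.vector(inputForFunction$Var2),
         SIMPLIFY = FALSE, mc.cores = 16)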

Michael

On Tue, May 15, 2012 at 2:28 AM, Alaios <alaios at yahoo.com> wrote:
> Thanks Michael,
> one last thing I am still trying to figure out: the function I call takes
> two input arguments, while foreach seems to work with only one, and to
> execute only one command per iteration while I need two: ReadDataSet
> (based on the iteration number) and then PlotFunction (based on two
> inputs that depend on the iteration number).
>
> I am not quite sure how to pass two input arguments to my function with
> foreach.
>
> I will also look at mclapply, but at least for now it too appears to pass
> only one input argument to my function.
>
> Cheers
> Alex
>
> ________________________________
> From: R. Michael Weylandt <michael.weylandt at gmail.com>
> To: Alaios <alaios at yahoo.com>
> Cc: R help <R-help at r-project.org>
> Sent: Tuesday, May 15, 2012 8:24 AM
> Subject: Re: [R] Simple parallel for loop
>
> I haven't actually used foreach very much myself, but I would imagine
> that you could just take advantage of the fact that most plot
> functions return their arguments silently and then just throw the
> results away (i.e., don't assign them).
>
> Switching %do% to %dopar% automatically activates parallelization
> (dopar being "do in parallel").
>
> I believe you decide the number of cores to use when you set up your
> parallel backend (either multicore or snow).
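>
> For example, with the multicore backend registered via doMC (an untested
> sketch that reuses the names from your own code further down):
>
> library(foreach)
> library(doMC)
> registerDoMC(cores = 16)   # this is where you pick the number of cores
>
> foreach(i = seq_len(nrow(inputForFunction))) %dopar% {
>     FileList <- GetFileList(flag = as.vector(inputForFunction$Var1[i]))
>     plotsCreate(Folder = mainFolder,
>                 case = as.vector(inputForFunction$Var1[i]),
>                 DataList = FileList,
>                 DataFilter = as.vector(inputForFunction$Var2[i]))
>     NULL   # nothing worth combining; the PDFs are the side effect you want
> }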
>
> Hope this helps,
> Michael
>
> On Tue, May 15, 2012 at 2:20 AM, Alaios <alaios at yahoo.com> wrote:
>> Hello Michael,
>> thanks for the answer; it looks like the foreach package might do what
>> I want. A few comments though:
>>
>> The foreach loop asks for a way to combine the results, but I do not want
>> any. After I load a dataset, the subsequent function does the plotting and
>> saves the files as PDFs, nothing more.
>>
>> What is the difference between %do% and %dopar%? They look essentially
>> the same to me.
>>
>> I also do not see any way to control the number of cores used, e.g. to
>> restrict it to 4, 8 or 16.
>>
>> Regards
>> Alex
>>
>> ________________________________
>> From: R. Michael Weylandt <michael.weylandt at gmail.com>
>> To: Alaios <alaios at yahoo.com>
>> Cc: R help <R-help at r-project.org>
>> Sent: Tuesday, May 15, 2012 8:00 AM
>> Subject: Re: [R] Simple parallel for loop
>>
>> Take a look at foreach() and %dopar% from the CRAN package foreach.
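>>
>> The basic pattern looks something like this (a minimal sketch; you still
>> need to register a parallel backend such as doMC or doSNOW first):
>>
>> library(foreach)
>> results <- foreach(i = 1:10) %dopar% sqrt(i)   # returns a list by default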
>>
>> Michael
>>
>> On Tue, May 15, 2012 at 1:57 AM, Alaios <alaios at yahoo.com> wrote:
>>> Dear all,
>>> I am having a for loop that iterates a given number of measurements that
>>> I
>>> would like to split over 16 available cores. The code is in the following
>>> format
>>>
>>> inputForFunction <- expand.grid(caseList, filterList)
>>> for (i in c(1:length(inputForFunction$Var1))) {
>>>     FileList <- GetFileList(flag = as.vector(inputForFunction$Var1[i]))
>>>     print(sprintf("Calling the plotsCreate for %s and %s",
>>>                   as.vector(inputForFunction$Var1[i]),
>>>                   as.vector(inputForFunction$Var2[i])))
>>>     plotsCreate(Folder = mainFolder,
>>>                 case = as.vector(inputForFunction$Var1[i]),
>>>                 DataList = FileList,
>>>                 DataFilter = as.vector(inputForFunction$Var2[i]))
>>> }
>>>
>>> As you can see, after inputForFunction is calculated, my code iterates
>>> over the available combinations of caseList and filterList. It would be
>>> great to split these "tasks" across all the available processors
>>> without major changes.
>>>
>>> Is there some way to do that?
>>>
>>> Regards
>>> Alex
>>>
>>
>>
>
>


