[R] Efficient way of loading files in R
pro|@@m|t@m|tt@| sending from gmail.com
Fri Sep 7 12:11:58 CEST 2018
getGEO() seems to be a custom routine. Import the file in a reader and
confirm that it's a CSV file from Excel. If this is non-standard input,
the custom subroutine may be introducing new constraints. Usually R has
no problem until the workspace reaches about 1 GB.
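With larger GEO series, a frequent failure mode is the download timing out rather than memory running out. A minimal sketch of two things worth trying, assuming getGEO() comes from the Bioconductor GEOquery package as the original post suggests (the timeout value below is illustrative, not a recommendation):

```r
library(GEOquery)

## download.file() defaults to a 60-second timeout, which a ~40 MB
## series matrix file can easily exceed; raise it before calling getGEO()
options(timeout = 600)  # seconds; illustrative value

## cache the download in a known directory so a retry does not
## re-fetch the whole file
gseList <- getGEO('GSE76896', destdir = tempdir(), GSEMatrix = TRUE)

## getGEO() returns a list of ExpressionSet objects (one per platform);
## extract the first with [[1]]
gseEset2 <- gseList[[1]]
```

If the download itself succeeds but object creation still fails, checking available memory (e.g. with gc()) would be the next step.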
On Fri 7 Sep, 2018, 15:38 Deepa, <deepamahm.iisc using gmail.com> wrote:
> I am using a bioconductor package in R.
> The command that I use reads the contents of a file downloaded from a
> database and creates an expression object.
> The syntax works perfectly fine when the input file is about 10 MB, but
> when the file size is around 40 MB the object isn't created.
> Is there an efficient way of loading a large input file to create the
> expression object?
> This is my code,
> gseEset1 <- getGEO('GSE53454')[[1]]  # file size ~10 MB; loads fine
> gseEset2 <- getGEO('GSE76896')[[1]]  # file size ~40 MB
> ## gseEset2 doesn't load, and the object isn't created
> Many thanks
> R-help using r-project.org mailing list -- To UNSUBSCRIBE and more, see
> PLEASE do read the posting guide
> and provide commented, minimal, self-contained, reproducible code.
Pursuing Ph.D. in Finance and Accounting
Indian Institute of Management, Lucknow
Visit my SSRN author page:
* Top 10% Downloaded Author on SSRN
Mob: +91 7525023664