[R] Sequentially serializing large objects

Kevin Wang kdwang at google.com
Wed May 20 07:21:16 CEST 2015

Hi r-help, I've been having some issues serializing large objects over a
socket. I can reproduce the issue with the following two R sessions:

Instance 1:
> conn <- socketConnection("localhost", 34533, server = TRUE, open = "w+b")
> for (i in 1:10) { serialize(1:2e9, conn) }

Instance 2:
> conn <- socketConnection("localhost", 34533, open = "r+b")
> for (i in 1:10) { unserialize(conn) }

This gives me

Error in unserialize(conn) : error reading from connection

in the second process, and if I ignore the error, subsequent unserialize()
calls result in

Error in unserialize(conn) : unknown input format

I'm running R 3.1.1. How can I unserialize the objects properly? Neither of
these error messages is particularly helpful. I've tried serializing to a
raw vector and then writing it to the connection a chunk at a time, but for
some reason the subsetting makes R's memory usage balloon well above a full
copy of the large object. writeBin() also seems to ignore its size
argument, so that doesn't work either.
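For concreteness, here is roughly the chunked approach I tried. The 8-byte
length prefix is just illustrative framing I added so the reader knows how
many bytes to expect; it is not something serialize() provides:

```r
## Serialize once to a raw vector, then write fixed-size slices,
## preceded by a simple length prefix (written as a double, since
## a serialized large object can exceed .Machine$integer.max bytes).
send_chunked <- function(obj, conn, chunk_size = 1024L * 1024L) {
  buf <- serialize(obj, NULL)           # one full serialized copy
  total <- length(buf)
  writeBin(as.numeric(total), conn)     # 8-byte length prefix
  pos <- 1L
  while (pos <= total) {
    end <- min(pos + chunk_size - 1L, total)
    writeBin(buf[pos:end], conn)        # NB: buf[pos:end] allocates a copy
    pos <- end + 1L
  }
}

recv_chunked <- function(conn) {
  total <- readBin(conn, "double", n = 1L)
  buf <- readBin(conn, "raw", n = total) # read the full payload back
  unserialize(buf)
}
```

This is where the memory blow-up seems to come from: each `buf[pos:end]`
subset allocates a fresh copy of that chunk on top of `buf` itself. Note
also that on a real socket readBin() can return fewer bytes than requested,
so the receiver would really need to loop until all `total` bytes arrive.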


Kevin Wang | Software Engineer | kdwang at google.com | 248.327.3647

