[Rd] How useful could a fast, embedded database be for the R community?
elw at stderr.org
Wed Dec 24 18:48:48 CET 2014
I believe in patches and working code.
You're proposing to compete with the likes of SQLite and Berkeley DB
-- no small competition, and both have excellent performance characteristics
when used properly.
You also used the 'b'illions word in reference to data sets - really?
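For context on that competition: SQLite already provides exactly this embedded, in-process model (no server, data in a file or in memory), and is usable from R via the RSQLite package. A minimal sketch of the fast seek/insert/delete pattern, using Python's standard `sqlite3` binding; the table and column names are purely illustrative:

```python
import sqlite3

# Embedded, in-process database: no server, data lives in a file or in memory.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE points (id INTEGER PRIMARY KEY, value REAL)")

# Insert: bulk-load rows inside a single transaction for speed.
with conn:
    conn.executemany(
        "INSERT INTO points (id, value) VALUES (?, ?)",
        ((i, i * 0.5) for i in range(10_000)),
    )

# Seek: a primary-key lookup is a B-tree search, not a full scan.
row = conn.execute("SELECT value FROM points WHERE id = ?", (1234,)).fetchone()
print(row[0])  # 617.0

# Delete: uses the same indexed access path.
with conn:
    conn.execute("DELETE FROM points WHERE id = ?", (1234,))
print(conn.execute("SELECT COUNT(*) FROM points").fetchone()[0])  # 9999
```

A new engine would need to beat this kind of indexed access, which is already close to the machine, to justify switching.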
On Tue, Dec 23, 2014 at 11:31 AM, joanv <joan.iglesias at live.com> wrote:
> Dear all,
> I'm developing a new database with the ability to perform very fast seek,
> insert, and delete operations. It is also able to compare datasets very
> quickly. It has been designed to work embedded in programs written in R,
> Fortran, C++, etc.
> It can efficiently manage billions of numeric datasets on a single machine.
> Right now I do not know in which fields of the R community such a database
> could be helpful, or whether there is a need for this capability in the R
> community at all.
> Could someone help me with this topic? Partners for the project are also
> wanted, especially R experts, or experts on other kinds of calculation
> programs (VASP, Gaussian, etc.)
> Regards and thank you.
> R-devel at r-project.org mailing list
More information about the R-devel mailing list