How to subset RData file that is too large for memory?
I have an RData file that has become too large to load on my computer using the load()
command. It contains a data frame with ~3 million observations and ~100 variables.
I want to subset it, save the smaller result to a new file, and then use load() to load the condensed file and resume regular R operations. How should I go about this?
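One common approach, assuming you can temporarily get access to a machine (or a cloud instance) with enough RAM to hold the full object once: load the file there, keep only the variables and rows you need, and write the smaller object back out. A minimal sketch along those lines; the file name, object name, and the columns/rows kept below are placeholders, not anything from your data:

```r
# Run on a machine with enough memory to hold the full data frame once.
# "bigdata.RData" is assumed to contain a data frame named `df`.
load("bigdata.RData")

# Keep only the rows and variables you actually need (placeholder names)
df_small <- df[df$year >= 2010, c("id", "year", "outcome")]

# saveRDS() stores a single object; read it back with readRDS().
# save(df_small, file = "smalldata.RData") also works if you prefer load().
saveRDS(df_small, "smalldata.rds")

# Later, on the memory-constrained machine:
df_small <- readRDS("smalldata.rds")
```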
Last week Jared Lander (author of the book "R for Everyone") helped us create an R tint on a bigboards.io hex. We installed RStudio Server on it, and we use the full storage and processing capacity of the hex. Ceph is used to distribute the data across the nodes.
Anyway, it feels like a solution for your question.