Estimate required memory for a process in R

I have a dataframe with around 1.5 million participants and 245 variables. I need to run the following code:

library(dplyr)
library(tidyr)

data %>%
  gather(Measure, Result, -Var1, -Var2) %>%   # reshape from wide to long format
  group_by(Var1, Var2, Measure) %>%           # group_by(), not group()
  summarise(n = sum(!is.na(Result))) %>%
  na.omit()
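For reference, the memory the wide data frame already occupies can be checked with base R (`data` being the data frame from the pipeline above):

print(object.size(data), units = "auto")   # in-memory size of the wide data frame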

However, I don't have enough memory for this process and I get the error "cannot allocate vector of size 2.6 Gb".

I don't want to learn how to use packages designed for handling large data sets; instead I am planning on buying more RAM. However, I am not sure how much RAM I should buy. The missing 2.6 GB is probably only for one step of the process, not the whole thing, right? Is there a way to estimate how much memory this process takes, so I can buy the required amount of RAM?
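A rough back-of-envelope sketch, assuming the long table produced by gather() dominates memory use and that each of its four columns costs about 8 bytes per value (numeric or pointer storage, ignoring character/factor overhead):

n_measures  <- 245 - 2                     # 243 measure columns are stacked into rows
n_rows_long <- 1.5e6 * n_measures          # ~364.5 million rows after gather()
bytes_long  <- n_rows_long * 4 * 8         # 4 columns x ~8 bytes per value
bytes_long / 1024^3                        # ~10.9 GB for the long data frame alone

Because the reshaping and grouping steps can make temporary copies, peak usage may be a small multiple of this, so the 2.6 GB in the error message is only the single allocation that failed, not the total requirement.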
