Channel: Active questions tagged r - Stack Overflow

How to transfer objects to each node for parallel computation in R?


Dear all: I have a question about running an R script in parallel.

library(parallel)
library(mvtnorm)
source('./function.R')  # defines mle()

snp   <- read.table('./high_19.gt', header = TRUE, check.names = FALSE)
pheno <- read.table('./area_h.txt', header = TRUE, stringsAsFactors = FALSE)

par0 <- c(172.6193597, 94.0255600, 0.1237499, 1.08, 0.8)

cl <- makeCluster(7)

# make the package, the data, and the function available on every worker
clusterEvalQ(cl, library(mvtnorm))
clusterExport(cl, c('par0', 'snp', 'pheno'))
clusterEvalQ(cl, source('./function.R'))

# pass the function itself rather than the string 'mle'
res <- parLapply(cl, 1:1000, mle)

stopCluster(cl)

My snp file has 10000+ rows.

When I call parLapply(cl, 1:1000, mle), it works.

However, when I call parLapply(cl, 1:10000, mle) or use an even larger range, it reports an error like this:

Error in checkForRemoteErrors(val) : 7 nodes produced errors; first error: length of 'dimnames' [1] not equal to array extent

It seems that some of the exported objects are not available on some of the nodes.

Why does it fail when the index list is too large? What should I do to solve this problem?
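One way to narrow this down: the error message says "7 nodes produced errors; first error: ...", which means parLapply aborts on the first failing index and hides which rows are the problem. Since mle() lives in function.R and is not shown here, the sketch below uses a hypothetical stand-in for it; the technique itself, wrapping each call in tryCatch so one bad input does not kill the whole run, then collecting the failing indices, is what matters.

```r
library(parallel)

# Stand-in for the real mle() from function.R (assumption: the real one
# takes a row index). Here every multiple of 7 simulates a failing input.
mle <- function(i) {
  if (i %% 7 == 0) stop("bad row")
  i * 2
}

# Wrapper: on error, return the error message instead of aborting.
safe_mle <- function(i) {
  tryCatch(mle(i), error = function(e) conditionMessage(e))
}

cl <- makeCluster(2)
clusterExport(cl, c("mle", "safe_mle"))
res <- parLapply(cl, 1:20, safe_mle)
stopCluster(cl)

# Character entries mark the indices whose mle() call errored.
failed <- which(vapply(res, is.character, logical(1)))
print(failed)
```

With the real data, printing `failed` (and inspecting `snp[failed, ]`) would show whether specific rows beyond 1000, rather than the size of the index list itself, trigger the "length of 'dimnames' [1] not equal to array extent" error.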

Thank you

