Questions tagged [doparallel]
R package that provides a “parallel backend” for the foreach package, supplying the mechanism needed to execute foreach loops in parallel.
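The pattern behind most questions under this tag is the same: register a cluster with registerDoParallel(), run the loop with %dopar%, and shut the cluster down afterwards. A minimal sketch (the worker count of 2 is illustrative):

```r
library(doParallel)  # also loads foreach and parallel

cl <- makeCluster(2)     # start 2 worker processes
registerDoParallel(cl)   # make them the foreach backend

# Each iteration runs on a worker; .combine = c gathers results into a vector.
# foreach returns results in iteration order by default.
res <- foreach(i = 1:4, .combine = c) %dopar% {
  i^2
}

stopCluster(cl)  # release the workers
print(res)       # 1 4 9 16
```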
478 questions
0 votes · 0 answers · 33 views
How can I handle dead workers in R-future parallel processing
I want to fit multiple models to each participant in my data. I want to parallelize this by applying the fitting function in parallel to the data of multiple participants. For one of the models, the ...
0 votes · 0 answers · 27 views
Error in {task 1 failed - [readValues] cannot read values in R doParallel "foreach"
I am trying to speed up an intervisibility calculation between landscape viewpoints (X, Y, height), using a rasterprofile and line of sight functions over a digital elevation model. I ran a large-...
1 vote · 1 answer · 67 views
How would I bootstrap the lmer model? [closed]
I am trying to bootstrap (non-parametric) my lmer model, but I am struggling with how to specify the formula when the response variables are in columns 18:1543 (5:10 for the example data). I've tried ...
0 votes · 0 answers · 18 views
Error with parallelize='variables' using "missForest" in R
I've started using missForest to potentially replace rfImpute and while doing some testing with both synthetic and real data and the different flavours of parallelization strategies offered by ...
3 votes · 1 answer · 122 views
Utilizing multiple nodes for parallel computing in R
I have access to an HPC system. Let's say I have three nodes available. The details of each node are as follows:
scontrol show node
Arch=x86_64 CoresPerSocket=10
CPUAlloc=20 CPUTot=20 CPULoad=...
0 votes · 0 answers · 29 views
How do I resolve this system singular error?
This is a question about futuremice.
I am trying to run the mice MI in parallel. This is my code.
imp_df <- futuremice(df,
pred = pred_matrix,
meth = ...
3 votes · 2 answers · 104 views
Shared memory in parallel foreach using set.seed
As seen in this question, in Windows it is not possible to run parallel processes with shared memory in R. Therefore, I have devised the following methodology, using a series of set.seed(), to ...
0 votes · 1 answer · 24 views
R crashes with segfault in doing keras::unserialize_model() in foreach loop from doParallel
I'm having a problem where R crashes when calling keras::unserialize_model() in a doParallel foreach loop.
I have to sanitize this code, so hopefully I don't munge anything. And I'm not an R developer;...
2 votes · 2 answers · 81 views
How to pass once the full dataset to one worker and specific subsets to the other workers in foreach loop using isplit()
I am currently fitting a set of models on a subset of data for each level of a factor variable. As the models take a long time to run, I use the foreach and doParallel package to estimate the set of ...
0 votes · 0 answers · 23 views
doParallel foreach with many for-loops within the foreach
I am stuck with an issue when trying to parallelize a function that I have coded.
The function contains several for loops within for loops (which is required to fill multiple arrays with data).
I have ...
1 vote · 0 answers · 78 views
R foreach do parallel %dopar% performance problems (and possibly affecting entire computer)
Note:
I recognize this is a slightly more amorphous/non-replicable problem than is ideal, but I feel it is worthwhile given the other instances we've seen on stackoverflow and potential general ...
0 votes · 2 answers · 100 views
StopIteration error in nested foreach/for loop
I've got a nasty error which only shows up when you ask R with geterrmessage() to actually display it. It is also not a problem as long as you run the code in an interactive console session, but when you ...
1 vote · 1 answer · 31 views
Parallel processing in R using multiple cores and using a function
I would like to run the apply function (my_func2) more efficiently by using parallelization in R across multiple imputed datasets by using all 8 cores on my computer. Each imputed dataset is about 1.7 ...
0 votes · 0 answers · 20 views
Using a function that calls another inside a foreach %dopar% loop
I have two functions, myfunct1 and myfunct2; myfunct1 uses the myfunct2 function.
I want to use these two functions in a foreach() %dopar% {} loop but I get an error:
Error in {:
task 1 failed - "...
0 votes · 1 answer · 42 views
Continuing to interact during a long calculation
I have code that takes a long time to execute, and I would like to know whether there is a way to keep interacting with other elements of the application.
When you press the "go" button, a ...
0 votes · 1 answer · 80 views
memory usage when splitting large dataframe using lapply
I have a large dataframe (54k rows, 38k columns). My goal is to run a simple function, test_func, on this dataframe. test_func requires the use of 8 index columns that contain generic information, such ...
1 vote · 0 answers · 57 views
When is furrr::future_pmap() faster than pmap()?
I implemented a function that is quite expensive.
start<-Sys.time()
plan(multicore, workers = 4)
grid <- expand.grid(v,x,y,z)
mapping<-pmap(list(grid[,1],grid[,2],grid[,3],grid[,4]),.f=...
1 vote · 1 answer · 53 views
doPar does not initialize when run through Rscript
I want to create an Rscript that uses dopar to count the lines of many files using many cores, then outputs a TSV of the file names, plus the number of lines (divided by 4).
I am using Ubuntu 20.04. ...
1 vote · 0 answers · 64 views
Parallel tasks not removed after stopping cluster
After running and stopping a parallel process, workers still appear to be running on my system and taking up resources. I know processes belong to parallel because they are the same number as the ...
2 votes · 1 answer · 54 views
do.call() function within %dopar%, function arguments not found
I am implementing a Monte Carlo simulation to test several methods (many of them).
The methods are implemented in a methods.R script. To illustrate, let's say that I have only implemented 2 methods....
1 vote · 2 answers · 69 views
Ordering issues in foreach in R
To my knowledge, the results obtained from parallel computations are expected to be unordered. Does foreach perform any operations by default to ensure that the order of results matches the order of a ...
2 votes · 0 answers · 21 views
Acceleration with doParallel in R not working in my code
# approximation of PI using random numbers and large iterations
# the surface of a square of one by one = 1
# the program will generate random numbers between 0 and 1
# these are x and y as ...
1 vote · 1 answer · 61 views
Limit iterations in foreach and doParallel
I'm trying to implement a nested for-loop using foreach and doParallel, but I don't want to loop over all combinations of values. Basically, I've got a square dataset and I want to run a function over ...
0 votes · 1 answer · 67 views
Error in parallel calculation on R function
I am learning parallel programming in R with simple examples. I tried to implement the function given in the code; it loads the processor, but no calculation results are produced. Could you please tell me ...
0 votes · 0 answers · 39 views
What causes "subscript out of bounds" when using doParallel
I am using the MethylMix package, which requires equal row lengths for KIRP.meth, KIRP.mrna, and normal.meth. My code raised a "subscript out of bounds" error.
library(MethylMix)
cl <- ...
1 vote · 1 answer · 58 views
Getting foreach to store its return value in a 3d array
I have a function which returns a (numVals x N) array/matrix. This function needs to be evaluated K times. My goal is to store all results in a multidimensional array containing doubles with shape c(...
5 votes · 1 answer · 276 views
Difference between the working process of `mclapply` and `foreach()` loop
This is a general question out of curiosity. I am using the doParallel package for parallel computing. I use these packages for simulation purposes.
What I observed is that when I was using the ...
0 votes · 0 answers · 47 views
Parallel collapse + reduction not working
I am trying to parallelize two nested loops and the collapse clause fails.
Hey there, I am trying to parallelize these two nested loops in order to calculate two integrals (int_coulomb and ...
0 votes · 0 answers · 22 views
Subgraph count from a large graph (undirected) in an efficient way. How to increase performance of the R code? (Rcpp, doParallel)
I am working with a very large graph (the number of vertices in the graph is 1000+, which means I am working with a (1000+ x 1000+) adjacency matrix). My laptop has an 8-core CPU and 8 GB of RAM (...
2 votes · 0 answers · 38 views
R zombie processes
Is there a way to safely remove/stop/shut down those R processes? I am using the doParallel package.
I'm always calling stopImplicitCluster() but apparently it has no effect.
Thanks in advance!
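The shutdown behaviour asked about in the last question trips people up regularly: stopImplicitCluster() only stops the cluster that registerDoParallel(cores = n) created implicitly; workers started explicitly with makeCluster() must be stopped with stopCluster(). A minimal sketch of the explicit pattern:

```r
library(doParallel)

cl <- makeCluster(2)     # explicit worker processes
registerDoParallel(cl)

# trivial parallel job: ask each worker for its process ID
pids <- foreach(i = 1:2, .combine = c) %dopar% Sys.getpid()

stopCluster(cl)          # explicitly terminates exactly these workers
```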