All Questions tagged with doparallel and r (470 questions)
How can I handle dead workers in R-future parallel processing (0 votes, 0 answers, 33 views)
I want to fit multiple models to each participant in my data. I want to parallelize this by applying the fitting function in parallel to the data of multiple participants. For one of the models, the ...
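Questions like this one usually come down to isolating failures per task. A minimal sketch using future.apply (assumption: `fit_model` is a hypothetical stand-in for the real per-participant fitting function). tryCatch absorbs R-level errors inside a worker; a worker that dies outright is re-raised as a FutureError when results are collected, and can be caught the same way around the collecting call.

```r
# Minimal sketch: one failed fit should not kill the whole parallel job.
library(future.apply)
plan(multisession, workers = 2)

# Hypothetical stand-in for the real per-participant fitting function.
fit_model <- function(x) if (x == 2) stop("fit failed") else x^2

results <- future_lapply(1:4, function(x) {
  tryCatch(fit_model(x), error = function(e) NA)  # keep going on failure
})
unlist(results)  # c(1, NA, 9, 16)
```

Failed fits come back as NA instead of aborting the other workers' results.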
Error in {task 1 failed - [readValues] cannot read values} in R doParallel "foreach" (0 votes, 0 answers, 27 views)
I am trying to speed up an intervisibility calculation between landscape viewpoints (X, Y, height), using a rasterprofile and line of sight functions over a digital elevation model. I ran a large-...
How would I bootstrap the lmer model? [closed] (1 vote, 1 answer, 67 views)
I am trying to bootstrap (non-parametric) my lmer model, but I am struggling as to how to specify the formula when the response variables are in columns 18:1543 ( 5:10 for example data). I've tried ...
Error with parallelize='variables' using "missForest" in R (0 votes, 0 answers, 18 views)
I've started using missForest to potentially replace rfImpute and while doing some testing with both synthetic and real data and the different flavours of parallelization strategies offered by ...
Utilizing multiple nodes for parallel computing in R (3 votes, 1 answer, 122 views)
I have access to an HPC system. Let's say I have three nodes available; the details of each node are as follows:
scontrol show node
Arch=x86_64 CoresPerSocket=10
CPUAlloc=20 CPUTot=20 CPULoad=...
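For the multi-node question above, a PSOCK cluster can be spread across machines by listing each node's host name once per worker process. A sketch, assuming hypothetical host names node1..node3 with 20 cores each and passwordless SSH between them (the cluster call is left commented out so the fragment runs anywhere):

```r
library(parallel)
library(doParallel)

# Hypothetical host names; one entry per worker process.
nodes <- rep(c("node1", "node2", "node3"), each = 20)
length(nodes)  # 60 workers across 3 nodes

# On the real HPC system (requires passwordless SSH between nodes):
# cl <- makeCluster(nodes, type = "PSOCK")
# registerDoParallel(cl)
# ... foreach(...) %dopar% ... now spans all three nodes ...
# stopCluster(cl)
```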
How do I resolve this system singular error? (0 votes, 0 answers, 29 views)
This is a question about futuremice.
I am trying to run mice multiple imputation in parallel. This is my code:
imp_df <- futuremice(df,
                     pred = pred_matrix,
                     meth = ...
Shared memory in parallel foreach using set.seed (3 votes, 2 answers, 104 views)
As seen in this question, in Windows it is not possible to run parallel processes with shared memory in R. Therefore, I have devised the following methodology, using a series of set.seed(), to ...
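A common alternative to a hand-rolled set.seed() scheme is the doRNG package, which gives each foreach iteration an independent, reproducible RNG stream without any shared state between workers. A minimal sketch:

```r
library(doParallel)
library(doRNG)

cl <- makeCluster(2)
registerDoParallel(cl)

# %dorng% seeds each iteration from one reproducible stream sequence.
set.seed(123)
a <- foreach(i = 1:4, .combine = c) %dorng% rnorm(1)
set.seed(123)
b <- foreach(i = 1:4, .combine = c) %dorng% rnorm(1)

stopCluster(cl)
identical(as.numeric(a), as.numeric(b))  # TRUE: same seed, same draws
```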
R crashes with segfault when calling keras::unserialize_model() in a foreach loop from doParallel (0 votes, 1 answer, 24 views)
I'm having a problem where R crashes when calling keras::unserialize_model() in a doParallel foreach loop.
I have to sanitize this code, so hopefully I don't munge anything. And I'm not an R developer;...
How to pass the full dataset once to one worker and specific subsets to the other workers in a foreach loop using isplit() (2 votes, 2 answers, 81 views)
I am currently fitting a set of models on a subset of data for each level of a factor variable. As the models take a long time to run, I use the foreach and doParallel package to estimate the set of ...
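For reference, isplit() yields one (value, key) pair per factor level, so each foreach task receives only its own group's rows. A small sketch on toy data:

```r
library(foreach)
library(doParallel)
library(iterators)

cl <- makeCluster(2)
registerDoParallel(cl)

df <- data.frame(g = rep(c("a", "b"), each = 5), y = 1:10)

# Each task gets sub$value (the rows for one level) and sub$key (the level).
means <- foreach(sub = isplit(df, df$g), .combine = c) %dopar% {
  mean(sub$value$y)
}
stopCluster(cl)
means  # c(3, 8): mean of y within each level of g
```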
doParallel foreach with many for-loops within the foreach (0 votes, 0 answers, 23 views)
I am stuck on an issue while trying to parallelize a function I have written.
The function contains several nested for loops (required to fill multiple arrays with data).
I have ...
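One pattern that often unblocks this kind of question: workers cannot write into a shared preallocated array, so each %dopar% task should build and return its own piece, with .combine assembling the result. A hedged sketch on toy values:

```r
library(foreach)
library(doParallel)

cl <- makeCluster(2)
registerDoParallel(cl)

# Each task fills one row locally; .combine = rbind assembles the matrix.
res <- foreach(i = 1:3, .combine = rbind) %dopar% {
  row <- numeric(4)
  for (j in 1:4) {   # plain for loops are fine inside a single task
    row[j] <- i * j
  }
  row
}
stopCluster(cl)
dim(res)  # 3 4
```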
R foreach doParallel %dopar% performance problems (possibly affecting the entire computer) (1 vote, 0 answers, 78 views)
Note: I recognize this is a slightly more amorphous/non-replicable problem than is ideal, but I feel it is worthwhile given the other instances we've seen on Stack Overflow and potential general ...
StopIteration error in nested foreach/for loop (0 votes, 2 answers, 100 views)
I've got a nasty error that only shows up when you ask R to actually display it with geterrmessage(). It is also not a problem as long as you run the code in an interactive console session, but when you ...
Using a function that calls another function in a foreach %dopar% loop (0 votes, 0 answers, 20 views)
I have two functions, myfunct1 and myfunct2; myfunct1 calls myfunct2.
I want to use both functions in a foreach() %dopar% {} loop, but I get an error:
Error in {:
task 1 failed - "...
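That error is usually the helper function not being found on the workers. Listing both functions in .export copies them into each worker's environment (and .packages does the same for packages). A sketch with hypothetical stand-ins for myfunct1/myfunct2:

```r
library(foreach)
library(doParallel)

# Hypothetical stand-ins for the asker's two functions.
myfunct2 <- function(x) x + 1              # helper
myfunct1 <- function(x) myfunct2(x) * 2    # calls the helper

cl <- makeCluster(2)
registerDoParallel(cl)

res <- foreach(i = 1:3, .combine = c,
               .export = c("myfunct1", "myfunct2")) %dopar% {
  myfunct1(i)
}
stopCluster(cl)
res  # c(4, 6, 8)
```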
Continuing to interact while a long calculation runs (0 votes, 1 answer, 42 views)
I have code that takes a long time to execute, and I would like to know whether there is a way to keep interacting with other elements of the application while it runs.
When you press the "go" button, a ...
Memory usage when splitting a large dataframe using lapply (0 votes, 1 answer, 80 views)
I have a large dataframe (54k rows, 38k columns). My goal is to run a simple function, test_func, on this dataframe. test_func requires the use of 8 index columns that contain generic information, such ...