with `13G` replaced with a little less than your available system memory.
### I get an `OutOfMemoryException` error and I'm short on RAM
If you're short on RAM there is a limit on how large a dataset you can train on,
but there are several techniques you can use to reduce how much memory `largeRCRF` needs.
* If your training dataset is large, you might not want both R and `largeRCRF` to each hold a separate copy of it
(a Java limitation requires `largeRCRF` to keep its own copy). Instead of passing the dataset directly as the `data` parameter to `train`,
provide an environment containing a single object called `data` which holds the dataset. `largeRCRF` will delete that variable
after importing it into the Java environment, freeing the R-side copy.
Example:
```
R> data.env <- new.env()
R> data.env$data <- trainingData
R> rm(trainingData)
R> model <- train(..., data = data.env, ...)
```
* Each core that is training trees requires its own memory; you can limit `largeRCRF` to training only one tree at a time by specifying `cores=1`.
* By default `largeRCRF` keeps the entire forest loaded in memory during training,
when in practice only the trees currently being trained need to be loaded.
You can specify `savePath` to give a directory where `largeRCRF` saves trees during training,
which allows `largeRCRF` to keep in memory only the trees currently being trained.
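Combining the two options above might look like the following sketch (the other `train` arguments are elided as in the earlier example, and `trees/` is just an illustrative directory name):

```
R> model <- train(..., cores = 1, savePath = "trees/", ...)
```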
### Training stalls immediately at 0 trees and the CPU is idle
This issue has been observed on one particular system (and only that system), but its cause is unclear.
If you encounter it, please report the bug to [joelt@sfu.ca](mailto:joelt@sfu.ca), including your operating system
and installed Java version (the entire output of `java --version`).
The issue appears to occur randomly, so as a workaround try rerunning your code.