I am working with the latest bioconductor_docker RStudio Server image (link) and I want to load a Seurat object (size = 1.9GB) like so:
library(Seurat)
mca <- readRDS("<path_2_seurat_file>")
but I get the error:
19 Jul 2021 09:20:57 [rsession-rstudio] ERROR session hadabend; LOGGED FROM: rstudio::core::Error {anonymous}::rInit(const rstudio::r::session::RInitInfo&) src/cpp/session/SessionMain.cpp:675
I think this error might be due to some memory usage restriction, since the session dies exactly when reading this large object. So I tried running the docker container with the parameters -it --memory="8g", but it did not solve the problem.
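For reference, the full run command looked roughly like this (a sketch: the image tag, password, and port mapping follow the standard Bioconductor run instructions and are placeholders, not necessarily what I typed):

docker run -it --memory="8g" \
    -e PASSWORD=bioc \
    -p 8787:8787 \
    bioconductor/bioconductor_docker:RELEASE_3_13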
Do you know of a solution to this problem, or how I could go about finding one?
Here is some supplementary information about the session:
RStudio Server version: 1.4.1717
SessionInfo:
R version 4.1.0 (2021-05-18)
Platform: x86_64-pc-linux-gnu (64-bit)
Running under: Ubuntu 20.04.2 LTS
Matrix products: default
BLAS/LAPACK: /usr/lib/x86_64-linux-gnu/openblas-pthread/libopenblasp-r0.3.8.so
locale:
[1] LC_CTYPE=en_US.UTF-8 LC_NUMERIC=C LC_TIME=en_US.UTF-8 LC_COLLATE=en_US.UTF-8 LC_MONETARY=en_US.UTF-8
[6] LC_MESSAGES=C LC_PAPER=en_US.UTF-8 LC_NAME=C LC_ADDRESS=C LC_TELEPHONE=C
[11] LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=C
attached base packages:
[1] stats graphics grDevices utils datasets methods base
other attached packages:
[1] SeuratObject_4.0.2 Seurat_4.0.3
loaded via a namespace (and not attached):
[1] nlme_3.1-152 matrixStats_0.59.0 spatstat.sparse_2.0-0 RcppAnnoy_0.0.18 RColorBrewer_1.1-2 httr_1.4.2
[7] sctransform_0.3.2 tools_4.1.0 utf8_1.2.1 R6_2.5.0 irlba_2.3.3 rpart_4.1-15
[13] KernSmooth_2.23-20 uwot_0.1.10 mgcv_1.8-36 DBI_1.1.1 lazyeval_0.2.2 colorspace_2.0-2
[19] tidyselect_1.1.1 gridExtra_2.3 compiler_4.1.0 cli_3.0.0 plotly_4.9.4.1 scales_1.1.1
[25] lmtest_0.9-38 spatstat.data_2.1-0 ggridges_0.5.3 pbapply_1.4-3 goftest_1.2-2 stringr_1.4.0
[31] digest_0.6.27 spatstat.utils_2.2-0 pkgconfig_2.0.3 htmltools_0.5.1.1 parallelly_1.26.1 fastmap_1.1.0
[37] htmlwidgets_1.5.3 rlang_0.4.11 rstudioapi_0.13 shiny_1.6.0 generics_0.1.0 zoo_1.8-9
[43] jsonlite_1.7.2 ica_1.0-2 dplyr_1.0.7 magrittr_2.0.1 patchwork_1.1.1 Matrix_1.3-4
[49] Rcpp_1.0.7 munsell_0.5.0 fansi_0.5.0 abind_1.4-5 reticulate_1.20 lifecycle_1.0.0
[55] stringi_1.6.2 MASS_7.3-54 Rtsne_0.15 plyr_1.8.6 grid_4.1.0 parallel_4.1.0
[61] listenv_0.8.0 promises_1.2.0.1 ggrepel_0.9.1 crayon_1.4.1 miniUI_0.1.1.1 deldir_0.2-10
[67] lattice_0.20-44 cowplot_1.1.1 splines_4.1.0 tensor_1.5 pillar_1.6.1 igraph_1.2.6
[73] spatstat.geom_2.2-0 future.apply_1.7.0 reshape2_1.4.4 codetools_0.2-18 leiden_0.3.8 glue_1.4.2
[79] data.table_1.14.0 png_0.1-7 vctrs_0.3.8 httpuv_1.6.1 gtable_0.3.0 RANN_2.6.1
[85] purrr_0.3.4 spatstat.core_2.2-0 polyclip_1.10-0 tidyr_1.1.3 scattermore_0.7 future_1.21.0
[91] assertthat_0.2.1 ggplot2_3.3.5 mime_0.11 xtable_1.8-4 later_1.2.0 survival_3.2-11
[97] viridisLite_0.4.0 tibble_3.1.2 cluster_2.1.2 globals_0.14.0 fitdistrplus_1.1-5 ellipsis_0.3.2
[103] ROCR_1.0-11
Update: when trying to read a cell_data_set object, I now get the same error as before, plus another one.
library(monocle3)
cds <- readRDS("<path_to_cell_data_set_file>")
19 Jul 2021 12:50:10 [rsession-rstudio] ERROR session hadabend; LOGGED FROM: rstudio::core::Error {anonymous}::rInit(const rstudio::r::session::RInitInfo&) src/cpp/session/SessionMain.cpp:675
19 Jul 2021 12:50:10 [rsession-rstudio] ERROR system error 2 (No such file or directory) [path:/home/rstudio/.local/share/rstudio/sessions/active/session-975447f0/suspended-session-data/search_path/search_path_elements]; OCCURRED AT rstudio::core::Error rstudio::core::FilePath::openForRead(std::shared_ptr<std::basic_istream<char>>&) const src/cpp/shared_core/FilePath.cpp:1427; LOGGED FROM: void rstudio::r::session::{anonymous}::reportDeferredDeserializationError(const rstudio::core::Error&) src/cpp/r/session/RInit.cpp:63
I also checked with file.exists() that the file <path_to_cell_data_set_file> exists, so the "No such file or directory" in the second error does not refer to my input file; it points at RStudio's suspended-session state.
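Concretely (the path is a placeholder):

file.exists("<path_to_cell_data_set_file>")
#> [1] TRUE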
After some digging around I found this post on how to assign more memory to a docker container.
I thought that adding -m 8g or -it --memory="8g" to the docker run command would be enough to increase the memory limit. However, after running docker stats, I found that my container's memory limit was stuck at 2GB.
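This is the kind of check I mean (a sketch; the exact columns depend on your Docker version):

docker stats --no-stream
# look at the MEM USAGE / LIMIT column: despite --memory="8g",
# the limit still read 2GiB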
I followed the instructions on this link, which led me to increase the memory limit under Docker Desktop > Preferences > Resources > Memory. The catch is that on Docker Desktop, containers run inside a VM whose total memory is itself capped (2GB in my case), so --memory cannot raise a single container above that cap.
Now everything works like clockwork.
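In case it helps others, a quick sanity check from inside the container after the fix (object.size() comes from utils; the in-memory size will not match the 1.9GB on disk, since .rds files are compressed):

mca <- readRDS("<path_2_seurat_file>")  # now completes without killing the session
format(object.size(mca), units = "Gb")  # rough in-memory footprint of the object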