Compiling relion-2.0-stable on CentOS 7

Again, another of those boring IT posts, sorry. This one is about relion, the most important player in the field I work in right now. A kind of Messi of EM image processing.

For those who don’t yet know what relion is: RELION (for REgularised LIkelihood OptimisatioN, pronounced rely-on) is a stand-alone computer program that employs an empirical Bayesian approach to the refinement of (multiple) 3D reconstructions or 2D class averages in electron cryo-microscopy (cryo-EM). In simple words, it is a program to process images from an electron microscope. If you want to install it from scratch, you can follow this guide. I have spoken about relion before, so you know we had a relion build that was working fine until the last update brought a new kernel, new NVIDIA drivers and a new openmpi. That is to say, I could expect it was time to compile it again.

Let’s do it over our already existing build. The plan is simple: we compile it on one of our computing nodes, and we rsync the result to the rest. Step by step:
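That rsync step, run once the build is installed, could look like the sketch below. The node names and the loop are assumptions to adapt to your own cluster; the install prefix is the one passed to cmake further down.

```shell
# Sketch: push the installed tree to the remaining compute nodes.
# NODES is an assumption - replace it with your real node list.
NODES="node215 node216 node217"
PREFIX=/usr/local/relion-2.0_stable/

for n in $NODES; do
  # echo first so the commands can be reviewed; drop the echo to really run them
  echo rsync -a --delete "$PREFIX" "root@${n}:${PREFIX}"
done
```

The trailing slash on PREFIX matters to rsync: it copies the *contents* of the directory into the same path on the target node.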

root@node214 ## > rm -rf *
root@node214 ## > cmake -DCUDA_ARCH=61 \
-DCMAKE_INSTALL_PREFIX=/usr/local/relion-2.0_stable/ ..
-- The C compiler identification is GNU 4.8.5
-- The CXX compiler identification is GNU 4.8.5
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- BUILD TYPE set to the default type: 'Release'
-- Using provided CUDA_ARCH=61
-- Setting cpu precision to double
-- Setting gpu precision to single
-- Found CUDA: /usr/local/cuda (found version "8.0") 
-- Using cuda wrapper to compile....
-- Cuda version is >= 7.5 and single-precision build, enable double usage warning.
-- Found MPI_C: /usr/lib64/openmpi-2.1.0/lib/ 
-- Found MPI_CXX: /usr/lib64/openmpi-2.1.0/lib/ 
-- MPI_INCLUDE_PATH : /usr/lib64/openmpi-2.1.0/include
-- MPI_LIBRARIES : /usr/lib64/openmpi-2.1.0/lib/
-- MPI_CXX_INCLUDE_PATH : /usr/lib64/openmpi-2.1.0/include
-- MPI_CXX_LIBRARIES : /usr/lib64/openmpi-2.1.0/lib/
-- Looking for XOpenDisplay in /usr/lib64/;/usr/lib64/ - found
-- Looking for gethostbyname
-- Looking for gethostbyname - found
-- Looking for connect
-- Looking for connect - found
-- Looking for remove
-- Looking for remove - found
-- Looking for shmat
-- Looking for shmat - found
-- Looking for IceConnectionNumber in ICE
-- Looking for IceConnectionNumber in ICE - found
-- Found X11: /usr/lib64/
-- Could NOT find FLTK 
-- FLTK was NOT found
-- -------------------------------------------------
-- -------------------------------------------------
-- Found previously built external (non-system) FLTK library
-- Found FFTW: fftw3
-- FFTW_LIBRARIES: /usr/lib64/
-- Looking for sincos
-- Looking for sincos - found
-- Looking for __sincos
-- Looking for __sincos - not found
-- Building shared libs (smaller build size and binaries)
-- CMAKE_BINARY_DIR:/usr/local/relion-2.0_build/build
-- added particle_polish...
-- added maingui...
... adding a lot of stuff...
-- added stack_create...
-- Configuring done
-- Generating done
-- Build files have been written to: 

Time to make it. We have 20 cores in this node, so we can afford a heavily parallel build.
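Hardcoding 20 works on this node, but a node-agnostic sketch would derive the job count from the machine itself (assuming GNU coreutils’ nproc is available, which it is on CentOS 7):

```shell
# Derive the parallel job count from the node instead of hardcoding it
JOBS=$(nproc)
echo "will run: make -j ${JOBS} install"
```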

root@node214 ## > make -j 20 install
[ 1%] Scanning dependencies of target copy_scripts
...bla bla bla (compilation and linking)...
Scanning dependencies...
[100%] Built target refine_mpi
Linking CXX executable ../../bin/relion_stack_create
[100%] Built target tiltpair_plot
Built target stack_create
Linking CXX executable ../../bin/relion_reconstruct
[100%] Built target reconstruct
[100%] Built target helix_toolbox
Install the project...
..a lot of stuff..
-- Installing: 
-- Set runtime path of 
to "/usr/local
...bla bla bla (the same install/set runtime)...

Note that at the end there is no message saying relion has been successfully installed. Note also that I loaded the needed modules (openmpi, cuda) before compiling, and tested that they work. So far, testing as a regular user is also OK, provided the right modules are present and working. The next step will be to tune it up. Maybe I’ll share our tuning tomorrow. Maybe. If you’re a good reader.
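A quick way to confirm the toolchain is visible after loading the modules is to probe the PATH. This is only a reporting sketch; the tool names are the usual ones, so adjust them if your modulefiles expose different binaries:

```shell
# Report where each required tool resolves, or flag it as missing
for tool in mpirun nvcc cmake; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: $(command -v "$tool")"
  else
    echo "$tool: MISSING - load the corresponding module first"
  fi
done
```

Running this both as root and as a regular user catches the classic case where the modules are loaded in one environment but not the other.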


About bitsanddragons

A traveller, an IT professional and a casual writer

This entry was posted in bits, centos, linux, slurm.
