NWChem/Gromacs/Amber Installation Guide for Winmostar (Linux)

These programs are confirmed to run on CentOS (64 bit), which is assumed throughout the guide below.
AmberTools must be installed to use sander in Amber.
Installation of ERmod, and of Torque as a job scheduler, is also covered.
Please understand that calculation results are not guaranteed.

 Installation of each package (Linux)


If any required packages are missing, install them with yum.

 1. NWChem


Use OpenMPI for MPI-parallel runs and GCC to compile NWChem.

$ sudo yum install python-devel gcc-gfortran openblas-devel openblas-serial64 openmpi-devel scalapack-openmpi-devel blacs-openmpi-devel elpa-openmpi-devel tcsh --enablerepo=epel

Add
export PATH=/usr/lib64/openmpi/bin/:$PATH
export LD_LIBRARY_PATH=/usr/lib64/openmpi/lib/:$LD_LIBRARY_PATH
to /etc/bashrc .
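
If you prefer to append these lines from the command line instead of opening an editor, something like the following should work (a sketch; the paths assume the OpenMPI layout installed by the yum command above):

$ echo 'export PATH=/usr/lib64/openmpi/bin/:$PATH' | sudo tee -a /etc/bashrc
$ echo 'export LD_LIBRARY_PATH=/usr/lib64/openmpi/lib/:$LD_LIBRARY_PATH' | sudo tee -a /etc/bashrc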

Download the source code from http://www.nwchem-sw.org/index.php/Download
※The latest version is Nwchem-6.6.revision27746-src.2015-10-20.tar.gz (18 Sep 2016)
Compile it as follows.

$ . /etc/bashrc
$ mv Nwchem-6.6.revision27746-src.2015-10-20.tar.gz /tmp
$ cd /tmp
$ tar xvfz Nwchem-6.6.revision27746-src.2015-10-20.tar.gz

$ cd nwchem-6.6/src
$ vi nwchem_env.sh
export NWCHEM_TOP=/tmp/nwchem-6.6
export NWCHEM_TARGET=LINUX64
export NWCHEM_MODULES=all
export USE_MPI=y
export USE_PYTHONCONFIG=y
export PYTHONVERSION=2.7
export PYTHONHOME=/usr
export USE_64TO32=y
export BLAS_SIZE=4
export BLASOPT="-lopenblas -lpthread -lrt"
export SCALAPACK_SIZE=4
export SCALAPACK="-L/usr/lib64/openmpi/lib -lscalapack -lmpiblacs"
export ELPA="-I/usr/lib64/gfortran/modules/openmpi -L/usr/lib64/openmpi/lib -lelpa"

$ . nwchem_env.sh
$ make nwchem_config
$ make 64_to_32
$ make

$ sudo mkdir -p /usr/local/NWChem/bin
$ sudo mkdir -p /usr/local/NWChem/data

$ sudo cp $NWCHEM_TOP/bin/${NWCHEM_TARGET}/nwchem /usr/local/NWChem/bin
$ sudo chmod 755 /usr/local/NWChem/bin/nwchem
$ sudo cp -r $NWCHEM_TOP/src/basis/libraries /usr/local/NWChem/data
$ sudo cp -r $NWCHEM_TOP/src/data /usr/local/NWChem
$ sudo cp -r $NWCHEM_TOP/src/nwpw/libraryps /usr/local/NWChem/data
$ sudo vi /usr/local/NWChem/data/default.nwchemrc
nwchem_basis_library /usr/local/NWChem/data/libraries/
nwchem_nwpw_library /usr/local/NWChem/data/libraryps/
ffield amber
amber_1 /usr/local/NWChem/data/amber_s/
amber_2 /usr/local/NWChem/data/amber_q/
amber_3 /usr/local/NWChem/data/amber_x/
amber_4 /usr/local/NWChem/data/amber_u/
spce /usr/local/NWChem/data/solvents/spce.rst
charmm_s /usr/local/NWChem/data/charmm_s/
charmm_x /usr/local/NWChem/data/charmm_x/

$ cp /usr/local/NWChem/data/default.nwchemrc ~/.nwchemrc
$ sudo cp /usr/local/NWChem/data/default.nwchemrc /etc/skel/.nwchemrc

Add
export PATH=$PATH:/usr/local/NWChem/bin
to /etc/bashrc .
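
To confirm that the binary runs and the basis libraries are found, a small test job can be run (a sketch; the input file name, its location, the water geometry, and the use of two MPI processes are arbitrary examples):

$ . /etc/bashrc
$ vi /tmp/h2o.nw
start h2o
geometry
 O  0.000  0.000  0.000
 H  0.000  0.757  0.587
 H  0.000 -0.757  0.587
end
basis
 * library 6-31g
end
task scf

$ mpirun -np 2 nwchem /tmp/h2o.nw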

 2. Gromacs


Install MPICH for MPI-parallel runs and GCC as the C/C++ compiler.
$ sudo yum install mpich-devel gcc gcc-c++ cmake make
$ export PATH=$PATH:/usr/lib64/mpich/bin

Download gromacs-5.0.4.tar.gz from http://www.gromacs.org/.
※The latest version is 5.0.4. (10 Mar 2015)
Compile it as follows. (This assumes your CPU supports 256-bit AVX; if unsure, check the CPU flags as shown below before choosing -DGMX_SIMD.)
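
The following quick check lists the AVX-related flags reported by the CPU (a sketch; any avx entry indicates 256-bit AVX support):

$ grep -o 'avx[^ ]*' /proc/cpuinfo | sort -u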

$ tar xvfz gromacs-5.0.4.tar.gz
$ cd gromacs-5.0.4
$ mkdir build
$ cd build
$ cmake .. -DGMX_SIMD=AVX_256 -DGMX_BUILD_OWN_FFTW=ON -DREGRESSIONTEST_DOWNLOAD=ON -DGMX_GPU=OFF -DGMX_THREAD_MPI=ON -DGMX_DOUBLE=OFF
$ make
$ sudo make install
$ cmake .. -DGMX_SIMD=AVX_256 -DGMX_BUILD_OWN_FFTW=ON -DREGRESSIONTEST_DOWNLOAD=ON -DGMX_GPU=OFF -DGMX_THREAD_MPI=ON -DGMX_DOUBLE=ON
$ make
$ sudo make install
$ cmake .. -DGMX_SIMD=AVX_256 -DGMX_BUILD_OWN_FFTW=ON -DREGRESSIONTEST_DOWNLOAD=ON -DGMX_GPU=OFF -DGMX_MPI=ON -DGMX_DOUBLE=OFF
$ make
$ sudo make install
$ cmake .. -DGMX_SIMD=AVX_256 -DGMX_BUILD_OWN_FFTW=ON -DREGRESSIONTEST_DOWNLOAD=ON -DGMX_GPU=OFF -DGMX_MPI=ON -DGMX_DOUBLE=ON
$ make
$ sudo make install

GROMACS is installed in /usr/local/gromacs .

Add
source /usr/local/gromacs/bin/GMXRC
export PATH=$PATH:/usr/lib64/mpich/bin
to /etc/bashrc .
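
As a quick check, each of the four builds can be queried for its version (a sketch; gmx_d, gmx_mpi, and gmx_mpi_d are the default binary names GROMACS uses for the double-precision and MPI builds):

$ source /usr/local/gromacs/bin/GMXRC
$ gmx --version
$ gmx_d --version
$ gmx_mpi --version
$ gmx_mpi_d --version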

 3. AmberTools18[2]


Download AmberTools18.tar.bz2 from http://ambermd.org/GetAmber.php#ambertools.
Compile it as follows.

$ sudo yum install flex
$ tar xvfj AmberTools18.tar.bz2
$ sudo mv amber18 /usr/local/
$ export AMBERHOME=/usr/local/amber18
$ cd $AMBERHOME
$ sudo sh -c "echo y | ./configure -noX11 --skip-python gnu"
$ source amber.sh
$ sudo make install
$ cd

Add
source /usr/local/amber18/amber.sh
to /etc/bashrc .
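
To confirm that the AmberTools binaries are on the PATH, a quick check can be run (a sketch):

$ source /etc/bashrc
$ which sander antechamber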

 4. ERmod


Download ermod-0.3.4.tar.gz from http://sourceforge.net/projects/ermod/files/?source=navbar
Compile it as follows.

$ sudo yum install fftw-devel
$ sudo yum install lapack-devel
$ tar zxvf ermod-0.3.4.tar.gz
$ cd ermod-0.3.4
$ ./configure
$ make
$ sudo make install
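
To confirm the installation, check that the installed executables are found (a sketch; ermod and slvfe are the two programs the package provides):

$ which ermod slvfe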

See the ERmod wiki page for details.

 5. Torque


Use Torque as the job scheduler.
The EPEL repository is required to install Torque with yum.
Install and set it up as follows.

$ su -
# rpm -Uvh http://ftp.riken.jp/Linux/fedora/epel/6/x86_64/epel-release-6-8.noarch.rpm
# yum install --enablerepo=epel torque-server torque-client torque-mom torque-scheduler
# /usr/sbin/create-munge-key
# hostname > /etc/torque/server_name
# pbs_server -t create
# vi qmgr.txt
create queue L0 queue_type = execution
set queue L0 enabled = true
set queue L0 started = true
set server default_queue = L0
set server scheduling = true
set queue L0 resources_max.ncpus = 12 # set to the number of CPU cores (12 here)
set queue L0 resources_max.nodes = 1
# service trqauthd start
# qmgr < qmgr.txt
# echo `hostname` "np=12 num_node_boards=1" > /var/lib/torque/server_priv/nodes
# sed -i.bak s/localhost/`hostname`/g /var/lib/torque/mom_priv/config
# echo "nodes=1" > /var/lib/torque/mom_priv/mom.layout
# service pbs_server restart
# service pbs_mom restart
# service pbs_sched restart
# chkconfig pbs_mom on
# chkconfig pbs_sched on
# chkconfig pbs_server on
# chkconfig munge on
# chkconfig trqauthd on
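
Once the services are running, a trivial job can be submitted as a regular (non-root) user to confirm that the queue accepts and runs jobs (a sketch):

$ echo "sleep 30" | qsub -q L0
$ qstat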


The installation is now complete.

 Citation

1. Antechamber
J. Wang, W. Wang, P.A. Kollman and D.A. Case. "Automatic atom type and bond type perception in molecular mechanical calculations".
Journal of Molecular Graphics and Modelling, 25, 247-260 (2006).
J. Wang, R.M. Wolf, J.W. Caldwell, P.A. Kollman and D.A. Case. "Development and testing of a general AMBER force field".
Journal of Computational Chemistry, 25, 1157-1174 (2004).



©2008-2020 X-Ability Co., Ltd.