This page describes the necessary steps for installing software on dl160.

On the server

Install external packages

As root:

  • create yum repository definitions (do not enable them):

    # atrpms
    echo '[atrpms]' > /etc/yum.repos.d/atrpms.repo
    echo 'name=CentOS $releasever - $basearch - ATrpms' >> /etc/yum.repos.d/atrpms.repo
    echo 'baseurl=http://dl.atrpms.net/el$releasever-$basearch/atrpms/stable' >> /etc/yum.repos.d/atrpms.repo
    echo '#baseurl=http://mirrors.ircam.fr/pub/atrpms/el$releasever-$basearch/atrpms/stable' >> /etc/yum.repos.d/atrpms.repo
    echo 'gpgkey=http://ATrpms.net/RPM-GPG-KEY.atrpms' >> /etc/yum.repos.d/atrpms.repo
    echo 'gpgcheck=1' >> /etc/yum.repos.d/atrpms.repo
    echo 'enabled=0' >> /etc/yum.repos.d/atrpms.repo
    # epel
    echo '[epel]' > /etc/yum.repos.d/epel.repo
    echo 'name=CentOS $releasever - $basearch - EPEL' >> /etc/yum.repos.d/epel.repo
    echo 'baseurl=http://download.fedora.redhat.com/pub/epel/$releasever/$basearch' >> /etc/yum.repos.d/epel.repo
    echo 'gpgkey=http://download.fedora.redhat.com/pub/epel/RPM-GPG-KEY-EPEL' >> /etc/yum.repos.d/epel.repo
    echo 'gpgcheck=1' >> /etc/yum.repos.d/epel.repo
    echo 'enabled=0' >> /etc/yum.repos.d/epel.repo
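    # optional quick check (a sketch; yum repolist needs yum 3.2+, which CentOS 5 ships);
    # both repos should be listed with status "disabled":
    yum repolist all | grep -i -E 'atrpms|epel'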
  • install, as root:

    yum install yum-utils
    # these searches also make yum create its /var cache directories for the new repos
    yum search --enablerepo=atrpms arpack-devel
    yum search --enablerepo=epel jmol
  • configure rpmbuild:

    • use the following ~rpmbuild/.rpmmacros:

      %disttag        el5.fys
      
      %packager       rpmbuild@fysik.dtu.dk
      %distribution   Fysik RPMS
      %vendor         Fysik RPMS <rpm@fysik.dtu.dk>
      
      %_signature     gpg
      %_gpg_path      ~/.gnupg
      %_gpg_name      Fysik RPMS
      
      #%_topdir       /home/camp/rpmbuild/AMD-Opteron
      %_topdir        /home/camp/rpmbuild/Intel-Nehalem
      %_rpmdir        %{_topdir}/RPMS
      %_srcrpmdir     %{_topdir}/SRPMS
      %_svndir        /home/camp/rpmbuild/rpmbuild
      %_specdir       %{_svndir}/SPECS
      %_sourcedir     %{_svndir}/SOURCES
      %_rpmfilename   %%{NAME}-%%{VERSION}-%%{RELEASE}.%%{ARCH}.rpm
      #%_tmppath      %{_topdir}
      %_tmppath       /tmp/rpmbuild
      %_builddir      %{_tmppath}/BUILD
      
      %niflheim       1
    • as rpmbuild create directories:

      mkdir -p ~/Intel-Nehalem/RPMS
      mkdir -p ~/Intel-Nehalem/SRPMS
      mkdir -p ~/Intel-Nehalem/BUILD
      mkdir -p ~/Intel-Nehalem/SPECS # needed only by openmpi
      mkdir -p ~/Intel-Nehalem/SOURCES # needed only by openmpi
      mkdir -p /tmp/rpmbuild/BUILD
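      # optional: verify that ~/.rpmmacros is in effect
      # (should print /home/camp/rpmbuild/Intel-Nehalem):
      rpm --eval '%{_topdir}'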
  • install official packages, as rpmbuild:

    cd ~/Intel-Nehalem/RPMS
    yumdownloader --resolve gcc-gfortran gcc43-c++ gcc43-gfortran blas-devel lapack-devel python-devel
    yumdownloader --resolve gnuplot libXi-devel xorg-x11-fonts-100dpi pexpect tetex-latex tkinter qt-devel
    yumdownloader --resolve openmpi openmpi-devel openmpi-libs compat-dapl libibverbs librdmacm openib
    yum localinstall * # as root
  • install atrpms packages, as rpmbuild (vtk-python is unavailable as of 16 Apr 2009):

    cd ~/Intel-Nehalem/RPMS
    yumdownloader --resolve --enablerepo=atrpms vtk-python arpack-devel graphviz
    wget http://ATrpms.net/RPM-GPG-KEY.atrpms
    rpm --import RPM-GPG-KEY.atrpms # as root
    yum localinstall * # as root
  • install the packages from epel, as rpmbuild:

    cd ~/Intel-Nehalem/RPMS
    yumdownloader --resolve --enablerepo=epel jmol
    yumdownloader --resolve --enablerepo=epel environment-modules suitesparse-devel
    wget http://download.fedora.redhat.com/pub/epel/RPM-GPG-KEY-EPEL
    rpm --import RPM-GPG-KEY-EPEL # as root
    yum localinstall * # as root
    source /etc/profile.d/modules.sh
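    # quick check that the modules environment works (module avail prints to stderr):
    module avail 2>&1 | head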
  • remove default openmpi:

    yum remove openmpi openmpi-libs
  • edit /etc/yum.conf so it contains:

    exclude=netcdf-* netcdf3-* fftw-* fftw2-* fftw3-* python-numeric openmpi-*
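    # one way to append it (a sketch; assumes no exclude= line exists yet and [main]
    # is the last section of /etc/yum.conf):
    echo 'exclude=netcdf-* netcdf3-* fftw-* fftw2-* fftw3-* python-numeric openmpi-*' >> /etc/yum.conf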

It's time to build custom RPMS

As rpmbuild:

cd ~/rpmbuild/SPECS

Build a custom openmpi with torque support:

export rpmtopdir=${HOME}/Intel-Nehalem # set this to _topdir value from ~/.rpmmacros
wget http://www.open-mpi.org/software/ompi/v1.3/downloads/openmpi-1.3.2.tar.bz2 \
     -O ~/rpmbuild/SOURCES/openmpi-1.3.2.tar.bz2
sh ./buildrpm-1.3.2-1.gfortran.sh ../SOURCES/openmpi-1.3.2.tar.bz2 2>&1 | tee buildrpm-1.3.2-1.gfortran.sh.log.Intel-Nehalem
sh ./buildrpm-1.3.2-1.gfortran43.sh ../SOURCES/openmpi-1.3.2.tar.bz2 2>&1 | tee buildrpm-1.3.2-1.gfortran43.sh.log.Intel-Nehalem
sh ./buildrpm-1.3.2-1.pathscale.sh ../SOURCES/openmpi-1.3.2.tar.bz2 2>&1 | tee buildrpm-1.3.2-1.pathscale.sh.log.Intel-Nehalem
rpm -ivh ~/RPMS/*/openmpi-*.rpm
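
To verify that torque support made it into the build, a quick check (ompi_info ships with Open MPI; the tm plm/ras components indicate torque integration):

ompi_info | grep tm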

If scripts containing ALL build/install/uninstall commands (global_install.sh and global_uninstall.sh) are to be created, run the following after every successful RPM build:

grep -v "#\!" install.sh >> ~/Intel-Nehalem/global_install.sh
cat uninstall.sh ~/Intel-Nehalem/global_uninstall.sh | grep -v "#\!" >> ~/Intel-Nehalem/global_uninstall.sh.tmp && mv -f ~/Intel-Nehalem/global_uninstall.sh.tmp ~/Intel-Nehalem/global_uninstall.sh
# ignore "cat: .../global_uninstall.sh: No such ..." error when running first time

Note that global_uninstall.sh only uninstalls the packages; it does not remove the built RPM files.

Build the following for dacapo:

  • set the disttag variable for convenience:

    export disttag="el5.fys"
  • install the icc/ifort compilers.

  • acml:

    rpmbuild -bb --with compiler=gfortran --with version1=0 --with version2=1 --with modules --with default_version acml.spec
    rpmbuild -bb --with compiler=pathscale --with version1=0 --with version2=1 --with modules --with default_version acml.spec
    
    rpmbuild -bb --with compiler=gfortran43 --with version1=1 --with version2=0 --with modules --with default_version acml.spec
    rpmbuild -bb --with compiler=pathscale --with version1=1 --with version2=0 --with modules --with default_version acml.spec
    
    rpmbuild -bb --with compiler=gfortran43 --with version1=2 --with version2=0 --with modules --with default_version acml.spec
    rpmbuild -bb --with compiler=pathscale --with version1=2 --with version2=0 --with modules --with default_version acml.spec
    rpmbuild -bb --with compiler=ifort --with version1=2 --with version2=0 --with modules --with default_version acml.spec
  • goto:

    rpmbuild --bb --with compiler=gfortran --with modules=1 --with default_version=1 \
                  --with prefix=/opt/goto/1.26/1.${disttag}.gfortran.smp goto.spec
    rpmbuild --bb --with compiler=gfortran43 --with modules=1 --with default_version=1 \
                  --with prefix=/opt/goto/1.26/1.${disttag}.gfortran43.smp goto.spec
    rpmbuild --bb --with compiler=pathscale --with compiler_bindir=/opt/pathscale/bin --with compiler_libdir=/opt/pathscale/lib/3.2 \
                  --with modules=1 --with default_version=1 --with prefix=/opt/goto/1.26/1.${disttag}.pathscale.smp goto.spec

    Note: version 1.26 fails on Nehalem with:

    ../../../param.h:1195:21: error: division by zero in #if
  • campos-dacapo-pseudopotentials:

    rpmbuild -bb --with modules --with default_version campos-dacapo-pseudopotentials.spec
  • rasmol:

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/RasMol/2.7.3/3.${disttag} RasMol.spec
  • cblas:

    rpmbuild --bb  --with blas_version=3.0.37.el5 --with modules=1 --with default_version=1 \
                   --with prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.blas.3.0.37.el5 cblas.spec
    
    rpmbuild --bb  --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
                   --with modules=1 --with default_version=1 --with prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.acml.4.0.1 cblas.spec
  • python-setuptools:

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-setuptools/0.6c9/1.${disttag}.python2.4 python-setuptools.spec
  • python-nose:

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-nose/0.10.4/1.${disttag}.python2.4 python-nose.spec
    module load python-setuptools
    module load python-nose
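    # optional check that nose is importable after loading the modules:
    python -c "import nose; print nose.__version__"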
  • numpy:

    # cblas_prefix=none because dotblas fails with acml
    rpmbuild --bb --with cblas_prefix=none \
                  --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
                  --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib \
                  --with compiler=gfortran --with modules=1 --with default_version=1 \
                  --with prefix=/opt/numpy/1.3.0/1.${disttag}.gfortran.python2.4.acml.4.0.1.acml numpy.spec
    
    rpmbuild --bb --with cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.blas.3.0.37.el5 --with blas_version=3.0.37.el5 \
                  --with compiler=gfortran --with modules=1 --with default_version=1 \
                  --with prefix=/opt/numpy/1.3.0/1.${disttag}.gfortran.python2.4.blas.3.0.37.el5.lapack numpy.spec

    Test with:

    module load numpy/1.3.0-1.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack
    python -c "import numpy; numpy.test()"
    python -c "import numpy; from numpy.core.multiarray import dot; b = numpy.ones(13, numpy.complex); dot(b, b)"
    module unload numpy/1.3.0-1.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack
    
    module load acml-gfortran64/4.0.1-1.el5.fys
    module load numpy/1.3.0-1.el5.fys.gfortran.python2.4.acml.4.0.1.acml
    python -c "import numpy; numpy.test()"
    python -c "import numpy; from numpy.core.multiarray import dot; b = numpy.ones(13, numpy.complex); dot(b, b)"
    module unload numpy/1.3.0-1.el5.fys.gfortran.python2.4.acml.4.0.1.acml
    module unload acml-gfortran64/4.0.1-1.el5.fys

    Load the default numpy:

    module load numpy/1.3.0-1.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack
  • gnuplot-py:

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/gnuplot-py/1.8.1/1.${disttag}.python2.4 gnuplot-py.spec
    
    module load gnuplot-py
  • python-numeric (we must install the 24.2 version, and we keep the default version as well):

    cd ~/Intel-Nehalem/RPMS
    rpm -e --nodeps python-numeric # as root
    yumdownloader --resolve --disableexcludes=main python-numeric
    cd ~/rpmbuild/SPECS
    
    rpmbuild --bb --with cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.acml.4.0.1 \
                  --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
                  --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib \
                  --with compiler=gfortran --with modules=1 --with default_version=1 \
                  --with prefix=/opt/python-numeric/24.2/4.${disttag}.gfortran.python2.4.acml.4.0.1.acml python-numeric.spec
    
    rpmbuild --bb --with cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.blas.3.0.37.el5 --with blas_version=3.0.37.el5 \
                  --with compiler=gfortran --with modules=1 --with default_version=1 \
                  --with prefix=/opt/python-numeric/24.2/4.${disttag}.gfortran.python2.4.blas.3.0.37.el5.lapack python-numeric.spec

    Note: as of 16 Apr 2009, Numeric's test.py results in the following error, which we ignore:

    glibc detected *** python: free(): invalid next size (normal): 0x09aee970 ***

    After installing python-numeric make a very rough check:

    module load python-numeric/24.2-4.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack
    python -c "import lapack_lite"
    ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack | grep lapack_lite.so`
    ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack | grep _dotblas.so`
    module unload python-numeric/24.2-4.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack
    
    module load python-numeric/24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml
    python -c "import lapack_lite"
    ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml | grep lapack_lite.so`
    ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml | grep _dotblas.so`
    module unload python-numeric/24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml

    and reinstall the default version:

    rpm -ivh --oldpackage ~/Intel-Nehalem/RPMS/python-numeric-*.rpm

    load the default Numeric version:

    module load python-numeric/24.2-4.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack


  • ScientificPython:

    python campos_installer.py --machine='dulak-cluster' --create_scripts ScientificPython
    cp ~/RPMS/*/ScientificPython-*.rpm /home/dulak-server/rpm/campos
  • campos-ase2:

    python campos_installer.py --machine='dulak-cluster' --create_scripts campos-ase2
    cp ~/RPMS/*/campos-ase2-*.rpm /home/dulak-server/rpm/campos
  • campos-dacapo-python:

    python campos_installer.py --machine='dulak-cluster' --create_scripts campos-dacapo-python
  • campos-dacapo:

    python campos_installer.py --machine='dulak-cluster' --create_scripts --compiler=gfortran43 campos-dacapo
    cp ~/RPMS/*/campos-dacapo-*.rpm /home/dulak-server/rpm/campos

    logout and login again!

Build the following for gpaw:

  • campos-gpaw-setups:

    python campos_installer.py --machine='dulak-cluster' --create_scripts campos-gpaw-setups
  • campos-ase3:

    python campos_installer.py --machine='dulak-cluster' --create_scripts campos-ase3
    cp ~/RPMS/*/campos-ase3-*.rpm /home/dulak-server/rpm/campos
  • campos-gpaw:

    python campos_installer.py --machine='dulak-cluster' --create_scripts --compiler=gfortran43 campos-gpaw
    cp ~/RPMS/*/campos-gpaw-*.rpm /home/dulak-server/rpm/campos

    logout and login again!

Testing packages

Test dacapo installation (as normal user!).

If you use modules:

module load openmpi
module load campos-dacapo-pseudopotentials
module load python-numeric
module load campos-dacapo-python
module load ScientificPython
module load gnuplot-py
module load RasMol
module load campos-ase2
module load campos-dacapo
ulimit -s 65000 # dacapo needs a large stack

Test with (make sure that /scratch/$USER exists):

cp -r `rpm -ql campos-dacapo-python | grep "share/campos-dacapo-python$"` /tmp
cd /tmp/campos-dacapo-python/Tests
python test.py 2>&1 | tee test.log
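
If /scratch/$USER does not exist yet, create it first (a one-line sketch):

mkdir -p /scratch/$USER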

It can take up to one day. Consider disabling these long-running tests in test.py:

tests.remove('../Examples/Wannier-ethylene.py')
tests.remove('../Examples/Wannier-Pt4.py')
tests.remove('../Examples/Wannier-Ptwire.py')
tests.remove('../Examples/Wannier-Fe-bcc.py')
tests.remove('../Examples/transport_1dmodel.py')

Note that all vtk-related tests will fail.

Test gpaw installation (as normal user!):

If you use modules:

module load openmpi
module load campos-ase3
module load campos-gpaw-setups
module load campos-gpaw

Test with:

cp -r `rpm -ql campos-gpaw | grep "share/campos-gpaw/test$"` /tmp/test.gpaw.$$
cd /tmp/test.gpaw.*
python test.py 2>&1 | tee test.log

It takes about 20 minutes.

On "Golden Client"

Login, as root, to the "Golden Client":

ssh n001

Enable NFS mounting of the server's home directory - follow 'Enable nfs mount on the "Golden Client"' from configuring NFS. After this, do:

cd /home/dulak-server/rpm/campos

rpm -ivh campos-dacapo-2*

If you get errors like:

package example_package.el5.i386 is already installed

remove these packages with:

rpm -e --nodeps example_package

to allow the installation to proceed.

Make sure that both python-numeric versions are installed:

rpm -q python-numeric
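
A quick sanity check (counts the installed versions; expect 2):

rpm -q python-numeric | wc -l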

The rpm -ivh command above will show a list of packages that need to be installed to fulfill dacapo's dependencies. All these packages should already be under /home/dulak-server/rpm. Remember to test the dacapo and gpaw installations on the "Golden Client" too.

If you are installing a workstation only, your setup is ready for testing - go to benchmarking and maintenance.

If you are building a cluster, go back to installing and configuring systemimager.
