Differences between revisions 1 and 15 (spanning 14 versions)
Revision 1 as of 2009-06-06 12:05:58
Size: 11167
Editor: MarcinDulak
Comment:
Revision 15 as of 2009-06-10 15:46:08
Size: 34106
Editor: MarcinDulak
Comment:
In the listing below, deleted lines appear first, followed by the lines that replace them.
Line 1: Line 1:
This page describes the necessary steps for installing the `gpaw <https://wiki.fysik.dtu.dk/gpaw>`_ and `dacapo <https://wiki.fysik.dtu.dk/dacapo>`_ programs.
This page describes the necessary steps for installing software on the dl160 nodes.
Line 30: Line 29:
- install::
- install, as root::
Line 33: Line 32:

- if not yet done, go to `configuring rpmbuild <Building_a_Cluster_-_Tutorial/configuring_rpmbuild>`_,

- **Skip this step if not installing on "dulak-server"**: create the ``/home/dulak-server/rpm/external`` directory (to keep external RPMS)::

   mkdir /home/dulak-server/rpm/external; cd /home/dulak-server/rpm/external

- install official packages::
   # run these searches once so that yum creates its cache directories under /var
   yum search --enablerepo=atrpms arpack-devel
   yum search --enablerepo=epel jmol

- configure rpmbuild:

  - use the following ~rpmbuild/.rpmmacros::

     %disttag el5.fys

     %packager rpmbuild@fysik.dtu.dk
     %distribution Fysik RPMS
     %vendor Fysik RPMS <rpm@fysik.dtu.dk>

     %_signature gpg
     %_gpg_path ~/.gnupg
     %_gpg_name Fysik RPMS

     #%_topdir /home/camp/rpmbuild/AMD-Opteron
     %_topdir /home/camp/rpmbuild/Intel-Nehalem
     %_rpmdir %{_topdir}/RPMS
     %_srcrpmdir %{_topdir}/SRPMS
     %_svndir /home/camp/rpmbuild/rpmbuild
     %_specdir %{_svndir}/SPECS
     %_sourcedir %{_svndir}/SOURCES
     %_rpmfilename %%{NAME}-%%{VERSION}-%%{RELEASE}.%%{ARCH}.rpm
     #%_tmppath %{_topdir}
     %_tmppath /tmp/rpmbuild
     %_builddir %{_tmppath}/BUILD

     # no debuginfo
     %debug_package %{nil}

     %niflheim 1

  - as rpmbuild create directories::

     mkdir -p ~/Intel-Nehalem/RPMS
     mkdir -p ~/Intel-Nehalem/SRPMS
     mkdir -p ~/Intel-Nehalem/BUILD
     mkdir -p ~/Intel-Nehalem/SPECS # needed only by openmpi
     mkdir -p ~/Intel-Nehalem/SOURCES # needed only by openmpi
     mkdir -p /tmp/rpmbuild/BUILD

- install official packages, as rpmbuild::

   cd ~/Intel-Nehalem/RPMS
Line 45: Line 82:
   yum localinstall *

- install `atrpms` packages
(``vtk-python`` is currently unavailable as of 16 Apr 2009)::

   yumdownloader --resolve --enablerepo=atrpms vtk-python fftw2 fftw2-devel netcdf netcdf-devel arpack-devel graphviz
   yumdownloader --resolve pygtk2-devel gtk2-devel tk-devel agg ghostscript libtiff-devel
   yum localinstall * # as root

- install `atrpms` packages, as rpmbuild
(``vtk-python`` is currently unavailable as of 16 Apr 2009)::

   cd ~/Intel-Nehalem/RPMS
   yumdownloader --resolve --enablerepo=atrpms vtk-python arpack-devel graphviz
Line 51: Line 90:
   rpm --import RPM-GPG-KEY.atrpms
   yum localinstall *

- install the packages from `epel`::

   yumdownloader --resolve --enablerepo=epel fftw3 fftw3-devel python-matplotlib python-docutils jmol
   rpm --import RPM-GPG-KEY.atrpms # as root
   yum localinstall * # as root

- install the packages from `epel`, as rpmbuild::

   cd ~/Intel-Nehalem/RPMS
   yumdownloader --resolve --enablerepo=epel jmol
Line 59: Line 99:
   rpm --import RPM-GPG-KEY-EPEL
   yum localinstall *
   rpm --import RPM-GPG-KEY-EPEL # as root
   yum localinstall * # as root
Line 62: Line 102:

- remove default openmpi::

   yum remove openmpi openmpi-libs
Line 70: Line 114:
As root::
As rpmbuild::
Line 74: Line 118:
**Skip this step if not installing on "dulak-server"**: create the ``/home/dulak-server/rpm/campos`` directory (to keep custom built RPMS)::

  mkdir /home/dulak-server/rpm/campos

Preferably build a custom openmpi, using the latest gcc/gfortran and torque support::

  wget http://www.open-mpi.org/software/ompi/v1.3/downloads/openmpi-1.3.1.tar.bz2 \
       -O ~/rpmbuild/SOURCES/openmpi-1.3.1.tar.bz2
  export rpmtopdir=${HOME} # set this to _topdir value from ~/.rpmmacros
  sh ./buildrpm-1.3.1-1.gfortran.sh ../SOURCES/openmpi-1.3.1.tar.bz2 2>&1 | tee buildrpm-1.3.1-1.gfortran.sh.log
Build a custom openmpi, using torque support::

  export rpmtopdir=${HOME}/Intel-Nehalem # set this to _topdir value from ~/.rpmmacros
  wget http://www.open-mpi.org/software/ompi/v1.3/downloads/openmpi-1.3.2.tar.bz2 \
       -O ~/rpmbuild/SOURCES/openmpi-1.3.2.tar.bz2
  sh ./buildrpm-1.3.2-1.gfortran.sh ../SOURCES/openmpi-1.3.2.tar.bz2 2>&1 | tee buildrpm-1.3.2-1.gfortran.sh.log.Intel-Nehalem
  sh ./buildrpm-1.3.2-1.gfortran43.sh ../SOURCES/openmpi-1.3.2.tar.bz2 2>&1 | tee buildrpm-1.3.2-1.gfortran43.sh.log.Intel-Nehalem
  sh ./buildrpm-1.3.2-1.pathscale.sh ../SOURCES/openmpi-1.3.2.tar.bz2 2>&1 | tee buildrpm-1.3.2-1.pathscale.sh.log.Intel-Nehalem
Line 85: Line 127:
  cp ~/RPMS/*/openmpi-*.rpm /home/dulak-server/rpm/campos
Line 90: Line 131:
    grep -v "#\!" install.sh >> ~/global_install.sh
    cat uninstall.sh ~/global_uninstall.sh | grep -v "#\!" >> ~/global_uninstall.sh.tmp && mv -f ~/global_uninstall.sh.tmp ~/global_uninstall.sh
    # ignore "cat: /root/global_uninstall.sh: No such ..." error when running first time
    grep -v "#\!" install.sh >> ~/Intel-Nehalem/global_install.sh
    cat uninstall.sh ~/Intel-Nehalem/global_uninstall.sh | grep -v "#\!" >> ~/Intel-Nehalem/global_uninstall.sh.tmp && mv -f ~/Intel-Nehalem/global_uninstall.sh.tmp ~/Intel-Nehalem/global_uninstall.sh
    # ignore "cat: .../global_uninstall.sh: No such ..." error when running first time
Line 97: Line 138:
   
- set the disttag variable for convenience::

   export disttag="el5.fys"

- install the icc/ifort compilers.

- acml::

    rpmbuild -bb --with compiler=gfortran --with version1=0 --with version2=1 --with modules --with default_version acml.spec
    rpmbuild -bb --with compiler=pathscale --with version1=0 --with version2=1 --with modules --with default_version acml.spec

    rpmbuild -bb --with compiler=gfortran43 --with version1=1 --with version2=0 --with modules --with default_version acml.spec
    rpmbuild -bb --with compiler=pathscale --with version1=1 --with version2=0 --with modules --with default_version acml.spec

    rpmbuild -bb --with compiler=gfortran43 --with version1=2 --with version2=0 --with modules --with default_version acml.spec
    rpmbuild -bb --with compiler=pathscale --with version1=2 --with version2=0 --with modules --with default_version acml.spec
    rpmbuild -bb --with compiler=ifort --with version1=2 --with version2=0 --with modules --with default_version acml.spec

- goto::

    rpmbuild --bb --with compiler=gfortran --with modules=1 --with default_version=1 \
                  --with prefix=/opt/goto/1.26/1.${disttag}.gfortran.smp goto.spec
    rpmbuild --bb --with compiler=gfortran43 --with modules=1 --with default_version=1 \
                  --with prefix=/opt/goto/1.26/1.${disttag}.gfortran43.smp goto.spec
    rpmbuild --bb --with compiler=pathscale --with compiler_bindir=/opt/pathscale/bin --with compiler_libdir=/opt/pathscale/lib/3.2 \
                  --with modules=1 --with default_version=1 --with prefix=/opt/goto/1.26/1.${disttag}.pathscale.smp goto.spec

   **Note**: the **1.26** version fails on Nehalem with::

     ../../../param.h:1195:21: error: division by zero in #if
Line 100: Line 172:
    python campos_installer.py --machine='dulak-cluster' --create_scripts campos-dacapo-pseudopotentials
    rpmbuild -bb --with modules --with default_version campos-dacapo-pseudopotentials.spec
Line 104: Line 176:
    python campos_installer.py --machine='dulak-cluster' --create_scripts RasMol
    cp ~/RPMS/*/RasMol-*.rpm /home/dulak-server/rpm/campos
    rpmbuild -bb --with modules --with default_version --with prefix=/opt/RasMol/2.7.3/3.${disttag} RasMol.spec

- `cblas <https://wiki.fysik.dtu.dk/niflheim/Cluster_software_-_RPMS#cblas>`_::

    rpmbuild --bb --with blas_version=3.0.37.el5 --with modules=1 --with default_version=1 \
                   --with prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.blas.3.0.37.el5 cblas.spec

    rpmbuild --bb --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
                   --with modules=1 --with default_version=1 --with prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.acml.4.0.1 cblas.spec

- python-setuptools::

       rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-setuptools/0.6c9/1.${disttag}.python2.4 python-setuptools.spec

- python-nose::

       rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-nose/0.10.4/1.${disttag}.python2.4 python-nose.spec
       module load python-setuptools
       module load python-nose

- numpy::

    # cblas_prefix=none: dotblas fails with acml
    rpmbuild --bb --with cblas_prefix=none \
                  --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
                  --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib \
                  --with compiler=gfortran --with modules=1 --with default_version=1 \
                  --with prefix=/opt/numpy/1.3.0/1.${disttag}.gfortran.python2.4.acml.4.0.1.acml numpy.spec

    rpmbuild --bb --with cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.blas.3.0.37.el5 --with blas_version=3.0.37.el5 \
                  --with compiler=gfortran --with modules=1 --with default_version=1 \
                  --with prefix=/opt/numpy/1.3.0/1.${disttag}.gfortran.python2.4.blas.3.0.37.el5.lapack numpy.spec

  Test with::

    module load numpy/1.3.0-1.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack
    python -c "import numpy; from numpy.core.multiarray import dot; b = numpy.ones(13, numpy.complex); dot(b, b)"
    python -c "import numpy; numpy.test()"
    module unload numpy/1.3.0-1.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack
     
    module load acml-gfortran64/4.0.1-1.el5.fys
    module load numpy/1.3.0-1.el5.fys.gfortran.python2.4.acml.4.0.1.acml
    python -c "import numpy; from numpy.core.multiarray import dot; b = numpy.ones(13, numpy.complex); dot(b, b)"
    python -c "import numpy; numpy.test()"
    module unload numpy/1.3.0-1.el5.fys.gfortran.python2.4.acml.4.0.1.acml
    module unload acml-gfortran64/4.0.1-1.el5.fys

  Load the default numpy::

    module load numpy/1.3.0-1.el5.fys.gfortran.python2.4.acml.4.0.1.acml
Line 109: Line 228:
    python campos_installer.py --machine='dulak-cluster' --create_scripts gnuplot-py
    cp ~/RPMS/*/gnuplot-py-*.rpm /home/dulak-server/rpm/campos

  if you use modules::
    rpmbuild -bb --with modules --with default_version --with prefix=/opt/gnuplot-py/1.8.1/1.${disttag}.python2.4 gnuplot-py.spec
Line 115: Line 231:
    echo "module load gnuplot-py" >> ~/global_install.sh

  otherwise logout and login again!

- `cblas <https://wiki.fysik.dtu.dk/niflheim/Cluster_software_-_RPMS#cblas>`_::

    python campos_installer.py --machine='dulak-cluster' --create_scripts cblas
    cp ~/RPMS/*/cblas-*.rpm /home/dulak-server/rpm/campos
Line 126: Line 234:
    cd
    rpm -e --nodeps python-numeric
    cd ~/Intel-Nehalem/RPMS
    rpm -e --nodeps python-numeric # as root
Line 129: Line 237:
    cp python-numeric-*.rpm /home/dulak-server/rpm/external # **Skip this step if not installing on "dulak-server"**
Line 131: Line 238:
    python campos_installer.py --machine='dulak-cluster' --create_scripts python-numeric
    cp ~/RPMS/*/python-numeric-*.rpm /home/dulak-server/rpm/campos

    rpmbuild --bb --with cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.acml.4.0.1 \
                  --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
                  --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib \
                  --with compiler=gfortran --with modules=1 --with default_version=1 \
                  --with prefix=/opt/python-numeric/24.2/4.${disttag}.gfortran.python2.4.acml.4.0.1.acml python-numeric.spec

    rpmbuild --bb --with cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.blas.3.0.37.el5 --with blas_version=3.0.37.el5 \
                  --with compiler=gfortran --with modules=1 --with default_version=1 \
                  --with prefix=/opt/python-numeric/24.2/4.${disttag}.gfortran.python2.4.blas.3.0.37.el5.lapack python-numeric.spec
Line 138: Line 254:
  If you use modules::

    module load python-numeric
    echo "module load python-numeric" >> ~/global_install.sh

  otherwise logout and login again!
Line 147: Line 256:
    module load python-numeric/24.2-4.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack
Line 148: Line 258:
    ldd `rpm -ql python-numeric | grep lapack_lite.so`
    ldd `rpm -ql python-numeric | grep _dotblas.so`
    ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack | grep lapack_lite.so`
    ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack | grep _dotblas.so`
    module unload python-numeric/24.2-4.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack

    module load acml-gfortran64/4.0.1-1.el5.fys
    module load python-numeric/24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml
    python -c "import lapack_lite"
    ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml | grep lapack_lite.so`
    ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml | grep _dotblas.so`
    module unload python-numeric/24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml
    module unload acml-gfortran64/4.0.1-1.el5.fys
Line 153: Line 272:
    rpm -ivh --oldpackage ~/python-numeric-*.rpm
    rpm -ivh --oldpackage ~rpmbuild/Intel-Nehalem/RPMS/python-numeric-23.*.rpm

  load the default `Numeric` version::

    module load acml-gfortran64/4.0.1-1.el5.fys
    module load python-numeric/24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml

- netcdf::

     rpmbuild --bb --with compiler=gfortran --with modules=1 --with default_version=1 \
                    --with prefix=/opt/netcdf4/4.0.1/1.gfortran.${disttag} netcdf4.spec
     rpmbuild --bb --with compiler=gfortran43 --with modules=1 --with default_version=1 \
                    --with prefix=/opt/netcdf4/4.0.1/1.gfortran43.${disttag} netcdf4.spec
     rpmbuild --bb --with compiler=pathscale --with compiler_bindir=/opt/pathscale/bin --with compiler_libdir=/opt/pathscale/lib/3.2 \
                    --with modules=1 --with default_version=1 --with prefix=/opt/netcdf4/4.0.1/1.pathscale.${disttag} netcdf4.spec
Line 157: Line 290:
    python campos_installer.py --machine='dulak-cluster' --create_scripts ScientificPython
    cp ~/RPMS/*/ScientificPython-*.rpm /home/dulak-server/rpm/campos
    rpmbuild --bb --with Numeric_includedir=none --with numpy=numpy --with compiler=gfortran --with netcdf=netcdf4 \
                   --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/include \
                   --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/lib64 \
                   --with modules=1 --with default_version=1 \
                   --with prefix=/opt/ScientificPython/2.8/1.${disttag}.python2.4.serial_version.numpy ScientificPython.spec

    rpmbuild --bb --with numpy=numeric --with compiler=gfortran --with netcdf=netcdf4 \
                   --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/include \
                   --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/lib64 \
                   --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
                   --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
                   --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
                   --with prefix=/opt/ScientificPython/2.6.2/1.${disttag}.python2.4.openmpi.numeric ScientificPython.spec

- python-docutils::

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-docutils/0.5/1.${disttag}.python2.4 python-docutils.spec

- pytz::

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/pytz/2008g/1.${disttag}.python2.4 pytz.spec

- python-dateutil::

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-dateutil/1.4.1/3.${disttag}.python2.4 python-dateutil.spec

- python-matplotlib::

    module load pytz
    module load python-docutils
    module load python-dateutil
    rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-matplotlib/0.98.5.2/1.${disttag}.python2.4 python-matplotlib.spec
Line 162: Line 325:
    python campos_installer.py --machine='dulak-cluster' --create_scripts campos-ase2
    cp ~/RPMS/*/campos-ase2-*.rpm /home/dulak-server/rpm/campos
    rpmbuild -bb --with modules --with default_version --with prefix=/opt/campos-ase2/2.3.13/1.${disttag}.python2.4 campos-ase2.spec
Line 167: Line 329:
    python campos_installer.py --machine='dulak-cluster' --create_scripts campos-dacapo-python
    rpmbuild -bb --with modules --with default_version --with prefix=/opt/campos-dacapo-python/0.9.4/1.${disttag}.python2.4 campos-dacapo-python.spec

- fftw2::

    rpmbuild --bb --with compiler=gfortran --with default_version=1 --with modules=1 --with prefix=/opt/fftw2/2.1.5/12.gfortran.${disttag} fftw2.spec
    rpmbuild --bb --with compiler=gfortran43 --with default_version=1 --with modules=1 --with prefix=/opt/fftw2/2.1.5/12.gfortran43.${disttag} fftw2.spec
    rpmbuild --bb --with compiler=pathscale --with compiler_bindir=/opt/pathscale/bin --with compiler_libdir=/opt/pathscale/lib/3.2 \
                   --with default_version=1 --with modules=1 --with prefix=/opt/fftw2/2.1.5/12.pathscale.${disttag} fftw2.spec
Line 171: Line 340:
    python campos_installer.py --machine='dulak-cluster' --create_scripts --compiler=gfortran43 campos-dacapo
    cp ~/RPMS/*/campos-dacapo-*.rpm /home/dulak-server/rpm/campos

  logout and login again!

build following for gpaw:
     rpmbuild --bb --with blas=acml --with blas_version=4.2.0 --with blasdir=/opt/acml/4.2.0/gfortran4364/lib \
                    --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                    --with fftw=fftw2 --with fftw_libdir=/opt/fftw2/2.1.5/12.gfortran43.${disttag}/lib64 \
                    --with lapack=acml --with lapackdir=/opt/acml/4.2.0/gfortran4364/lib --with netcdf=netcdf4 \
                    --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran43.${disttag}/include \
                    --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran43.${disttag}/lib64 \
                    --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
                    --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
                    --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
                    --with prefix=/opt/campos-dacapo/2.7.16/1.${disttag}.gfortran43.openmpi.acml.4.2.0.acml campos-dacapo.spec

     rpmbuild --bb --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
                    --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                    --with fftw=fftw2 --with fftw_libdir=/opt/fftw2/2.1.5/12.gfortran.${disttag}/lib64 \
                    --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib --with netcdf=netcdf4 \
                    --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/include \
                    --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/lib64 \
                    --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
                    --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
                    --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
                    --with prefix=/opt/campos-dacapo/2.7.16/1.${disttag}.gfortran.openmpi.acml.4.0.1.acml campos-dacapo.spec

     rpmbuild --bb --with blas=blas --with blas_version=3.0.37.el5 --with blasdir=/usr/lib64 \
                    --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                    --with fftw=fftw2 --with fftw_libdir=/opt/fftw2/2.1.5/12.gfortran43.${disttag}/lib64 \
                    --with lapack=lapack --with lapackdir=/usr/lib64 --with netcdf=netcdf4 \
                    --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran43.${disttag}/include \
                    --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran43.${disttag}/lib64 \
                    --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
                    --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
                    --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
                    --with prefix=/opt/campos-dacapo/2.7.16/1.${disttag}.gfortran43.openmpi.blas.3.0.37.el5.lapack campos-dacapo.spec

     rpmbuild --bb --with blas=blas --with blas_version=3.0.37.el5 --with blasdir=/usr/lib64 \
                    --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                    --with fftw=fftw2 --with fftw_libdir=/opt/fftw2/2.1.5/12.gfortran.${disttag}/lib64 \
                    --with lapack=lapack --with lapackdir=/usr/lib64 --with netcdf=netcdf4 \
                    --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/include \
                    --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/lib64 \
                    --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
                    --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
                    --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
                    --with prefix=/opt/campos-dacapo/2.7.16/1.${disttag}.gfortran.openmpi.blas.3.0.37.el5.lapack campos-dacapo.spec

Build the following for gpaw:

- fftw3::

    rpmbuild --bb --with major_version=3 --with version1=2 --with version2=1 --with compiler=gfortran \
                  --with default_version=1 --with modules=1 --with prefix=/opt/fftw3/3.2.1/12.gfortran.${disttag} fftw2.spec
    rpmbuild --bb --with major_version=3 --with version1=2 --with version2=1 --with compiler=gfortran43 \
                  --with default_version=1 --with modules=1 --with prefix=/opt/fftw3/3.2.1/12.gfortran43.${disttag} fftw2.spec

- scipy::

    rpmbuild --bb --with cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.blas.3.0.37.el5 \
                  --with blas=blas --with blas_version=3.0.37.el5 --with blasdir=/usr/lib64 \
                  --with lapack=lapack --with lapackdir=/usr/lib64 \
                  --with ufsparse_libdir=/usr/lib64 --with ufsparse_includedir=/usr/include/suitesparse \
                  --with fftw_libdir=/opt/fftw3/3.2.1/12.gfortran.${disttag}/lib64 \
                  --with compiler=gfortran --with default_version=1 --with modules=1 \
                  --with prefix=/opt/scipy/0.7.0/1.${disttag}.gfortran.python2.4.blas.3.0.37.el5.lapack scipy.spec

    rpmbuild --bb --with cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.acml.4.0.1 \
                  --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
                  --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib \
                  --with ufsparse_libdir=/usr/lib64 --with ufsparse_includedir=/usr/include/suitesparse \
                  --with fftw_libdir=/opt/fftw3/3.2.1/12.gfortran.${disttag}/lib64 \
                  --with compiler=gfortran --with default_version=1 --with modules=1 \
                  --with prefix=/opt/scipy/0.7.0/1.${disttag}.gfortran.python2.4.acml.4.0.1.acml scipy.spec
Line 180: Line 413:
    python campos_installer.py --machine='dulak-cluster' --create_scripts campos-gpaw-setups
    rpmbuild -bb --with default_version --with modules campos-gpaw-setups.spec
Line 184: Line 417:
    python campos_installer.py --machine='dulak-cluster' --create_scripts campos-ase3
    cp ~/RPMS/*/campos-ase3-*.rpm /home/dulak-server/rpm/campos
    rpmbuild -bb --with modules --with default_version --with prefix=/opt/campos-ase3/3.1.0.846/1.${disttag}.python2.4 campos-ase3.spec

- povray::

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/povray/3.6.1/3.${disttag} povray.spec

- python-jinja2::

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-jinja2/2.1.1/1.${disttag}.python2.4 python-jinja2.spec

- python-pygments::

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-pygments/1.0/1.${disttag}.python2.4 python-pygments.spec

- babel::

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/babel/0.9.4/1.${disttag}.python2.4 babel.spec

- python-sphinx::

    module load python-jinja2
    rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-sphinx/0.6.1/1.${disttag}.python2.4 python-sphinx.spec

- blacs::

     rpmbuild --bb --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                    --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
                    --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
                    --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
                    --with prefix=/opt/blacs/1.1/24.${disttag}.gfortran43.openmpi blacs.spec

     rpmbuild --bb --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                    --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
                    --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
                    --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
                    --with prefix=/opt/blacs/1.1/24.${disttag}.gfortran.openmpi blacs.spec

- scalapack::

     rpmbuild --bb --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran43.openmpi/lib64 \
                    --with blas=acml --with blas_version=4.2.0 --with blasdir=/opt/acml/4.2.0/gfortran4364/lib \
                    --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                    --with default_version=1 --with modules=1 --with openmpi=openmpi \
                    --with lapack=acml --with lapackdir=/opt/acml/4.2.0/gfortran4364/lib \
                    --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
                    --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
                    --with prefix=/opt/scalapack/1.8.0/1.${disttag}.gfortran43.openmpi.acml.4.2.0.acml scalapack.spec

     rpmbuild --bb --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran.openmpi/lib64 \
                    --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
                    --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                    --with default_version=1 --with modules=1 --with openmpi=openmpi \
                    --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib \
                    --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
                    --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
                    --with prefix=/opt/scalapack/1.8.0/1.${disttag}.gfortran.openmpi.acml.4.0.1.acml scalapack.spec

     rpmbuild --bb --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran43.openmpi/lib64 \
                    --with blas=blas --with blas_version=3.0.37.el5 --with blasdir=/usr/lib64 \
                    --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                    --with default_version=1 --with modules=1 --with openmpi=openmpi \
                    --with lapack=lapack --with lapackdir=/usr/lib64 \
                    --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
                    --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
                    --with prefix=/opt/scalapack/1.8.0/1.${disttag}.gfortran43.openmpi.blas.3.0.37.el5.lapack scalapack.spec

     rpmbuild --bb --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran.openmpi/lib64 \
                    --with blas=blas --with blas_version=3.0.37.el5 --with blasdir=/usr/lib64 \
                    --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                    --with default_version=1 --with modules=1 --with openmpi=openmpi \
                    --with lapack=lapack --with lapackdir=/usr/lib64 \
                    --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
                    --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
                    --with prefix=/opt/scalapack/1.8.0/1.${disttag}.gfortran.openmpi.blas.3.0.37.el5.lapack scalapack.spec
Line 189: Line 494:
    python campos_installer.py --machine='dulak-cluster' --create_scripts --compiler=gfortran43 campos-gpaw
    cp ~/RPMS/*/campos-gpaw-*.rpm /home/dulak-server/rpm/campos

  logout and login again!
     rpmbuild --bb --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran.openmpi/lib64 \
                    --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
                    --with cmr_repository=/home/niflheim/repository/db \
                    --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                    --with default_version=1 --with modules=1 \
                    --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib \
                    --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
                    --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
                    --with openmpi=openmpi --with parallel=1 \
                    --with scalapack=scalapack --with scalapackdir=/opt/scalapack/1.8.0/1.${disttag}.gfortran.openmpi.acml.4.0.1.acml/lib64 \
                    --with prefix=/opt/campos-gpaw/0.6.3862/1.${disttag}.gfortran.python2.4.openmpi.acml.4.0.1.acml campos-gpaw.spec

     rpmbuild --bb --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran43.openmpi/lib64 \
                    --with blas=acml --with blas_version=4.2.0 --with blasdir=/opt/acml/4.2.0/gfortran4364/lib \
                    --with cmr_repository=/home/niflheim/repository/db \
                    --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                    --with default_version=1 --with modules=1 \
                    --with lapack=acml --with lapackdir=/opt/acml/4.2.0/gfortran4364/lib \
                    --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
                    --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
                    --with openmpi=openmpi --with parallel=1 \
                    --with scalapack=scalapack --with scalapackdir=/opt/scalapack/1.8.0/1.${disttag}.gfortran43.openmpi.acml.4.2.0.acml/lib64 \
                    --with prefix=/opt/campos-gpaw/0.6.3862/1.${disttag}.gfortran43.python2.4.openmpi.acml.4.2.0.acml campos-gpaw.spec

This page describes the necessary steps for installing software on the dl160 nodes.

On the server

Install external packages

As root:

  • create yum repository definitions (do not enable them):

    # atrpms
    echo '[atrpms]' > /etc/yum.repos.d/atrpms.repo
    echo 'name=CentOS $releasever - $basearch - ATrpms' >> /etc/yum.repos.d/atrpms.repo
    echo 'baseurl=http://dl.atrpms.net/el$releasever-$basearch/atrpms/stable' >> /etc/yum.repos.d/atrpms.repo
    echo '#baseurl=http://mirrors.ircam.fr/pub/atrpms/el$releasever-$basearch/atrpms/stable' >> /etc/yum.repos.d/atrpms.repo
    echo 'gpgkey=http://ATrpms.net/RPM-GPG-KEY.atrpms' >> /etc/yum.repos.d/atrpms.repo
    echo 'gpgcheck=1' >> /etc/yum.repos.d/atrpms.repo
    echo 'enabled=0' >> /etc/yum.repos.d/atrpms.repo
    # epel
    echo '[epel]' > /etc/yum.repos.d/epel.repo
    echo 'name=CentOS $releasever - $basearch - EPEL' >> /etc/yum.repos.d/epel.repo
    echo 'baseurl=http://download.fedora.redhat.com/pub/epel/$releasever/$basearch' >> /etc/yum.repos.d/epel.repo
    echo 'gpgkey=http://download.fedora.redhat.com/pub/epel/RPM-GPG-KEY-EPEL' >> /etc/yum.repos.d/epel.repo
    echo 'gpgcheck=1' >> /etc/yum.repos.d/epel.repo
    echo 'enabled=0' >> /etc/yum.repos.d/epel.repo
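
    An optional sanity check, assuming the yum shipped with CentOS 5 provides the repolist command: both new repositories should be listed, but stay in the disabled state:

    # list all repositories, including disabled ones
    yum repolist all | grep -E 'atrpms|epel'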
  • install, as root:

    yum install yum-utils
    # run these searches once so that yum creates its cache directories under /var
    yum search --enablerepo=atrpms arpack-devel
    yum search --enablerepo=epel jmol
  • configure rpmbuild:

    • use the following ~rpmbuild/.rpmmacros:

      %disttag        el5.fys
      
      %packager       rpmbuild@fysik.dtu.dk
      %distribution   Fysik RPMS
      %vendor         Fysik RPMS <rpm@fysik.dtu.dk>
      
      %_signature     gpg
      %_gpg_path      ~/.gnupg
      %_gpg_name      Fysik RPMS
      
      #%_topdir       /home/camp/rpmbuild/AMD-Opteron
      %_topdir        /home/camp/rpmbuild/Intel-Nehalem
      %_rpmdir        %{_topdir}/RPMS
      %_srcrpmdir     %{_topdir}/SRPMS
      %_svndir        /home/camp/rpmbuild/rpmbuild
      %_specdir       %{_svndir}/SPECS
      %_sourcedir     %{_svndir}/SOURCES
      %_rpmfilename   %%{NAME}-%%{VERSION}-%%{RELEASE}.%%{ARCH}.rpm
      #%_tmppath      %{_topdir}
      %_tmppath       /tmp/rpmbuild
      %_builddir      %{_tmppath}/BUILD
      
      # no debuginfo
      %debug_package %{nil}
      
      %niflheim       1
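
      A quick way to verify that the macro file is picked up is to expand a few macros as the rpmbuild user (rpm --eval is a standard rpm option); the expected values follow the file above:

      rpm --eval '%{_topdir}'   # should print /home/camp/rpmbuild/Intel-Nehalem
      rpm --eval '%{disttag}'   # should print el5.fys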
    • as rpmbuild create directories:

      mkdir -p ~/Intel-Nehalem/RPMS
      mkdir -p ~/Intel-Nehalem/SRPMS
      mkdir -p ~/Intel-Nehalem/BUILD
      mkdir -p ~/Intel-Nehalem/SPECS # needed only by openmpi
      mkdir -p ~/Intel-Nehalem/SOURCES # needed only by openmpi
      mkdir -p /tmp/rpmbuild/BUILD
  • install official packages, as rpmbuild:

    cd ~/Intel-Nehalem/RPMS
    yumdownloader --resolve gcc-gfortran gcc43-c++ gcc43-gfortran blas-devel lapack-devel python-devel
    yumdownloader --resolve gnuplot libXi-devel xorg-x11-fonts-100dpi pexpect tetex-latex tkinter qt-devel
    yumdownloader --resolve openmpi openmpi-devel openmpi-libs compat-dapl libibverbs librdmacm openib
    yumdownloader --resolve pygtk2-devel gtk2-devel tk-devel agg ghostscript libtiff-devel
    yum localinstall * # as root
  • install atrpms packages, as rpmbuild (vtk-python is currently unavailable as of 16 Apr 2009):

    cd ~/Intel-Nehalem/RPMS
    yumdownloader --resolve --enablerepo=atrpms vtk-python arpack-devel graphviz
    wget http://ATrpms.net/RPM-GPG-KEY.atrpms
    rpm --import RPM-GPG-KEY.atrpms # as root
    yum localinstall * # as root
  • install the packages from epel, as rpmbuild:

    cd ~/Intel-Nehalem/RPMS
    yumdownloader --resolve --enablerepo=epel jmol
    yumdownloader --resolve --enablerepo=epel environment-modules suitesparse-devel
    wget http://download.fedora.redhat.com/pub/epel/RPM-GPG-KEY-EPEL
    rpm --import RPM-GPG-KEY-EPEL # as root
    yum localinstall * # as root
    source /etc/profile.d/modules.sh
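
    A minimal check that environment-modules works after sourcing the profile script (no site modules are expected to be listed yet; note that the module command prints to stderr):

    module avail 2>&1 | head
    module list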
  • remove default openmpi:

    yum remove openmpi openmpi-libs
  • edit /etc/yum.conf so it contains:

    exclude=netcdf-* netcdf3-* fftw-* fftw2-* fftw3-* python-numeric openmpi-*
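
    One way to append the line, assuming /etc/yum.conf does not already contain an exclude= entry:

    grep -q '^exclude=' /etc/yum.conf || \
        echo 'exclude=netcdf-* netcdf3-* fftw-* fftw2-* fftw3-* python-numeric openmpi-*' >> /etc/yum.conf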

It's time to build custom RPMS

As rpmbuild:

cd ~/rpmbuild/SPECS

Build a custom openmpi, using torque support:

export rpmtopdir=${HOME}/Intel-Nehalem # set this to _topdir value from ~/.rpmmacros
wget http://www.open-mpi.org/software/ompi/v1.3/downloads/openmpi-1.3.2.tar.bz2 \
     -O ~/rpmbuild/SOURCES/openmpi-1.3.2.tar.bz2
sh ./buildrpm-1.3.2-1.gfortran.sh ../SOURCES/openmpi-1.3.2.tar.bz2 2>&1 | tee buildrpm-1.3.2-1.gfortran.sh.log.Intel-Nehalem
sh ./buildrpm-1.3.2-1.gfortran43.sh ../SOURCES/openmpi-1.3.2.tar.bz2 2>&1 | tee buildrpm-1.3.2-1.gfortran43.sh.log.Intel-Nehalem
sh ./buildrpm-1.3.2-1.pathscale.sh ../SOURCES/openmpi-1.3.2.tar.bz2 2>&1 | tee buildrpm-1.3.2-1.pathscale.sh.log.Intel-Nehalem
rpm -ivh ~/RPMS/*/openmpi-*.rpm
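
A rough check that the custom Open MPI packages were built and installed (the install prefix below is assumed from the ompi_bindir values used later on this page):

rpm -qa | grep '^openmpi'
/opt/openmpi/1.3.2-1.gfortran/bin/ompi_info | head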

If you want scripts that collect ALL the install/uninstall commands (global_install.sh and global_uninstall.sh), run the following every time an RPM has been built successfully:

grep -v "#\!" install.sh >> ~/Intel-Nehalem/global_install.sh
cat uninstall.sh ~/Intel-Nehalem/global_uninstall.sh | grep -v "#\!" >> ~/Intel-Nehalem/global_uninstall.sh.tmp && mv -f ~/Intel-Nehalem/global_uninstall.sh.tmp ~/Intel-Nehalem/global_uninstall.sh
# ignore "cat: .../global_uninstall.sh: No such ..." error when running first time

Note that global_uninstall.sh only uninstalls the packages; it does not remove the built RPM files.
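
The collected scripts can later be replayed, e.g. on another node of the same type (a sketch, assuming the files were built up as above):

sh ~/Intel-Nehalem/global_install.sh    # install everything in build order
sh ~/Intel-Nehalem/global_uninstall.sh  # uninstall the packages again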

Build the following for dacapo:

  • set the disttag variable for convenience:

    export disttag="el5.fys"
  • install the icc/ifort compilers.

  • acml:

    rpmbuild -bb --with compiler=gfortran --with version1=0 --with version2=1 --with modules --with default_version acml.spec
    rpmbuild -bb --with compiler=pathscale --with version1=0 --with version2=1 --with modules --with default_version acml.spec
    
    rpmbuild -bb --with compiler=gfortran43 --with version1=1 --with version2=0 --with modules --with default_version acml.spec
    rpmbuild -bb --with compiler=pathscale --with version1=1 --with version2=0 --with modules --with default_version acml.spec
    
    rpmbuild -bb --with compiler=gfortran43 --with version1=2 --with version2=0 --with modules --with default_version acml.spec
    rpmbuild -bb --with compiler=pathscale --with version1=2 --with version2=0 --with modules --with default_version acml.spec
    rpmbuild -bb --with compiler=ifort --with version1=2 --with version2=0 --with modules --with default_version acml.spec
  • goto:

    rpmbuild --bb --with compiler=gfortran --with modules=1 --with default_version=1 \
                  --with prefix=/opt/goto/1.26/1.${disttag}.gfortran.smp goto.spec
    rpmbuild --bb --with compiler=gfortran43 --with modules=1 --with default_version=1 \
                  --with prefix=/opt/goto/1.26/1.${disttag}.gfortran43.smp goto.spec
    rpmbuild --bb --with compiler=pathscale --with compiler_bindir=/opt/pathscale/bin --with compiler_libdir=/opt/pathscale/lib/3.2 \
                  --with modules=1 --with default_version=1 --with prefix=/opt/goto/1.26/1.${disttag}.pathscale.smp goto.spec

    Note: version 1.26 fails on Nehalem with:

    ../../../param.h:1195:21: error: division by zero in #if
  • campos-dacapo-pseudopotentials:

    rpmbuild -bb --with modules --with default_version campos-dacapo-pseudopotentials.spec
  • rasmol:

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/RasMol/2.7.3/3.${disttag} RasMol.spec
  • cblas:

    rpmbuild --bb  --with blas_version=3.0.37.el5 --with modules=1 --with default_version=1 \
                   --with prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.blas.3.0.37.el5 cblas.spec
    
    rpmbuild --bb  --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
                   --with modules=1 --with default_version=1 --with prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.acml.4.0.1 cblas.spec
  • python-setuptools:

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-setuptools/0.6c9/1.${disttag}.python2.4 python-setuptools.spec
  • python-nose:

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-nose/0.10.4/1.${disttag}.python2.4 python-nose.spec
    module load python-setuptools
    module load python-nose
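
    A rough import check with both modules loaded (assuming the module files put the install prefixes on PYTHONPATH):

    python -c "import setuptools, nose; print nose.__version__"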
  • numpy:

    # cblas_prefix=none: dotblas fails with acml
    rpmbuild --bb --with cblas_prefix=none \
                  --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
                  --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib \
                  --with compiler=gfortran --with modules=1 --with default_version=1 \
                  --with prefix=/opt/numpy/1.3.0/1.${disttag}.gfortran.python2.4.acml.4.0.1.acml numpy.spec
    
    rpmbuild --bb --with cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.blas.3.0.37.el5 --with blas_version=3.0.37.el5 \
                  --with compiler=gfortran --with modules=1 --with default_version=1 \
                  --with prefix=/opt/numpy/1.3.0/1.${disttag}.gfortran.python2.4.blas.3.0.37.el5.lapack numpy.spec

    Test with:

    module load numpy/1.3.0-1.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack
    python -c "import numpy; from numpy.core.multiarray import dot; b = numpy.ones(13, numpy.complex); dot(b, b)"
    python -c "import numpy; numpy.test()"
    module unload numpy/1.3.0-1.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack
    
    module load acml-gfortran64/4.0.1-1.el5.fys
    module load numpy/1.3.0-1.el5.fys.gfortran.python2.4.acml.4.0.1.acml
    python -c "import numpy; from numpy.core.multiarray import dot; b = numpy.ones(13, numpy.complex); dot(b, b)"
    python -c "import numpy; numpy.test()"
    module unload numpy/1.3.0-1.el5.fys.gfortran.python2.4.acml.4.0.1.acml
    module unload acml-gfortran64/4.0.1-1.el5.fys

    Load the default numpy:

    module load numpy/1.3.0-1.el5.fys.gfortran.python2.4.acml.4.0.1.acml
  • gnuplot-py:

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/gnuplot-py/1.8.1/1.${disttag}.python2.4 gnuplot-py.spec
    
    module load gnuplot-py
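
    A rough import check with the module loaded (gnuplot-py installs the Gnuplot python package):

    python -c "import Gnuplot"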
  • python-numeric (we need the 24.2 version as a module, while keeping the distribution's default version installed):

    cd ~/Intel-Nehalem/RPMS
    rpm -e --nodeps python-numeric # as root
    yumdownloader --resolve --disableexcludes=main python-numeric
    cd ~/rpmbuild/SPECS
    
    rpmbuild --bb --with cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.acml.4.0.1 \
                  --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
                  --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib \
                  --with compiler=gfortran --with modules=1 --with default_version=1 \
                  --with prefix=/opt/python-numeric/24.2/4.${disttag}.gfortran.python2.4.acml.4.0.1.acml python-numeric.spec
    
    rpmbuild --bb --with cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.blas.3.0.37.el5 --with blas_version=3.0.37.el5 \
                  --with compiler=gfortran --with modules=1 --with default_version=1 \
                  --with prefix=/opt/python-numeric/24.2/4.${disttag}.gfortran.python2.4.blas.3.0.37.el5.lapack python-numeric.spec

    Note (16 Apr 2009): Numeric's test.py currently fails with the following error, which we ignore:

    glibc detected *** python: free(): invalid next size (normal): 0x09aee970 ***

    After installing python-numeric make a very rough check:

    module load python-numeric/24.2-4.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack
    python -c "import lapack_lite"
    ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack | grep lapack_lite.so`
    ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack | grep _dotblas.so`
    module unload python-numeric/24.2-4.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack
    
    module load acml-gfortran64/4.0.1-1.el5.fys
    module load python-numeric/24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml
    python -c "import lapack_lite"
    ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml | grep lapack_lite.so`
    ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml | grep _dotblas.so`
    module unload python-numeric/24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml
    module unload acml-gfortran64/4.0.1-1.el5.fys

    and reinstall the default version:

    rpm -ivh --oldpackage ~rpmbuild/Intel-Nehalem/RPMS/python-numeric-23.*.rpm

    load the default Numeric version:

    module load acml-gfortran64/4.0.1-1.el5.fys
    module load python-numeric/24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml
  • netcdf:

    rpmbuild --bb  --with compiler=gfortran --with modules=1 --with default_version=1 \
                   --with prefix=/opt/netcdf4/4.0.1/1.gfortran.${disttag} netcdf4.spec
    rpmbuild --bb  --with compiler=gfortran43 --with modules=1 --with default_version=1 \
                   --with prefix=/opt/netcdf4/4.0.1/1.gfortran43.${disttag} netcdf4.spec
    rpmbuild --bb  --with compiler=pathscale --with compiler_bindir=/opt/pathscale/bin --with compiler_libdir=/opt/pathscale/lib/3.2 \
                   --with modules=1 --with default_version=1 --with prefix=/opt/netcdf4/4.0.1/1.pathscale.${disttag} netcdf4.spec
  • ScientificPython:

    rpmbuild --bb  --with Numeric_includedir=none --with numpy=numpy --with compiler=gfortran --with netcdf=netcdf4 \
                   --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/include \
                   --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/lib64 \
                   --with modules=1 --with default_version=1 \
                   --with prefix=/opt/ScientificPython/2.8/1.${disttag}.python2.4.serial_version.numpy ScientificPython.spec
    
    rpmbuild --bb  --with numpy=numeric --with compiler=gfortran --with netcdf=netcdf4 \
                   --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/include \
                   --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/lib64 \
                   --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
                   --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
                   --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
                   --with prefix=/opt/ScientificPython/2.6.2/1.${disttag}.python2.4.openmpi.numeric ScientificPython.spec
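
    A rough check of the serial numpy-based build (the module name is assumed to follow the prefix naming used above):

    module load ScientificPython
    python -c "from Scientific.IO.NetCDF import NetCDFFile"
    module unload ScientificPython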
  • python-docutils:

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-docutils/0.5/1.${disttag}.python2.4 python-docutils.spec
  • pytz:

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/pytz/2008g/1.${disttag}.python2.4 pytz.spec
  • python-dateutil:

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-dateutil/1.4.1/3.${disttag}.python2.4 python-dateutil.spec
  • python-matplotlib:

    module load pytz
    module load python-docutils
    module load python-dateutil
    rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-matplotlib/0.98.5.2/1.${disttag}.python2.4 python-matplotlib.spec
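
    A rough check with the freshly built module loaded (module name assumed):

    module load python-matplotlib
    python -c "import matplotlib; print matplotlib.__version__"
    module unload python-matplotlib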
  • campos-ase2:

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/campos-ase2/2.3.13/1.${disttag}.python2.4 campos-ase2.spec
  • campos-dacapo-python:

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/campos-dacapo-python/0.9.4/1.${disttag}.python2.4 campos-dacapo-python.spec
  • fftw2:

    rpmbuild --bb  --with compiler=gfortran --with default_version=1 --with modules=1 --with prefix=/opt/fftw2/2.1.5/12.gfortran.${disttag} fftw2.spec
    rpmbuild --bb  --with compiler=gfortran43 --with default_version=1 --with modules=1 --with prefix=/opt/fftw2/2.1.5/12.gfortran43.${disttag} fftw2.spec
    rpmbuild --bb  --with compiler=pathscale --with compiler_bindir=/opt/pathscale/bin --with compiler_libdir=/opt/pathscale/lib/3.2 \
                   --with default_version=1 --with modules=1 --with prefix=/opt/fftw2/2.1.5/12.pathscale.${disttag} fftw2.spec
  • campos-dacapo:

    rpmbuild --bb  --with blas=acml --with blas_version=4.2.0 --with blasdir=/opt/acml/4.2.0/gfortran4364/lib \
                   --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                   --with fftw=fftw2 --with fftw_libdir=/opt/fftw2/2.1.5/12.gfortran43.${disttag}/lib64 \
                   --with lapack=acml --with lapackdir=/opt/acml/4.2.0/gfortran4364/lib --with netcdf=netcdf4 \
                   --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran43.${disttag}/include \
                   --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran43.${disttag}/lib64 \
                   --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
                   --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
                   --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
                   --with prefix=/opt/campos-dacapo/2.7.16/1.${disttag}.gfortran43.openmpi.acml.4.2.0.acml campos-dacapo.spec
    
    rpmbuild --bb  --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
                   --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                   --with fftw=fftw2 --with fftw_libdir=/opt/fftw2/2.1.5/12.gfortran.${disttag}/lib64 \
                   --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib --with netcdf=netcdf4 \
                   --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/include \
                   --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/lib64 \
                   --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
                   --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
                   --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
                   --with prefix=/opt/campos-dacapo/2.7.16/1.${disttag}.gfortran.openmpi.acml.4.0.1.acml campos-dacapo.spec
    
    rpmbuild --bb  --with blas=blas --with blas_version=3.0.37.el5 --with blasdir=/usr/lib64 \
                   --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                   --with fftw=fftw2 --with fftw_libdir=/opt/fftw2/2.1.5/12.gfortran43.${disttag}/lib64 \
                   --with lapack=lapack --with lapackdir=/usr/lib64 --with netcdf=netcdf4 \
                   --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran43.${disttag}/include \
                   --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran43.${disttag}/lib64 \
                   --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
                   --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
                   --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
                   --with prefix=/opt/campos-dacapo/2.7.16/1.${disttag}.gfortran43.openmpi.blas.3.0.37.el5.lapack campos-dacapo.spec
    
    rpmbuild --bb  --with blas=blas --with blas_version=3.0.37.el5 --with blasdir=/usr/lib64 \
                   --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                   --with fftw=fftw2 --with fftw_libdir=/opt/fftw2/2.1.5/12.gfortran.${disttag}/lib64 \
                   --with lapack=lapack --with lapackdir=/usr/lib64 --with netcdf=netcdf4 \
                   --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/include \
                   --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/lib64 \
                   --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
                   --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
                   --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
                   --with prefix=/opt/campos-dacapo/2.7.16/1.${disttag}.gfortran.openmpi.blas.3.0.37.el5.lapack campos-dacapo.spec
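
    After installing one of the dacapo RPMs, a very rough check that the package and its module file are in place (a sketch; the exact module name depends on the prefix chosen above):

    rpm -qa | grep campos-dacapo
    module avail 2>&1 | grep campos-dacapo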

Build the following for gpaw:

  • fftw3:

    rpmbuild --bb --with major_version=3 --with version1=2 --with version2=1 --with compiler=gfortran \
                  --with default_version=1 --with modules=1 --with prefix=/opt/fftw3/3.2.1/12.gfortran.${disttag} fftw2.spec
    rpmbuild --bb --with major_version=3 --with version1=2 --with version2=1 --with compiler=gfortran43 \
                  --with default_version=1 --with modules=1 --with prefix=/opt/fftw3/3.2.1/12.gfortran43.${disttag} fftw2.spec
  • scipy:

    rpmbuild --bb --with cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.blas.3.0.37.el5 \
                  --with blas=blas --with blas_version=3.0.37.el5 --with blasdir=/usr/lib64 \
                  --with lapack=lapack --with lapackdir=/usr/lib64 \
                  --with ufsparse_libdir=/usr/lib64 --with ufsparse_includedir=/usr/include/suitesparse \
                  --with fftw_libdir=/opt/fftw3/3.2.1/12.gfortran.${disttag}/lib64 \
                  --with compiler=gfortran --with default_version=1 --with modules=1 \
                  --with prefix=/opt/scipy/0.7.0/1.${disttag}.gfortran.python2.4.blas.3.0.37.el5.lapack scipy.spec
    
    rpmbuild --bb --with cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.acml.4.0.1 \
                  --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
                  --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib \
                  --with ufsparse_libdir=/usr/lib64 --with ufsparse_includedir=/usr/include/suitesparse \
                  --with fftw_libdir=/opt/fftw3/3.2.1/12.gfortran.${disttag}/lib64 \
                  --with compiler=gfortran --with default_version=1 --with modules=1 \
                  --with prefix=/opt/scipy/0.7.0/1.${disttag}.gfortran.python2.4.acml.4.0.1.acml scipy.spec
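
    Test with (module name assumed to follow the prefix naming above):

    module load scipy
    python -c "import scipy; print scipy.__version__"
    python -c "import scipy; scipy.test()"
    module unload scipy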
  • campos-gpaw-setups:

    rpmbuild -bb --with default_version --with modules campos-gpaw-setups.spec
  • campos-ase3:

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/campos-ase3/3.1.0.846/1.${disttag}.python2.4 campos-ase3.spec
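
    A rough import check with the module loaded (module name assumed):

    module load campos-ase3
    python -c "from ase import Atoms"
    module unload campos-ase3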
  • povray:

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/povray/3.6.1/3.${disttag} povray.spec
  • python-jinja2:

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-jinja2/2.1.1/1.${disttag}.python2.4 python-jinja2.spec
  • python-pygments:

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-pygments/1.0/1.${disttag}.python2.4 python-pygments.spec
  • babel:

    rpmbuild -bb --with modules --with default_version --with prefix=/opt/babel/0.9.4/1.${disttag}.python2.4 babel.spec
  • python-sphinx:

    module load python-jinja2
    rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-sphinx/0.6.1/1.${disttag}.python2.4 python-sphinx.spec
  • blacs:

    rpmbuild --bb --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                   --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
                   --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
                   --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
                   --with prefix=/opt/blacs/1.1/24.${disttag}.gfortran43.openmpi blacs.spec
    
    rpmbuild --bb  --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                   --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
                   --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
                   --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
                   --with prefix=/opt/blacs/1.1/24.${disttag}.gfortran.openmpi blacs.spec
  • scalapack:

    rpmbuild --bb  --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran43.openmpi/lib64 \
                   --with blas=acml --with blas_version=4.2.0 --with blasdir=/opt/acml/4.2.0/gfortran4364/lib \
                   --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                   --with default_version=1 --with modules=1 --with openmpi=openmpi \
                   --with lapack=acml --with lapackdir=/opt/acml/4.2.0/gfortran4364/lib \
                   --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
                   --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
                   --with prefix=/opt/scalapack/1.8.0/1.${disttag}.gfortran43.openmpi.acml.4.2.0.acml scalapack.spec
    
    rpmbuild --bb  --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran.openmpi/lib64 \
                   --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
                   --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                   --with default_version=1 --with modules=1 --with openmpi=openmpi \
                   --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib \
                   --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
                   --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
                   --with prefix=/opt/scalapack/1.8.0/1.${disttag}.gfortran.openmpi.acml.4.0.1.acml scalapack.spec
    
    rpmbuild --bb  --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran43.openmpi/lib64 \
                   --with blas=blas --with blas_version=3.0.37.el5 --with blasdir=/usr/lib64 \
                   --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                   --with default_version=1 --with modules=1 --with openmpi=openmpi \
                   --with lapack=lapack --with lapackdir=/usr/lib64 \
                   --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
                   --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
                   --with prefix=/opt/scalapack/1.8.0/1.${disttag}.gfortran43.openmpi.blas.3.0.37.el5.lapack scalapack.spec
    
    rpmbuild --bb  --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran.openmpi/lib64 \
                   --with blas=blas --with blas_version=3.0.37.el5 --with blasdir=/usr/lib64 \
                   --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                   --with default_version=1 --with modules=1 --with openmpi=openmpi \
                   --with lapack=lapack --with lapackdir=/usr/lib64 \
                   --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
                   --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
                   --with prefix=/opt/scalapack/1.8.0/1.${disttag}.gfortran.openmpi.blas.3.0.37.el5.lapack scalapack.spec
  • campos-gpaw:

    rpmbuild --bb  --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran.openmpi/lib64 \
                   --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
                   --with cmr_repository=/home/niflheim/repository/db \
                   --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                   --with default_version=1 --with modules=1 \
                   --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib \
                   --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
                   --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
                   --with openmpi=openmpi --with parallel=1 \
                   --with scalapack=scalapack --with scalapackdir=/opt/scalapack/1.8.0/1.${disttag}.gfortran.openmpi.acml.4.0.1.acml/lib64 \
                   --with prefix=/opt/campos-gpaw/0.6.3862/1.${disttag}.gfortran.python2.4.openmpi.acml.4.0.1.acml campos-gpaw.spec
    
    rpmbuild --bb  --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran43.openmpi/lib64 \
                   --with blas=acml --with blas_version=4.2.0 --with blasdir=/opt/acml/4.2.0/gfortran4364/lib \
                   --with cmr_repository=/home/niflheim/repository/db \
                   --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
                   --with default_version=1 --with modules=1 \
                   --with lapack=acml --with lapackdir=/opt/acml/4.2.0/gfortran4364/lib \
                   --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
                   --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
                   --with openmpi=openmpi --with parallel=1 \
                   --with scalapack=scalapack --with scalapackdir=/opt/scalapack/1.8.0/1.${disttag}.gfortran43.openmpi.acml.4.2.0.acml/lib64 \
                   --with prefix=/opt/campos-gpaw/0.6.3862/1.${disttag}.gfortran43.python2.4.openmpi.acml.4.2.0.acml campos-gpaw.spec
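
After all the rpmbuild invocations above have finished, it may be worth checking that the binary RPMs were actually produced and then keeping a copy in the server repository used later on this page. A minimal sketch, assuming the default flat RPMS directory layout and that all custom packages match the campos-*, blacs-* and scalapack-* globs (adjust the patterns to what was really built):

   # change to the directory where rpmbuild puts binary RPMs
   cd `rpm --eval '%{_rpmdir}'`
   # quick sanity check: print summary and dependencies of each freshly built package
   for f in campos-*.rpm blacs-*.rpm scalapack-*.rpm; do
       rpm -qpi "$f" | head -3
       rpm -qp --requires "$f"
   done
   # keep the custom built RPMs on the server, in the directory used in the installation steps below
   cp -p campos-*.rpm blacs-*.rpm scalapack-*.rpm /home/dulak-server/rpm/campos/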

Testing packages

Test dacapo installation (as normal user!).

If you use modules:

   module load openmpi
   module load campos-dacapo-pseudopotentials
   module load python-numeric
   module load campos-dacapo-python
   module load ScientificPython
   module load gnuplot-py
   module load RasMol
   module load campos-ase2
   module load campos-dacapo
   ulimit -s 65000 # dacapo needs a large stack
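
The ulimit above only affects the current shell. If the larger stack limit should apply to every login on the machine, one possible way (an assumption, not something required by this page) is a default entry in /etc/security/limits.conf:

   # /etc/security/limits.conf - raise the default soft stack limit (value in kB)
   *    soft    stack    65000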

Test with (make sure that /scratch/$USER exists):

   cp -r `rpm -ql campos-dacapo-python | grep "share/campos-dacapo-python$"` /tmp
   cd /tmp/campos-dacapo-python/Tests
   python test.py 2>&1 | tee test.log
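
If /scratch/$USER mentioned above does not exist yet, create it first (a trivial sketch; whether this needs root depends on the permissions of /scratch):

   mkdir -p /scratch/$USER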

It can take up to 1 day. Please consider disabling these "long" tests in test.py:

   tests.remove('../Examples/Wannier-ethylene.py')
   tests.remove('../Examples/Wannier-Pt4.py')
   tests.remove('../Examples/Wannier-Ptwire.py')
   tests.remove('../Examples/Wannier-Fe-bcc.py')
   tests.remove('../Examples/transport_1dmodel.py')

Note that all vtk-related tests will fail.
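
When reviewing the long test.log it can be convenient to filter for problems instead of reading the whole file. A minimal sketch - the exact wording of the failure messages is an assumption, so adjust the pattern to what test.py actually prints:

   grep -inE 'error|fail' test.log | grep -vi vtk   # drop the expected vtk failures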

Test gpaw installation (as normal user!):

If you use modules:

   module load openmpi
   module load campos-ase3
   module load campos-gpaw-setups
   module load campos-gpaw
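
Before running the full gpaw test suite, a quick import check can save time (a minimal sketch, assuming the modules above put ase and gpaw on the python2.4 path):

   python -c "import ase; import gpaw; print ase.__file__, gpaw.__file__"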

Test with:

   cp -r `rpm -ql campos-gpaw | grep "share/campos-gpaw/test$"` /tmp/test.gpaw.$$
   cd /tmp/test.gpaw.*
   python test.py 2>&1 | tee test.log

It takes about 20 minutes.

On "Golden Client"

Login, as root, to the "Golden Client":

   ssh n001

Enable the NFS mount of the server home directory - follow 'Enable nfs mount on the "Golden Client"' from configuring NFS. After this, do:

   cd /home/dulak-server/rpm/campos
   rpm -ivh campos-dacapo-2*

If you get messages like:

   package example_package.el5.i386 is already installed

remove these packages with:

   rpm -e --nodeps example_package

to allow the installation to proceed.
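
If several packages are reported as already installed, they can be removed in one go (a sketch; example_package1 and example_package2 are placeholders for the names actually reported by rpm):

   for p in example_package1 example_package2; do
       rpm -e --nodeps "$p"
   done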

Make sure that both python-numeric versions are installed:

   rpm -q python-numeric
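
Two versions should be listed. A quick count (a sketch; the expected number 2 follows from the note above):

   rpm -q python-numeric | wc -l   # expect 2; if fewer, install the missing version from /home/dulak-server/rpm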

The rpm -ivh command above will show a list of packages that still need to be installed to fulfill the dacapo dependencies. All of these packages should already be under /home/dulak-server/rpm. Remember to test the dacapo and gpaw installations on the "Golden Client" too.

If you are installing a workstation only, your setup is ready for testing - go to benchmarking and maintenance.

If you are building a cluster, go back to installing and configuring systemimager.
