This page describes the necessary steps for installing the `gpaw <https://wiki.fysik.dtu.dk/gpaw>`_ and `dacapo <https://wiki.fysik.dtu.dk/dacapo>`_ programs on the login nodes **fjorm** or **thul**.

Install external packages
-------------------------
- as root, create the yum repository definitions (do not enable them)::

   # atrpms
   echo '[atrpms]' > /etc/yum.repos.d/atrpms.repo
   echo 'name=CentOS $releasever - $basearch - ATrpms' >> /etc/yum.repos.d/atrpms.repo
   echo 'baseurl=http://dl.atrpms.net/el$releasever-$basearch/atrpms/stable' >> /etc/yum.repos.d/atrpms.repo
   echo '#baseurl=http://mirrors.ircam.fr/pub/atrpms/el$releasever-$basearch/atrpms/stable' >> /etc/yum.repos.d/atrpms.repo
   echo 'gpgkey=http://ATrpms.net/RPM-GPG-KEY.atrpms' >> /etc/yum.repos.d/atrpms.repo
   echo 'gpgcheck=1' >> /etc/yum.repos.d/atrpms.repo
   echo 'enabled=0' >> /etc/yum.repos.d/atrpms.repo
   # epel
   echo '[epel]' > /etc/yum.repos.d/epel.repo
   echo 'name=CentOS $releasever - $basearch - EPEL' >> /etc/yum.repos.d/epel.repo
   echo 'baseurl=http://download.fedora.redhat.com/pub/epel/$releasever/$basearch' >> /etc/yum.repos.d/epel.repo
   echo 'gpgkey=http://download.fedora.redhat.com/pub/epel/RPM-GPG-KEY-EPEL' >> /etc/yum.repos.d/epel.repo
   echo 'gpgcheck=1' >> /etc/yum.repos.d/epel.repo
   echo 'enabled=0' >> /etc/yum.repos.d/epel.repo

- install, as root::

   yum install yum-utils
   # /var directories must be created
   yum search --enablerepo=atrpms arpack-devel
   yum search --enablerepo=epel jmol

- configure rpmbuild (as rpmbuild):

  - enable modules - add to `~/.bashrc`::

     if [ -r "/home/camp/modulefiles.sh" ]; then
         source /home/camp/modulefiles.sh
     fi

  - set the `FYS_PLATFORM` variable::

     export FYS_PLATFORM=Intel-Nehalem-el5  # thul
     export FYS_PLATFORM=AMD-Opteron-el5    # fjorm

    **Note** that this variable will be set automatically by `/home/camp/modulefiles.sh` after the `environment-modules` package is installed.

  - use the following `~rpmbuild/.rpmmacros`::

     %disttag el5.fys
     %packager rpmbuild@fysik.dtu.dk
     %distribution Fysik RPMS
     %vendor Fysik RPMS <rpm@fysik.dtu.dk>
     %_signature gpg
     %_gpg_path ~/.gnupg
     %_gpg_name Fysik RPMS
     %_topdir /home/camp/rpmbuild/%(echo $FYS_PLATFORM)
     %_rpmdir %{_topdir}/RPMS
     %_srcrpmdir %{_topdir}/SRPMS
     %_svndir /home/camp/rpmbuild/rpmbuild
     %_specdir %{_svndir}/SPECS
     %_sourcedir %{_svndir}/SOURCES
     %_rpmfilename %%{NAME}-%%{VERSION}-%%{RELEASE}.%%{ARCH}.rpm
     %_builddir /tmp
     %_tmppath %{_topdir}/BUILD
     # no debuginfo
     %debug_package %{nil}
     # don't strip (this does not fully work)
     # https://bugzilla.redhat.com/show_bug.cgi?id=219731
     %define __strip /bin/true
     %niflheim 1

- as rpmbuild, create the directories::

   mkdir -p ~/${FYS_PLATFORM}/RPMS
   mkdir -p ~/${FYS_PLATFORM}/SRPMS
   mkdir -p ~/${FYS_PLATFORM}/BUILD
   mkdir -p ~/${FYS_PLATFORM}/SPECS    # needed only by openmpi
   mkdir -p ~/${FYS_PLATFORM}/SOURCES  # needed only by openmpi

- install official packages, as rpmbuild::

   cd ~/${FYS_PLATFORM}/RPMS
   mkdir external; cd external
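The conditional ``source`` guard added to ``~/.bashrc`` can be exercised in isolation; the following is a minimal sketch against a temporary file (the file path and the ``DEMO_PLATFORM`` variable are hypothetical stand-ins for ``/home/camp/modulefiles.sh`` and ``FYS_PLATFORM``):

```shell
# Minimal sketch of the conditional-source guard used in ~/.bashrc above.
demo=$(mktemp)
echo 'DEMO_PLATFORM=AMD-Opteron-el5' > "$demo"
# source the file only if it exists and is readable, exactly like the
# /home/camp/modulefiles.sh guard
if [ -r "$demo" ]; then
    . "$demo"
fi
echo "platform: ${DEMO_PLATFORM:-unset}"  # prints "platform: AMD-Opteron-el5"
rm -f "$demo"
```

The ``-r`` test keeps a fresh account (where the file does not exist yet) from failing at login.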
   yumdownloader --resolve pygtk2-devel gtk2-devel tk-devel agg ghostscript libtiff-devel
   yum localinstall *  # as root

- install `atrpms` packages, as rpmbuild (``vtk-python`` is currently unavailable 16 Apr 2009)::

   cd ~/${FYS_PLATFORM}/RPMS/external
   yumdownloader --resolve --enablerepo=atrpms vtk-python arpack-devel graphviz
   rpm --import RPM-GPG-KEY.atrpms  # as root
   yum localinstall *  # as root

- install the packages from `epel`, as rpmbuild::

   cd ~/${FYS_PLATFORM}/RPMS/external
   yumdownloader --resolve --enablerepo=epel jmol
   rpm --import RPM-GPG-KEY-EPEL  # as root
   yum localinstall *  # as root

- remove the default openmpi::

   yum remove openmpi openmpi-libs
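The repository files on this page are written with repeated ``echo`` lines; as a sketch, the same file can be produced with a single quoted heredoc (quoting ``'EOF'`` keeps ``$releasever``/``$basearch`` literal so yum, not the shell, expands them). The target path here is a temporary stand-in for ``/etc/yum.repos.d/epel.repo``:

```shell
# Sketch: write one repo definition with a heredoc instead of repeated echo.
repo=$(mktemp)
cat > "$repo" <<'EOF'
[epel]
name=CentOS $releasever - $basearch - EPEL
baseurl=http://download.fedora.redhat.com/pub/epel/$releasever/$basearch
gpgkey=http://download.fedora.redhat.com/pub/epel/RPM-GPG-KEY-EPEL
gpgcheck=1
enabled=0
EOF
grep '^baseurl=' "$repo"
```

With ``enabled=0`` the repository is used only when explicitly requested via ``--enablerepo``, matching the searches above.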
- logout and login (as rpmbuild) again to activate modules settings from `~/.bashrc`. |
|
As rpmbuild::
Install the icc/ifort compilers and enable them (only on **thul**)::

   . /opt/intel/Compiler/11.0/083/bin/intel64/ifortvars_intel64.sh
   . /opt/intel/Compiler/11.0/083/bin/intel64/iccvars_intel64.sh

Build the intel compatibility packages (only on **thul**)::

   rpmbuild -bb --with modules --with default_version intel-redist.spec

**Note**: do not install the resulting RPMS on the login node. They need to be installed **only** on compute nodes. On the login node only the module file needs to be deployed (as root)::

   mkdir -p /etc/modulefiles/intel
   cp /tmp/intel-11.0.083/11.0.083-1.intel64.el5.fys /etc/modulefiles/intel

**Note**: the above module file should contain at least::

   prepend-path LD_LIBRARY_PATH /opt/intel/Compiler/11.0/083/lib/intel64
   prepend-path PATH /opt/intel/Compiler/11.0/083/bin/intel64
   prepend-path MANPATH /opt/intel/Compiler/11.0/083/man

Install mkl, and build the mkl compatibility package (only on **thul**)::

   cd ~/rpmbuild/SOURCES
   mkdir 10.1.3.027
   cp -r /opt/intel/mkl/10.1.3.027/lib 10.1.3.027
   cp -r /opt/intel/mkl/10.1.3.027/doc 10.1.3.027
   tar zcf intel-redist-mkl-10.1.3.027.tar.gz 10.1.3.027
   cd ~/rpmbuild/SPECS
   rpmbuild -bb --with modules --with default_version intel-redist-mkl.spec

**Note**: do not install the resulting RPM on the login node. It needs to be installed **only** on compute nodes. On the login node only the module file needs to be deployed (as root)::

   mkdir -p /etc/modulefiles/mkl
   cp /tmp/intel-mkl-10.1.3.027/10.1.3.027-1.el5.fys.em64t /etc/modulefiles/mkl

**Note**: the above module file should contain at least::

   prepend-path LD_LIBRARY_PATH /opt/intel/mkl/10.1.3.027/lib/em64t

Build a custom openmpi, with torque support::

   export rpmtopdir=${HOME}/${FYS_PLATFORM}  # set this to the _topdir value from ~/.rpmmacros
   wget http://www.open-mpi.org/software/ompi/v1.3/downloads/openmpi-1.3.2.tar.bz2 \
        -O ~/rpmbuild/SOURCES/openmpi-1.3.2.tar.bz2
   sh ./buildrpm-1.3.2-1.gfortran.sh ../SOURCES/openmpi-1.3.2.tar.bz2 2>&1 | tee buildrpm-1.3.2-1.gfortran.sh.log.${FYS_PLATFORM}
   sh ./buildrpm-1.3.2-1.gfortran43.sh ../SOURCES/openmpi-1.3.2.tar.bz2 2>&1 | tee buildrpm-1.3.2-1.gfortran43.sh.log.${FYS_PLATFORM}
   sh ./buildrpm-1.3.2-1.pathscale.sh ../SOURCES/openmpi-1.3.2.tar.bz2 2>&1 | tee buildrpm-1.3.2-1.pathscale.sh.log.${FYS_PLATFORM}
   sh ./buildrpm-1.3.2-1.ifort.sh ../SOURCES/openmpi-1.3.2.tar.bz2 2>&1 | tee buildrpm-1.3.2-1.ifort.sh.log.${FYS_PLATFORM}  # thul only

**Note**: the intel openmpi needs to be installed ignoring dependencies::

   rpm -ivh --nodeps --oldpackage ~rpmbuild/${FYS_PLATFORM}/RPMS/openmpi-1.3.2-1.ifort.el5.fys.x86_64.rpm
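The deployed module files only need a few ``prepend-path`` lines, as noted above. For reference, a complete minimal modulefile might look like the following sketch (the ``#%Module1.0`` magic line is required by environment-modules; the help text and comment are hypothetical, the paths are the ones given above):

```tcl
#%Module1.0
## sketch of /etc/modulefiles/intel/11.0.083-1.intel64.el5.fys
proc ModulesHelp { } {
    puts stderr "Adds the Intel 11.0 compiler runtime to the environment"
}
prepend-path LD_LIBRARY_PATH /opt/intel/Compiler/11.0/083/lib/intel64
prepend-path PATH            /opt/intel/Compiler/11.0/083/bin/intel64
prepend-path MANPATH         /opt/intel/Compiler/11.0/083/man
```

``module load intel`` then prepends these directories; ``module unload intel`` removes them again.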
   grep -v "#\!" install.sh >> ~/${FYS_PLATFORM}/global_install.sh
   cat uninstall.sh ~/${FYS_PLATFORM}/global_uninstall.sh | grep -v "#\!" >> ~/${FYS_PLATFORM}/global_uninstall.sh.tmp && \
       mv -f ~/${FYS_PLATFORM}/global_uninstall.sh.tmp ~/${FYS_PLATFORM}/global_uninstall.sh
   # ignore the "cat: .../global_uninstall.sh: No such ..." error when running for the first time
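The pipeline above accumulates per-package ``install.sh``/``uninstall.sh`` snippets (minus their ``#!`` interpreter lines) into global scripts. A self-contained sketch of the same pattern, using a temporary directory and made-up one-line scripts in place of the real package scripts:

```shell
# Sketch of the global_install.sh / global_uninstall.sh aggregation pattern.
work=$(mktemp -d)
printf '#!/bin/sh\nmodule load foo\n' > "$work/install.sh"
printf '#!/bin/sh\nmodule unload foo\n' > "$work/uninstall.sh"

# append the install snippet, stripping the "#!" line
grep -v "#\!" "$work/install.sh" >> "$work/global_install.sh"
# prepend the uninstall snippet (2>/dev/null hides the harmless
# "No such file" from cat on the very first run)
cat "$work/uninstall.sh" "$work/global_uninstall.sh" 2>/dev/null | grep -v "#\!" > "$work/global_uninstall.sh.tmp" \
    && mv -f "$work/global_uninstall.sh.tmp" "$work/global_uninstall.sh"

cat "$work/global_install.sh"  # prints "module load foo"
```

New uninstall snippets go on top of the global uninstall script so packages are removed in reverse install order.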
- set the disttag variable for convenience::

   export disttag="el5.fys"

- acml::

   rpmbuild -bb --with compiler=gfortran --with version1=0 --with version2=1 --with modules --with default_version acml.spec
   rpmbuild -bb --with compiler=pathscale --with version1=0 --with version2=1 --with modules --with default_version acml.spec
   rpmbuild -bb --with compiler=gfortran43 --with version1=1 --with version2=0 --with modules --with default_version acml.spec  # fjorm only
   rpmbuild -bb --with compiler=pathscale --with version1=1 --with version2=0 --with modules --with default_version acml.spec
   rpmbuild -bb --with compiler=gfortran43 --with version1=2 --with version2=0 --with modules --with default_version acml.spec
   rpmbuild -bb --with compiler=pathscale --with version1=2 --with version2=0 --with modules --with default_version acml.spec
   rpmbuild -bb --with compiler=ifort --with version1=2 --with version2=0 --with modules --with default_version acml.spec  # thul only

- goto (only on **fjorm**)::

   rpmbuild --bb --with compiler=gfortran --with modules=1 --with default_version=1 \
            --with prefix=/opt/goto/1.26/1.${disttag}.gfortran.smp goto.spec
   rpmbuild --bb --with compiler=gfortran43 --with modules=1 --with default_version=1 \
            --with prefix=/opt/goto/1.26/1.${disttag}.gfortran43.smp goto.spec
   rpmbuild --bb --with compiler=pathscale --with compiler_bindir=/opt/pathscale/bin --with compiler_libdir=/opt/pathscale/lib/3.2 \
            --with modules=1 --with default_version=1 --with prefix=/opt/goto/1.26/1.${disttag}.pathscale.smp goto.spec

  **Note** - the **1.26** version fails on Nehalem with::

   ../../../param.h:1195:21: error: division by zero in #if

- atlas::

   rpmbuild --bb --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
            --with modules=1 --with default_version=1 --with prefix=/opt/atlas/3.8.3/1.gfortran.${disttag} atlas.spec
   rpmbuild --bb --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
            --with modules=1 --with default_version=1 --with prefix=/opt/atlas/3.8.3/1.gfortran43.${disttag} atlas.spec
   rpmbuild --bb --with compiler=pathscale --with compiler_bindir=/opt/pathscale/bin --with compiler_libdir=/opt/pathscale/lib/3.2 \
            --with modules=1 --with default_version=1 --with prefix=/opt/atlas/3.8.3/1.pathscale.${disttag} atlas.spec
|
   rpmbuild -bb --with modules --with default_version campos-dacapo-pseudopotentials.spec
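All the builds on this page drive optional features through ``rpmbuild --with ...`` flags. Inside a spec file, the standard idiom for consuming such a flag is a build conditional; a minimal, hypothetical fragment (the spec name and make targets are made up for illustration):

```spec
# hypothetical fragment of a .spec consuming "rpmbuild --with modules"
%bcond_with modules

%build
%if %{with modules}
make MODULES=1    # also generate and package a modulefile
%else
make
%endif
```

Without ``--with modules`` on the ``rpmbuild`` command line the conditional branch is skipped.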
   rpmbuild -bb --with modules --with default_version --with prefix=/opt/RasMol/2.7.3/3.${disttag} RasMol.spec

- `cblas <https://wiki.fysik.dtu.dk/niflheim/Cluster_software_-_RPMS#cblas>`_::

   rpmbuild --bb --with blas_version=3.0.37.el5 --with modules=1 --with default_version=1 \
            --with prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.blas.3.0.37.el5 cblas.spec
   rpmbuild --bb --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
            --with modules=1 --with default_version=1 --with prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.acml.4.0.1 cblas.spec

- python-setuptools::

   rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-setuptools/0.6c9/1.${disttag}.python2.4 python-setuptools.spec

- python-nose::

   rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-nose/0.10.4/1.${disttag}.python2.4 python-nose.spec
   module load python-setuptools
   module load python-nose

- numpy::

   # cblas_prefix=none: dotblas fails with acml
   rpmbuild --bb --with cblas_prefix=none \
            --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
            --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib \
            --with compiler=gfortran --with modules=1 --with default_version=1 \
            --with prefix=/opt/numpy/1.3.0/1.${disttag}.gfortran.python2.4.acml.4.0.1.acml numpy.spec
   rpmbuild --bb --with cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.blas.3.0.37.el5 --with blas_version=3.0.37.el5 \
            --with compiler=gfortran --with modules=1 --with default_version=1 \
            --with prefix=/opt/numpy/1.3.0/1.${disttag}.gfortran.python2.4.blas.3.0.37.el5.lapack numpy.spec

  Test with::

   module load numpy/1.3.0-1.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack
   python -c "import numpy; from numpy.core.multiarray import dot; b = numpy.ones(13, numpy.complex); dot(b, b)"
   python -c "import numpy; numpy.test()"
   module unload numpy/1.3.0-1.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack
   module load acml-gfortran64/4.0.1-1.el5.fys
   module load numpy/1.3.0-1.el5.fys.gfortran.python2.4.acml.4.0.1.acml
   python -c "import numpy; from numpy.core.multiarray import dot; b = numpy.ones(13, numpy.complex); dot(b, b)"
   python -c "import numpy; numpy.test()"
   module unload numpy/1.3.0-1.el5.fys.gfortran.python2.4.acml.4.0.1.acml
   module unload acml-gfortran64/4.0.1-1.el5.fys

  Load the default numpy::

   module load numpy/1.3.0-1.el5.fys.gfortran.python2.4.acml.4.0.1.acml
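The ``python -c`` one-liners used to test numpy can be expanded into a small standalone check (plain numpy API, no module environment assumed; ``numpy.complex`` is spelled as the builtin ``complex`` here, which also works on newer numpy releases):

```python
import numpy

# the same sanity check as the one-liners above: a complex dot product
# routed through numpy's (possibly BLAS-backed) dot
b = numpy.ones(13, complex)
result = numpy.dot(b, b)
print(result)  # sum of 13 ones: (13+0j)
assert result == 13 + 0j
```

If the underlying BLAS is misbuilt, this call typically crashes or returns garbage rather than raising a clean Python error, which is why it is run before the full ``numpy.test()`` suite.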
   rpmbuild -bb --with modules --with default_version --with prefix=/opt/gnuplot-py/1.8.1/1.${disttag}.python2.4 gnuplot-py.spec
|
   cd ~/${FYS_PLATFORM}/RPMS/external
   rpm -e --nodeps python-numeric  # as root
   rpmbuild --bb --with cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.acml.4.0.1 \
            --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
            --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib \
            --with compiler=gfortran --with modules=1 --with default_version=1 \
            --with prefix=/opt/python-numeric/24.2/4.${disttag}.gfortran.python2.4.acml.4.0.1.acml python-numeric.spec
   rpmbuild --bb --with cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.blas.3.0.37.el5 --with blas_version=3.0.37.el5 \
            --with compiler=gfortran --with modules=1 --with default_version=1 \
            --with prefix=/opt/python-numeric/24.2/4.${disttag}.gfortran.python2.4.blas.3.0.37.el5.lapack python-numeric.spec
|
   ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack | grep lapack_lite.so`
   ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack | grep _dotblas.so`
   module unload python-numeric/24.2-4.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack
   module load acml-gfortran64/4.0.1-1.el5.fys
   module load python-numeric/24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml
   python -c "import lapack_lite"
   ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml | grep lapack_lite.so`
   ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml | grep _dotblas.so`
   module unload python-numeric/24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml
   module unload acml-gfortran64/4.0.1-1.el5.fys
   rpm -ivh --oldpackage ~rpmbuild/${FYS_PLATFORM}/RPMS/external/python-numeric-23.*.rpm

Load the default `Numeric` version::

   module load acml-gfortran64/4.0.1-1.el5.fys
   module load python-numeric/24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml

- netcdf::

   rpmbuild --bb --with compiler=gfortran --with modules=1 --with default_version=1 \
            --with prefix=/opt/netcdf4/4.0.1/1.gfortran.${disttag} netcdf4.spec
   rpmbuild --bb --with compiler=gfortran43 --with modules=1 --with default_version=1 \
            --with prefix=/opt/netcdf4/4.0.1/1.gfortran43.${disttag} netcdf4.spec
   rpmbuild --bb --with compiler=pathscale --with compiler_bindir=/opt/pathscale/bin --with compiler_libdir=/opt/pathscale/lib/3.2 \
            --with modules=1 --with default_version=1 --with prefix=/opt/netcdf4/4.0.1/1.pathscale.${disttag} netcdf4.spec
   rpmbuild --bb --with Numeric_includedir=none --with numpy=numpy --with compiler=gfortran --with netcdf=netcdf4 \
            --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/include \
            --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/lib64 \
            --with modules=1 --with default_version=1 \
            --with prefix=/opt/ScientificPython/2.8/1.${disttag}.python2.4.serial_version.numpy ScientificPython.spec
   rpmbuild --bb --with numpy=numeric --with compiler=gfortran --with netcdf=netcdf4 \
            --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/include \
            --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/lib64 \
            --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
            --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
            --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
            --with prefix=/opt/ScientificPython/2.6.2/1.${disttag}.python2.4.openmpi.numeric ScientificPython.spec

- python-docutils::

   rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-docutils/0.5/1.${disttag}.python2.4 python-docutils.spec

- pytz::

   rpmbuild -bb --with modules --with default_version --with prefix=/opt/pytz/2008g/1.${disttag}.python2.4 pytz.spec

- python-dateutil::

   rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-dateutil/1.4.1/3.${disttag}.python2.4 python-dateutil.spec

- python-matplotlib::

   module load pytz
   module load python-docutils
   module load python-dateutil
   rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-matplotlib/0.98.5.2/1.${disttag}.python2.4 python-matplotlib.spec
   rpmbuild -bb --with modules --with default_version --with prefix=/opt/campos-ase2/2.3.13/1.${disttag}.python2.4 campos-ase2.spec
   rpmbuild -bb --with modules --with default_version --with prefix=/opt/campos-dacapo-python/0.9.4/1.${disttag}.python2.4 campos-dacapo-python.spec

- fftw2::

   rpmbuild --bb --with compiler=gfortran --with default_version=1 --with modules=1 --with prefix=/opt/fftw2/2.1.5/12.gfortran.${disttag} fftw2.spec
   rpmbuild --bb --with compiler=gfortran43 --with default_version=1 --with modules=1 --with prefix=/opt/fftw2/2.1.5/12.gfortran43.${disttag} fftw2.spec
   rpmbuild --bb --with compiler=pathscale --with compiler_bindir=/opt/pathscale/bin --with compiler_libdir=/opt/pathscale/lib/3.2 \
            --with default_version=1 --with modules=1 --with prefix=/opt/fftw2/2.1.5/12.pathscale.${disttag} fftw2.spec
   rpmbuild --bb --with blas=acml --with blas_version=4.2.0 --with blasdir=/opt/acml/4.2.0/gfortran4364/lib \
            --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
            --with fftw=fftw2 --with fftw_libdir=/opt/fftw2/2.1.5/12.gfortran43.${disttag}/lib64 \
            --with lapack=acml --with lapackdir=/opt/acml/4.2.0/gfortran4364/lib --with netcdf=netcdf4 \
            --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran43.${disttag}/include \
            --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran43.${disttag}/lib64 \
            --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
            --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
            --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
            --with prefix=/opt/campos-dacapo/2.7.16/1.${disttag}.gfortran43.openmpi.acml.4.2.0.acml campos-dacapo.spec
   rpmbuild --bb --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
            --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
            --with fftw=fftw2 --with fftw_libdir=/opt/fftw2/2.1.5/12.gfortran.${disttag}/lib64 \
            --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib --with netcdf=netcdf4 \
            --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/include \
            --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/lib64 \
            --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
            --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
            --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
            --with prefix=/opt/campos-dacapo/2.7.16/1.${disttag}.gfortran.openmpi.acml.4.0.1.acml campos-dacapo.spec
   rpmbuild --bb --with blas=blas --with blas_version=3.0.37.el5 --with blasdir=/usr/lib64 \
            --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
            --with fftw=fftw2 --with fftw_libdir=/opt/fftw2/2.1.5/12.gfortran43.${disttag}/lib64 \
            --with lapack=lapack --with lapackdir=/usr/lib64 --with netcdf=netcdf4 \
            --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran43.${disttag}/include \
            --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran43.${disttag}/lib64 \
            --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
            --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
            --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
            --with prefix=/opt/campos-dacapo/2.7.16/1.${disttag}.gfortran43.openmpi.blas.3.0.37.el5.lapack campos-dacapo.spec
   rpmbuild --bb --with blas=blas --with blas_version=3.0.37.el5 --with blasdir=/usr/lib64 \
            --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
            --with fftw=fftw2 --with fftw_libdir=/opt/fftw2/2.1.5/12.gfortran.${disttag}/lib64 \
            --with lapack=lapack --with lapackdir=/usr/lib64 --with netcdf=netcdf4 \
            --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/include \
            --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/lib64 \
            --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
            --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
            --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
            --with prefix=/opt/campos-dacapo/2.7.16/1.${disttag}.gfortran.openmpi.blas.3.0.37.el5.lapack campos-dacapo.spec

Build the following for gpaw:

- fftw3 (built from the fftw2 spec file)::

   rpmbuild --bb --with major_version=3 --with version1=2 --with version2=1 --with compiler=gfortran \
            --with default_version=1 --with modules=1 --with prefix=/opt/fftw3/3.2.1/12.gfortran.${disttag} fftw2.spec
   rpmbuild --bb --with major_version=3 --with version1=2 --with version2=1 --with compiler=gfortran43 \
            --with default_version=1 --with modules=1 --with prefix=/opt/fftw3/3.2.1/12.gfortran43.${disttag} fftw2.spec

- scipy::

   module unload numpy
   module load numpy/1.3.0-1.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack
   rpmbuild --bb --with cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.blas.3.0.37.el5 \
            --with blas=blas --with blas_version=3.0.37.el5 --with blasdir=/usr/lib64 \
            --with lapack=lapack --with lapackdir=/usr/lib64 \
            --with ufsparse_libdir=/usr/lib64 --with ufsparse_includedir=/usr/include/suitesparse \
            --with fftw_libdir=/opt/fftw3/3.2.1/12.gfortran.${disttag}/lib64 \
            --with compiler=gfortran --with default_version=1 --with modules=1 \
            --with prefix=/opt/scipy/0.7.0/1.${disttag}.gfortran.python2.4.blas.3.0.37.el5.lapack scipy.spec
   module unload numpy
   module load numpy/1.3.0-1.el5.fys.gfortran.python2.4.acml.4.0.1.acml
   rpmbuild --bb --with cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.acml.4.0.1 \
            --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
            --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib \
            --with ufsparse_libdir=/usr/lib64 --with ufsparse_includedir=/usr/include/suitesparse \
            --with fftw_libdir=/opt/fftw3/3.2.1/12.gfortran.${disttag}/lib64 \
            --with compiler=gfortran --with default_version=1 --with modules=1 \
            --with prefix=/opt/scipy/0.7.0/1.${disttag}.gfortran.python2.4.acml.4.0.1.acml scipy.spec
   rpmbuild -bb --with default_version --with modules campos-gpaw-setups.spec
   rpmbuild -bb --with modules --with default_version --with prefix=/opt/campos-ase3/3.1.0.846/1.${disttag}.python2.4 campos-ase3.spec

- povray::

   rpmbuild -bb --with modules --with default_version --with prefix=/opt/povray/3.6.1/3.${disttag} povray.spec

- python-jinja2::

   rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-jinja2/2.1.1/1.${disttag}.python2.4 python-jinja2.spec

- python-pygments::

   rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-pygments/1.0/1.${disttag}.python2.4 python-pygments.spec

- babel::

   rpmbuild -bb --with modules --with default_version --with prefix=/opt/babel/0.9.4/1.${disttag}.python2.4 babel.spec

- python-sphinx::

   module load python-jinja2
   rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-sphinx/0.6.1/1.${disttag}.python2.4 python-sphinx.spec

- blacs::

   rpmbuild --bb --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
            --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
            --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
            --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
            --with prefix=/opt/blacs/1.1/24.${disttag}.gfortran43.openmpi blacs.spec
   rpmbuild --bb --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
            --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
            --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
            --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
            --with prefix=/opt/blacs/1.1/24.${disttag}.gfortran.openmpi blacs.spec

- scalapack::

   rpmbuild --bb --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran43.openmpi/lib64 \
            --with blas=acml --with blas_version=4.2.0 --with blasdir=/opt/acml/4.2.0/gfortran4364/lib \
            --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
            --with default_version=1 --with modules=1 --with openmpi=openmpi \
            --with lapack=acml --with lapackdir=/opt/acml/4.2.0/gfortran4364/lib \
            --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
            --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
            --with prefix=/opt/scalapack/1.8.0/1.${disttag}.gfortran43.openmpi.acml.4.2.0.acml scalapack.spec
   rpmbuild --bb --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran.openmpi/lib64 \
            --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
            --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
            --with default_version=1 --with modules=1 --with openmpi=openmpi \
            --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib \
            --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
            --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
            --with prefix=/opt/scalapack/1.8.0/1.${disttag}.gfortran.openmpi.acml.4.0.1.acml scalapack.spec
   rpmbuild --bb --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran43.openmpi/lib64 \
            --with blas=blas --with blas_version=3.0.37.el5 --with blasdir=/usr/lib64 \
            --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
            --with default_version=1 --with modules=1 --with openmpi=openmpi \
            --with lapack=lapack --with lapackdir=/usr/lib64 \
            --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
            --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
            --with prefix=/opt/scalapack/1.8.0/1.${disttag}.gfortran43.openmpi.blas.3.0.37.el5.lapack scalapack.spec
   rpmbuild --bb --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran.openmpi/lib64 \
            --with blas=blas --with blas_version=3.0.37.el5 --with blasdir=/usr/lib64 \
            --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
            --with default_version=1 --with modules=1 --with openmpi=openmpi \
            --with lapack=lapack --with lapackdir=/usr/lib64 \
            --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
            --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
            --with prefix=/opt/scalapack/1.8.0/1.${disttag}.gfortran.openmpi.blas.3.0.37.el5.lapack scalapack.spec
   rpmbuild --bb --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran.openmpi/lib64 \
            --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
            --with cmr_repository=/home/niflheim/repository/db \
            --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
            --with default_version=1 --with modules=1 \
            --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib \
            --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
            --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
            --with openmpi=openmpi --with parallel=1 \
            --with scalapack=scalapack --with scalapackdir=/opt/scalapack/1.8.0/1.${disttag}.gfortran.openmpi.acml.4.0.1.acml/lib64 \
            --with prefix=/opt/campos-gpaw/0.6.3862/1.${disttag}.gfortran.python2.4.openmpi.acml.4.0.1.acml campos-gpaw.spec
   rpmbuild --bb --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran43.openmpi/lib64 \
            --with blas=acml --with blas_version=4.2.0 --with blasdir=/opt/acml/4.2.0/gfortran4364/lib \
            --with cmr_repository=/home/niflheim/repository/db \
            --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
            --with default_version=1 --with modules=1 \
            --with lapack=acml --with lapackdir=/opt/acml/4.2.0/gfortran4364/lib \
            --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
            --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
            --with openmpi=openmpi --with parallel=1 \
            --with scalapack=scalapack --with scalapackdir=/opt/scalapack/1.8.0/1.${disttag}.gfortran43.openmpi.acml.4.2.0.acml/lib64 \
            --with prefix=/opt/campos-gpaw/0.6.3862/1.${disttag}.gfortran43.python2.4.openmpi.acml.4.2.0.acml campos-gpaw.spec
This page describes the necessary steps for setting up the login nodes fjorm or thul.
Install external packages
As root:
create yum repository definitions (do not enable them):
    # atrpms
    echo '[atrpms]' > /etc/yum.repos.d/atrpms.repo
    echo 'name=CentOS $releasever - $basearch - ATrpms' >> /etc/yum.repos.d/atrpms.repo
    echo 'baseurl=http://dl.atrpms.net/el$releasever-$basearch/atrpms/stable' >> /etc/yum.repos.d/atrpms.repo
    echo '#baseurl=http://mirrors.ircam.fr/pub/atrpms/el$releasever-$basearch/atrpms/stable' >> /etc/yum.repos.d/atrpms.repo
    echo 'gpgkey=http://ATrpms.net/RPM-GPG-KEY.atrpms' >> /etc/yum.repos.d/atrpms.repo
    echo 'gpgcheck=1' >> /etc/yum.repos.d/atrpms.repo
    echo 'enabled=0' >> /etc/yum.repos.d/atrpms.repo

    # epel
    echo '[epel]' > /etc/yum.repos.d/epel.repo
    echo 'name=CentOS $releasever - $basearch - EPEL' >> /etc/yum.repos.d/epel.repo
    echo 'baseurl=http://download.fedora.redhat.com/pub/epel/$releasever/$basearch' >> /etc/yum.repos.d/epel.repo
    echo 'gpgkey=http://download.fedora.redhat.com/pub/epel/RPM-GPG-KEY-EPEL' >> /etc/yum.repos.d/epel.repo
    echo 'gpgcheck=1' >> /etc/yum.repos.d/epel.repo
    echo 'enabled=0' >> /etc/yum.repos.d/epel.repo
install, as root:
    yum install yum-utils
    # /var directories must be created
    yum search --enablerepo=atrpms arpack-devel
    yum search --enablerepo=epel jmol
configure rpmbuild (as rpmbuild):
enable modules - add to ~/.bashrc:
    if [ -r "/home/camp/modulefiles.sh" ]; then
        source /home/camp/modulefiles.sh
    fi
set the FYS_PLATFORM variable:
    export FYS_PLATFORM=Intel-Nehalem-el5  # thul
    export FYS_PLATFORM=AMD-Opteron-el5    # fjorm
Note that this variable will be set automatically by /home/camp/modulefiles.sh after the environment-modules package is installed.
use the following ~rpmbuild/.rpmmacros:
    %disttag el5.fys
    %packager rpmbuild@fysik.dtu.dk
    %distribution Fysik RPMS
    %vendor Fysik RPMS <rpm@fysik.dtu.dk>
    %_signature gpg
    %_gpg_path ~/.gnupg
    %_gpg_name Fysik RPMS
    %_topdir /home/camp/rpmbuild/%(echo $FYS_PLATFORM)
    %_rpmdir %{_topdir}/RPMS
    %_srcrpmdir %{_topdir}/SRPMS
    %_svndir /home/camp/rpmbuild/rpmbuild
    %_specdir %{_svndir}/SPECS
    %_sourcedir %{_svndir}/SOURCES
    %_rpmfilename %%{NAME}-%%{VERSION}-%%{RELEASE}.%%{ARCH}.rpm
    %_builddir /tmp
    %_tmppath %{_topdir}/BUILD
    # no debuginfo
    %debug_package %{nil}
    # don't strip (this does not fully work)
    # https://bugzilla.redhat.com/show_bug.cgi?id=219731
    %define __strip /bin/true
    %niflheim 1
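The %_topdir macro above expands $FYS_PLATFORM, so each platform gets its own build tree. A minimal sketch of how the path resolves (platform value taken from the text above):

```shell
# Sketch: how %_topdir resolves once FYS_PLATFORM is set (value for fjorm)
FYS_PLATFORM=AMD-Opteron-el5
topdir="/home/camp/rpmbuild/${FYS_PLATFORM}"
echo "$topdir"
```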
as rpmbuild create directories:
    mkdir -p ~/${FYS_PLATFORM}/RPMS
    mkdir -p ~/${FYS_PLATFORM}/SRPMS
    mkdir -p ~/${FYS_PLATFORM}/BUILD
    mkdir -p ~/${FYS_PLATFORM}/SPECS    # needed only by openmpi
    mkdir -p ~/${FYS_PLATFORM}/SOURCES  # needed only by openmpi
install official packages, as rpmbuild:
    cd ~/${FYS_PLATFORM}/RPMS
    mkdir external; cd external
    yumdownloader --resolve gcc-gfortran gcc43-c++ gcc43-gfortran blas-devel lapack-devel python-devel
    yumdownloader --resolve gnuplot libXi-devel xorg-x11-fonts-100dpi pexpect tetex-latex tkinter qt-devel
    yumdownloader --resolve openmpi openmpi-devel openmpi-libs compat-dapl libibverbs librdmacm openib
    yumdownloader --resolve pygtk2-devel gtk2-devel tk-devel agg ghostscript libtiff-devel
    yum localinstall *  # as root
install atrpms packages, as rpmbuild (vtk-python was unavailable as of 16 Apr 2009):
    cd ~/${FYS_PLATFORM}/RPMS/external
    yumdownloader --resolve --enablerepo=atrpms vtk-python arpack-devel graphviz
    wget http://ATrpms.net/RPM-GPG-KEY.atrpms
    rpm --import RPM-GPG-KEY.atrpms  # as root
    yum localinstall *  # as root
install the packages from epel, as rpmbuild:
    cd ~/${FYS_PLATFORM}/RPMS/external
    yumdownloader --resolve --enablerepo=epel jmol
    yumdownloader --resolve --enablerepo=epel environment-modules suitesparse-devel
    wget http://download.fedora.redhat.com/pub/epel/RPM-GPG-KEY-EPEL
    rpm --import RPM-GPG-KEY-EPEL  # as root
    yum localinstall *  # as root
remove default openmpi:
yum remove openmpi openmpi-libs
edit /etc/yum.conf so it contains:
exclude=netcdf-* netcdf3-* fftw-* fftw2-* fftw3-* python-numeric openmpi-*
logout and login again (as rpmbuild) to activate the modules settings from ~/.bashrc.
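Before relying on the new login shell, it is worth checking that the hook was actually added. A minimal sketch (using a temporary file as a stand-in for ~/.bashrc):

```shell
# Sketch: confirm the modulefiles.sh hook is present in the shell init file
bashrc=$(mktemp)   # stand-in for ~/.bashrc in this sketch
cat > "$bashrc" <<'EOF'
if [ -r "/home/camp/modulefiles.sh" ]; then
    source /home/camp/modulefiles.sh
fi
EOF
hook_present=no
grep -q 'modulefiles.sh' "$bashrc" && hook_present=yes
echo "$hook_present"
rm -f "$bashrc"
```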
It's time to build custom RPMS
As rpmbuild:
cd ~/rpmbuild/SPECS
Install icc/ifort compilers, enable them (only on thul):
    . /opt/intel/Compiler/11.0/083/bin/intel64/ifortvars_intel64.sh
    . /opt/intel/Compiler/11.0/083/bin/intel64/iccvars_intel64.sh
Build intel compatibility packages (only on thul):
rpmbuild -bb --with modules --with default_version intel-redist.spec
Note: do not install the resulting RPMS on the login node. They need to be installed only on compute nodes. On the login node only the module file needs to be deployed (as root):
    mkdir -p /etc/modulefiles/intel
    cp /tmp/intel-11.0.083/11.0.083-1.intel64.el5.fys /etc/modulefiles/intel
Note: the above module file should contain at least:
    prepend-path LD_LIBRARY_PATH /opt/intel/Compiler/11.0/083/lib/intel64
    prepend-path PATH /opt/intel/Compiler/11.0/083/bin/intel64
    prepend-path MANPATH /opt/intel/Compiler/11.0/083/man
Install mkl, and build mkl compatibility package (only on thul):
    cd ~/rpmbuild/SOURCES
    mkdir 10.1.3.027
    cp -r /opt/intel/mkl/10.1.3.027/lib 10.1.3.027
    cp -r /opt/intel/mkl/10.1.3.027/doc 10.1.3.027
    tar zcf intel-redist-mkl-10.1.3.027.tar.gz 10.1.3.027
    cd ~/rpmbuild/SPECS
    rpmbuild -bb --with modules --with default_version intel-redist-mkl.spec
Note: do not install the resulting RPM on the login node. It needs to be installed only on compute nodes. On the login node only the module file needs to be deployed (as root):
    mkdir -p /etc/modulefiles/mkl
    cp /tmp/intel-mkl-10.1.3.027/10.1.3.027-1.el5.fys.em64t /etc/modulefiles/mkl
Note: the above module file should contain at least:
prepend-path LD_LIBRARY_PATH /opt/intel/mkl/10.1.3.027/lib/em64t
Build a custom openmpi with torque support:
    export rpmtopdir=${HOME}/${FYS_PLATFORM}  # set this to the _topdir value from ~/.rpmmacros
    wget http://www.open-mpi.org/software/ompi/v1.3/downloads/openmpi-1.3.2.tar.bz2 \
         -O ~/rpmbuild/SOURCES/openmpi-1.3.2.tar.bz2
    sh ./buildrpm-1.3.2-1.gfortran.sh ../SOURCES/openmpi-1.3.2.tar.bz2 2>&1 | tee buildrpm-1.3.2-1.gfortran.sh.log.${FYS_PLATFORM}
    sh ./buildrpm-1.3.2-1.gfortran43.sh ../SOURCES/openmpi-1.3.2.tar.bz2 2>&1 | tee buildrpm-1.3.2-1.gfortran43.sh.log.${FYS_PLATFORM}
    sh ./buildrpm-1.3.2-1.pathscale.sh ../SOURCES/openmpi-1.3.2.tar.bz2 2>&1 | tee buildrpm-1.3.2-1.pathscale.sh.log.${FYS_PLATFORM}
    sh ./buildrpm-1.3.2-1.ifort.sh ../SOURCES/openmpi-1.3.2.tar.bz2 2>&1 | tee buildrpm-1.3.2-1.ifort.sh.log.${FYS_PLATFORM}  # thul only
Note: intel openmpi needs to be installed ignoring dependencies:
rpm -ivh --nodeps --oldpackage ~rpmbuild/${FYS_PLATFORM}/RPMS/openmpi-1.3.2-1.ifort.el5.fys.x86_64.rpm
To build up scripts containing ALL install/uninstall commands (global_install.sh and global_uninstall.sh), run the following after every successfully built RPM:
    grep -v "#\!" install.sh >> ~/${FYS_PLATFORM}/global_install.sh
    cat uninstall.sh ~/${FYS_PLATFORM}/global_uninstall.sh | grep -v "#\!" >> ~/${FYS_PLATFORM}/global_uninstall.sh.tmp && \
        mv -f ~/${FYS_PLATFORM}/global_uninstall.sh.tmp ~/${FYS_PLATFORM}/global_uninstall.sh
    # ignore "cat: .../global_uninstall.sh: No such ..." error when running first time
Note that global_uninstall.sh only uninstalls the packages; it does not remove the built RPM files.
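The append step above strips the shebang line so that global_install.sh stays a plain list of commands. A minimal sketch of that step (file names and the rpm command are stand-ins):

```shell
# Sketch: strip the "#!" line from install.sh before appending it
workdir=$(mktemp -d)
cd "$workdir"
printf '#!/bin/sh\nrpm -ivh example.rpm\n' > install.sh   # stand-in install.sh
grep -v "#\!" install.sh >> global_install.sh
result=$(cat global_install.sh)
echo "$result"
```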
Build the following for dacapo:
set the disttag variable for convenience:
export disttag="el5.fys"
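The long --with prefix=... paths in the commands below all follow one naming scheme: /opt/&lt;name&gt;/&lt;version&gt;/&lt;release&gt;.&lt;compiler&gt;.&lt;disttag&gt; (sometimes with extra BLAS/MPI suffixes). A tiny sketch composing one of them, using the fftw2 values that appear later on this page:

```shell
# Sketch: composing an install prefix from its parts (fftw2 example)
disttag="el5.fys"
name=fftw2; version=2.1.5; release=12; compiler=gfortran
prefix="/opt/${name}/${version}/${release}.${compiler}.${disttag}"
echo "$prefix"
```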
acml:
    rpmbuild -bb --with compiler=gfortran --with version1=0 --with version2=1 --with modules --with default_version acml.spec
    rpmbuild -bb --with compiler=pathscale --with version1=0 --with version2=1 --with modules --with default_version acml.spec
    rpmbuild -bb --with compiler=gfortran43 --with version1=1 --with version2=0 --with modules --with default_version acml.spec  # fjorm only
    rpmbuild -bb --with compiler=pathscale --with version1=1 --with version2=0 --with modules --with default_version acml.spec
    rpmbuild -bb --with compiler=gfortran43 --with version1=2 --with version2=0 --with modules --with default_version acml.spec
    rpmbuild -bb --with compiler=pathscale --with version1=2 --with version2=0 --with modules --with default_version acml.spec
    rpmbuild -bb --with compiler=ifort --with version1=2 --with version2=0 --with modules --with default_version acml.spec  # thul only
goto (only on fjorm):
    rpmbuild --bb --with compiler=gfortran --with modules=1 --with default_version=1 \
             --with prefix=/opt/goto/1.26/1.${disttag}.gfortran.smp goto.spec
    rpmbuild --bb --with compiler=gfortran43 --with modules=1 --with default_version=1 \
             --with prefix=/opt/goto/1.26/1.${disttag}.gfortran43.smp goto.spec
    rpmbuild --bb --with compiler=pathscale --with compiler_bindir=/opt/pathscale/bin --with compiler_libdir=/opt/pathscale/lib/3.2 \
             --with modules=1 --with default_version=1 --with prefix=/opt/goto/1.26/1.${disttag}.pathscale.smp goto.spec

Note - the 1.26 version fails on Nehalem with:

    ../../../param.h:1195:21: error: division by zero in #if
atlas:
    rpmbuild --bb --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
             --with modules=1 --with default_version=1 --with prefix=/opt/atlas/3.8.3/1.gfortran.${disttag} atlas.spec
    rpmbuild --bb --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
             --with modules=1 --with default_version=1 --with prefix=/opt/atlas/3.8.3/1.gfortran43.${disttag} atlas.spec
    rpmbuild --bb --with compiler=pathscale --with compiler_bindir=/opt/pathscale/bin --with compiler_libdir=/opt/pathscale/lib/3.2 \
             --with modules=1 --with default_version=1 --with prefix=/opt/atlas/3.8.3/1.pathscale.${disttag} atlas.spec
campos-dacapo-pseudopotentials:
rpmbuild -bb --with modules --with default_version campos-dacapo-pseudopotentials.spec
RasMol:
rpmbuild -bb --with modules --with default_version --with prefix=/opt/RasMol/2.7.3/3.${disttag} RasMol.spec
cblas:
    rpmbuild --bb --with blas_version=3.0.37.el5 --with modules=1 --with default_version=1 \
             --with prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.blas.3.0.37.el5 cblas.spec
    rpmbuild --bb --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
             --with modules=1 --with default_version=1 --with prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.acml.4.0.1 cblas.spec
python-setuptools:
rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-setuptools/0.6c9/1.${disttag}.python2.4 python-setuptools.spec
python-nose:
    rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-nose/0.10.4/1.${disttag}.python2.4 python-nose.spec
    module load python-setuptools
    module load python-nose
numpy:
    # dotblas fails with acml, hence cblas_prefix=none
    rpmbuild --bb --with cblas_prefix=none \
             --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
             --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib \
             --with compiler=gfortran --with modules=1 --with default_version=1 \
             --with prefix=/opt/numpy/1.3.0/1.${disttag}.gfortran.python2.4.acml.4.0.1.acml numpy.spec
    rpmbuild --bb --with cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.blas.3.0.37.el5 --with blas_version=3.0.37.el5 \
             --with compiler=gfortran --with modules=1 --with default_version=1 \
             --with prefix=/opt/numpy/1.3.0/1.${disttag}.gfortran.python2.4.blas.3.0.37.el5.lapack numpy.spec
Test with:
    module load numpy/1.3.0-1.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack
    python -c "import numpy; from numpy.core.multiarray import dot; b = numpy.ones(13, numpy.complex); dot(b, b)"
    python -c "import numpy; numpy.test()"
    module unload numpy/1.3.0-1.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack
    module load acml-gfortran64/4.0.1-1.el5.fys
    module load numpy/1.3.0-1.el5.fys.gfortran.python2.4.acml.4.0.1.acml
    python -c "import numpy; from numpy.core.multiarray import dot; b = numpy.ones(13, numpy.complex); dot(b, b)"
    python -c "import numpy; numpy.test()"
    module unload numpy/1.3.0-1.el5.fys.gfortran.python2.4.acml.4.0.1.acml
    module unload acml-gfortran64/4.0.1-1.el5.fys
Load the default numpy:
module load numpy/1.3.0-1.el5.fys.gfortran.python2.4.acml.4.0.1.acml
gnuplot-py:
    rpmbuild -bb --with modules --with default_version --with prefix=/opt/gnuplot-py/1.8.1/1.${disttag}.python2.4 gnuplot-py.spec
    module load gnuplot-py
python-numeric (we must install the 24.2 version, while keeping the default version as well):
    cd ~/${FYS_PLATFORM}/RPMS/external
    rpm -e --nodeps python-numeric  # as root
    yumdownloader --resolve --disableexcludes=main python-numeric
    cd ~/rpmbuild/SPECS
    rpmbuild --bb --with cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.acml.4.0.1 \
             --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
             --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib \
             --with compiler=gfortran --with modules=1 --with default_version=1 \
             --with prefix=/opt/python-numeric/24.2/4.${disttag}.gfortran.python2.4.acml.4.0.1.acml python-numeric.spec
    rpmbuild --bb --with cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.blas.3.0.37.el5 --with blas_version=3.0.37.el5 \
             --with compiler=gfortran --with modules=1 --with default_version=1 \
             --with prefix=/opt/python-numeric/24.2/4.${disttag}.gfortran.python2.4.blas.3.0.37.el5.lapack python-numeric.spec
Note (16 Apr 2009): Numeric's test.py currently fails with the following error, which we ignore:
    *** glibc detected *** python: free(): invalid next size (normal): 0x09aee970 ***
After installing python-numeric make a very rough check:
    module load python-numeric/24.2-4.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack
    python -c "import lapack_lite"
    ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack | grep lapack_lite.so`
    ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack | grep _dotblas.so`
    module unload python-numeric/24.2-4.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack
    module load acml-gfortran64/4.0.1-1.el5.fys
    module load python-numeric/24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml
    python -c "import lapack_lite"
    ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml | grep lapack_lite.so`
    ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml | grep _dotblas.so`
    module unload python-numeric/24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml
    module unload acml-gfortran64/4.0.1-1.el5.fys
and reinstall the default version:
rpm -ivh --oldpackage ~rpmbuild/${FYS_PLATFORM}/RPMS/external/python-numeric-23.*.rpm
load the default Numeric version:
    module load acml-gfortran64/4.0.1-1.el5.fys
    module load python-numeric/24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml
netcdf:
    rpmbuild --bb --with compiler=gfortran --with modules=1 --with default_version=1 \
             --with prefix=/opt/netcdf4/4.0.1/1.gfortran.${disttag} netcdf4.spec
    rpmbuild --bb --with compiler=gfortran43 --with modules=1 --with default_version=1 \
             --with prefix=/opt/netcdf4/4.0.1/1.gfortran43.${disttag} netcdf4.spec
    rpmbuild --bb --with compiler=pathscale --with compiler_bindir=/opt/pathscale/bin --with compiler_libdir=/opt/pathscale/lib/3.2 \
             --with modules=1 --with default_version=1 --with prefix=/opt/netcdf4/4.0.1/1.pathscale.${disttag} netcdf4.spec
ScientificPython:
    rpmbuild --bb --with Numeric_includedir=none --with numpy=numpy --with compiler=gfortran --with netcdf=netcdf4 \
             --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/include \
             --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/lib64 \
             --with modules=1 --with default_version=1 \
             --with prefix=/opt/ScientificPython/2.8/1.${disttag}.python2.4.serial_version.numpy ScientificPython.spec
    rpmbuild --bb --with numpy=numeric --with compiler=gfortran --with netcdf=netcdf4 \
             --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/include \
             --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/lib64 \
             --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
             --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
             --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
             --with prefix=/opt/ScientificPython/2.6.2/1.${disttag}.python2.4.openmpi.numeric ScientificPython.spec
python-docutils:
rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-docutils/0.5/1.${disttag}.python2.4 python-docutils.spec
pytz:
rpmbuild -bb --with modules --with default_version --with prefix=/opt/pytz/2008g/1.${disttag}.python2.4 pytz.spec
python-dateutil:
rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-dateutil/1.4.1/3.${disttag}.python2.4 python-dateutil.spec
python-matplotlib:
    module load pytz
    module load python-docutils
    module load python-dateutil
    rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-matplotlib/0.98.5.2/1.${disttag}.python2.4 python-matplotlib.spec
campos-ase2:
rpmbuild -bb --with modules --with default_version --with prefix=/opt/campos-ase2/2.3.13/1.${disttag}.python2.4 campos-ase2.spec
campos-dacapo-python:
rpmbuild -bb --with modules --with default_version --with prefix=/opt/campos-dacapo-python/0.9.4/1.${disttag}.python2.4 campos-dacapo-python.spec
fftw2:
    rpmbuild --bb --with compiler=gfortran --with default_version=1 --with modules=1 \
             --with prefix=/opt/fftw2/2.1.5/12.gfortran.${disttag} fftw2.spec
    rpmbuild --bb --with compiler=gfortran43 --with default_version=1 --with modules=1 \
             --with prefix=/opt/fftw2/2.1.5/12.gfortran43.${disttag} fftw2.spec
    rpmbuild --bb --with compiler=pathscale --with compiler_bindir=/opt/pathscale/bin --with compiler_libdir=/opt/pathscale/lib/3.2 \
             --with default_version=1 --with modules=1 --with prefix=/opt/fftw2/2.1.5/12.pathscale.${disttag} fftw2.spec
campos-dacapo:
    rpmbuild --bb --with blas=acml --with blas_version=4.2.0 --with blasdir=/opt/acml/4.2.0/gfortran4364/lib \
             --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
             --with fftw=fftw2 --with fftw_libdir=/opt/fftw2/2.1.5/12.gfortran43.${disttag}/lib64 \
             --with lapack=acml --with lapackdir=/opt/acml/4.2.0/gfortran4364/lib --with netcdf=netcdf4 \
             --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran43.${disttag}/include \
             --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran43.${disttag}/lib64 \
             --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
             --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
             --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
             --with prefix=/opt/campos-dacapo/2.7.16/1.${disttag}.gfortran43.openmpi.acml.4.2.0.acml campos-dacapo.spec
    rpmbuild --bb --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
             --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
             --with fftw=fftw2 --with fftw_libdir=/opt/fftw2/2.1.5/12.gfortran.${disttag}/lib64 \
             --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib --with netcdf=netcdf4 \
             --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/include \
             --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/lib64 \
             --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
             --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
             --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
             --with prefix=/opt/campos-dacapo/2.7.16/1.${disttag}.gfortran.openmpi.acml.4.0.1.acml campos-dacapo.spec
    rpmbuild --bb --with blas=blas --with blas_version=3.0.37.el5 --with blasdir=/usr/lib64 \
             --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
             --with fftw=fftw2 --with fftw_libdir=/opt/fftw2/2.1.5/12.gfortran43.${disttag}/lib64 \
             --with lapack=lapack --with lapackdir=/usr/lib64 --with netcdf=netcdf4 \
             --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran43.${disttag}/include \
             --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran43.${disttag}/lib64 \
             --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
             --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
             --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
             --with prefix=/opt/campos-dacapo/2.7.16/1.${disttag}.gfortran43.openmpi.blas.3.0.37.el5.lapack campos-dacapo.spec
    rpmbuild --bb --with blas=blas --with blas_version=3.0.37.el5 --with blasdir=/usr/lib64 \
             --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
             --with fftw=fftw2 --with fftw_libdir=/opt/fftw2/2.1.5/12.gfortran.${disttag}/lib64 \
             --with lapack=lapack --with lapackdir=/usr/lib64 --with netcdf=netcdf4 \
             --with netcdf_includedir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/include \
             --with netcdf_libdir=/opt/netcdf4/4.0.1/1.gfortran.${disttag}/lib64 \
             --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
             --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
             --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
             --with prefix=/opt/campos-dacapo/2.7.16/1.${disttag}.gfortran.openmpi.blas.3.0.37.el5.lapack campos-dacapo.spec
Build the following for gpaw:
fftw3:
    rpmbuild --bb --with major_version=3 --with version1=2 --with version2=1 --with compiler=gfortran \
             --with default_version=1 --with modules=1 --with prefix=/opt/fftw3/3.2.1/12.gfortran.${disttag} fftw2.spec
    rpmbuild --bb --with major_version=3 --with version1=2 --with version2=1 --with compiler=gfortran43 \
             --with default_version=1 --with modules=1 --with prefix=/opt/fftw3/3.2.1/12.gfortran43.${disttag} fftw2.spec
scipy:
    module unload numpy
    module load numpy/1.3.0-1.el5.fys.gfortran.python2.4.blas.3.0.37.el5.lapack
    rpmbuild --bb --with cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.blas.3.0.37.el5 \
             --with blas=blas --with blas_version=3.0.37.el5 --with blasdir=/usr/lib64 \
             --with lapack=lapack --with lapackdir=/usr/lib64 \
             --with ufsparse_libdir=/usr/lib64 --with ufsparse_includedir=/usr/include/suitesparse \
             --with fftw_libdir=/opt/fftw3/3.2.1/12.gfortran.${disttag}/lib64 \
             --with compiler=gfortran --with default_version=1 --with modules=1 \
             --with prefix=/opt/scipy/0.7.0/1.${disttag}.gfortran.python2.4.blas.3.0.37.el5.lapack scipy.spec
    module unload numpy
    module load numpy/1.3.0-1.el5.fys.gfortran.python2.4.acml.4.0.1.acml
    rpmbuild --bb --with cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.gfortran.acml.4.0.1 \
             --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
             --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib \
             --with ufsparse_libdir=/usr/lib64 --with ufsparse_includedir=/usr/include/suitesparse \
             --with fftw_libdir=/opt/fftw3/3.2.1/12.gfortran.${disttag}/lib64 \
             --with compiler=gfortran --with default_version=1 --with modules=1 \
             --with prefix=/opt/scipy/0.7.0/1.${disttag}.gfortran.python2.4.acml.4.0.1.acml scipy.spec
campos-gpaw-setups:
rpmbuild -bb --with default_version --with modules campos-gpaw-setups.spec
campos-ase3:
rpmbuild -bb --with modules --with default_version --with prefix=/opt/campos-ase3/3.1.0.846/1.${disttag}.python2.4 campos-ase3.spec
povray:
rpmbuild -bb --with modules --with default_version --with prefix=/opt/povray/3.6.1/3.${disttag} povray.spec
python-jinja2:
rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-jinja2/2.1.1/1.${disttag}.python2.4 python-jinja2.spec
python-pygments:
rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-pygments/1.0/1.${disttag}.python2.4 python-pygments.spec
babel:
rpmbuild -bb --with modules --with default_version --with prefix=/opt/babel/0.9.4/1.${disttag}.python2.4 babel.spec
python-sphinx:
    module load python-jinja2
    rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-sphinx/0.6.1/1.${disttag}.python2.4 python-sphinx.spec
blacs:
    rpmbuild --bb --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
             --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
             --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
             --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
             --with prefix=/opt/blacs/1.1/24.${disttag}.gfortran43.openmpi blacs.spec
    rpmbuild --bb --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
             --with modules=1 --with default_version=1 --with parallel=1 --with openmpi=openmpi \
             --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
             --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
             --with prefix=/opt/blacs/1.1/24.${disttag}.gfortran.openmpi blacs.spec
scalapack:
    rpmbuild --bb --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran43.openmpi/lib64 \
             --with blas=acml --with blas_version=4.2.0 --with blasdir=/opt/acml/4.2.0/gfortran4364/lib \
             --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
             --with default_version=1 --with modules=1 --with openmpi=openmpi \
             --with lapack=acml --with lapackdir=/opt/acml/4.2.0/gfortran4364/lib \
             --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
             --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
             --with prefix=/opt/scalapack/1.8.0/1.${disttag}.gfortran43.openmpi.acml.4.2.0.acml scalapack.spec
    rpmbuild --bb --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran.openmpi/lib64 \
             --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
             --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
             --with default_version=1 --with modules=1 --with openmpi=openmpi \
             --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib \
             --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
             --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
             --with prefix=/opt/scalapack/1.8.0/1.${disttag}.gfortran.openmpi.acml.4.0.1.acml scalapack.spec
    rpmbuild --bb --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran43.openmpi/lib64 \
             --with blas=blas --with blas_version=3.0.37.el5 --with blasdir=/usr/lib64 \
             --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
             --with default_version=1 --with modules=1 --with openmpi=openmpi \
             --with lapack=lapack --with lapackdir=/usr/lib64 \
             --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
             --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
             --with prefix=/opt/scalapack/1.8.0/1.${disttag}.gfortran43.openmpi.blas.3.0.37.el5.lapack scalapack.spec
    rpmbuild --bb --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran.openmpi/lib64 \
             --with blas=blas --with blas_version=3.0.37.el5 --with blasdir=/usr/lib64 \
             --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
             --with default_version=1 --with modules=1 --with openmpi=openmpi \
             --with lapack=lapack --with lapackdir=/usr/lib64 \
             --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
             --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
             --with prefix=/opt/scalapack/1.8.0/1.${disttag}.gfortran.openmpi.blas.3.0.37.el5.lapack scalapack.spec
campos-gpaw:
    rpmbuild --bb --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran.openmpi/lib64 \
             --with blas=acml --with blas_version=4.0.1 --with blasdir=/opt/acml/4.0.1/gfortran64/lib \
             --with cmr_repository=/home/niflheim/repository/db \
             --with compiler=gfortran --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
             --with default_version=1 --with modules=1 \
             --with lapack=acml --with lapackdir=/opt/acml/4.0.1/gfortran64/lib \
             --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran/bin \
             --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran/lib64 \
             --with openmpi=openmpi --with parallel=1 \
             --with scalapack=scalapack --with scalapackdir=/opt/scalapack/1.8.0/1.${disttag}.gfortran.openmpi.acml.4.0.1.acml/lib64 \
             --with prefix=/opt/campos-gpaw/0.6.3862/1.${disttag}.gfortran.python2.4.openmpi.acml.4.0.1.acml campos-gpaw.spec
    rpmbuild --bb --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.gfortran43.openmpi/lib64 \
             --with blas=acml --with blas_version=4.2.0 --with blasdir=/opt/acml/4.2.0/gfortran4364/lib \
             --with cmr_repository=/home/niflheim/repository/db \
             --with compiler=gfortran43 --with compiler_bindir=/usr/bin --with compiler_libdir=/usr/lib64 \
             --with default_version=1 --with modules=1 \
             --with lapack=acml --with lapackdir=/opt/acml/4.2.0/gfortran4364/lib \
             --with ompi_bindir=/opt/openmpi/1.3.2-1.gfortran43/bin \
             --with ompi_includedir=/opt/openmpi/1.3.2-1.gfortran43/include --with ompi_libdir=/opt/openmpi/1.3.2-1.gfortran43/lib64 \
             --with openmpi=openmpi --with parallel=1 \
             --with scalapack=scalapack --with scalapackdir=/opt/scalapack/1.8.0/1.${disttag}.gfortran43.openmpi.acml.4.2.0.acml/lib64 \
             --with prefix=/opt/campos-gpaw/0.6.3862/1.${disttag}.gfortran43.python2.4.openmpi.acml.4.2.0.acml campos-gpaw.spec
Testing packages
Test dacapo installation (as normal user!).
If you use modules:
    module load openmpi
    module load campos-dacapo-pseudopotentials
    module load python-numeric
    module load campos-dacapo-python
    module load ScientificPython
    module load gnuplot-py
    module load RasMol
    module load campos-ase2
    module load campos-dacapo
    ulimit -s 65000  # dacapo needs a large stack
Test with (make sure that /scratch/$USER exists):
    cp -r `rpm -ql campos-dacapo-python | grep "share/campos-dacapo-python$"` /tmp
    cd /tmp/campos-dacapo-python/Tests
    python test.py 2>&1 | tee test.log
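The scratch directory mentioned above has to exist before the run. A minimal sketch of the check (a /tmp path is used here as a stand-in for the real /scratch):

```shell
# Sketch: make sure the per-user scratch directory exists before running tests
scratch="/tmp/demo-scratch/${USER:-nobody}"   # stand-in for /scratch/$USER
mkdir -p "$scratch"
status=missing
[ -d "$scratch" ] && status=ready
echo "$status"
```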
It can take up to 1 day. Please consider disabling these "long" tests in test.py:
    tests.remove('../Examples/Wannier-ethylene.py')
    tests.remove('../Examples/Wannier-Pt4.py')
    tests.remove('../Examples/Wannier-Ptwire.py')
    tests.remove('../Examples/Wannier-Fe-bcc.py')
    tests.remove('../Examples/transport_1dmodel.py')
Note that all vtk-related tests will fail.
Test gpaw installation (as normal user!):
If you use modules:
    module load openmpi
    module load campos-ase3
    module load campos-gpaw-setups
    module load campos-gpaw
Test with:
    cp -r `rpm -ql campos-gpaw | grep "share/campos-gpaw/test$"` /tmp/test.gpaw.$$
    cd /tmp/test.gpaw.*
    python test.py 2>&1 | tee test.log
It takes about 20 minutes.
On "Golden Client"
Login, as root, to the "Golden Client":
ssh n001
Enable nfs mount of the server home directory - follow 'Enable nfs mount on the "Golden Client"' from configuring NFS. After this do:
    cd /home/dulak-server/rpm/campos
    rpm -ivh campos-dacapo-2*
If getting:
package example_package.el5.i386 is already installed
remove these packages with:
rpm -e --nodeps example_package
to allow the installation to proceed.
Make sure that both python-numeric versions are installed:
rpm -q python-numeric
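Both Numeric versions (the default 23.x one and the custom 24.2 build) should show up in the query. A sketch of the expected shape of the output, using hypothetical package strings modeled on the file names used earlier on this page:

```shell
# Sketch: rpm -q python-numeric should report two installed versions
# (hypothetical output lines; the real strings depend on the installed RPMS)
rpm_q_output="python-numeric-23.7-2.2.el5
python-numeric-24.2-4.el5.fys.gfortran.python2.4.acml.4.0.1.acml"
nversions=$(echo "$rpm_q_output" | wc -l)
echo "$nversions"
```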
The rpm -ivh command above will list any packages that still need to be installed to fulfill dacapo's dependencies. All these packages should already be under /home/dulak-server/rpm. Remember to test the dacapo and gpaw installations on the "Golden Client" too.
If you are installing a workstation only, your setup is ready for testing - go to benchmarking and maintenance.
If you are building a cluster, go back to installing and configuring systemimager.