.. _32H2O.xyz: attachment:32H2O.xyz

This page describes the necessary steps for installing the login nodes **fjorm** or **thul**.

**Warning**: el5 often installs 32-bit versions of packages on 64-bit machines.
Please verify that the 64-bit versions are installed; otherwise run the corresponding `yum install package` again.

Install external packages
=========================

As root:

- create yum repository definitions (do **not** enable them)::

   # atrpms
   echo '[atrpms]' > /etc/yum.repos.d/atrpms.repo
   echo 'name=CentOS $releasever - $basearch - ATrpms' >> /etc/yum.repos.d/atrpms.repo
   echo 'baseurl=http://dl.atrpms.net/el$releasever-$basearch/atrpms/stable' >> /etc/yum.repos.d/atrpms.repo
   echo '#baseurl=http://mirrors.ircam.fr/pub/atrpms/el$releasever-$basearch/atrpms/stable' >> /etc/yum.repos.d/atrpms.repo
   echo 'gpgkey=http://ATrpms.net/RPM-GPG-KEY.atrpms' >> /etc/yum.repos.d/atrpms.repo
   echo 'gpgcheck=1' >> /etc/yum.repos.d/atrpms.repo
   echo 'enabled=0' >> /etc/yum.repos.d/atrpms.repo

   # epel
   echo '[epel]' > /etc/yum.repos.d/epel.repo
   echo 'name=CentOS $releasever - $basearch - EPEL' >> /etc/yum.repos.d/epel.repo
   echo 'baseurl=http://download.fedora.redhat.com/pub/epel/$releasever/$basearch' >> /etc/yum.repos.d/epel.repo
   echo 'gpgkey=http://download.fedora.redhat.com/pub/epel/RPM-GPG-KEY-EPEL' >> /etc/yum.repos.d/epel.repo
   echo 'gpgcheck=1' >> /etc/yum.repos.d/epel.repo
   echo 'enabled=0' >> /etc/yum.repos.d/epel.repo

- install, as root::

   yum install yum-utils wget  # /var directories must be created
   yum search --enablerepo=atrpms arpack-devel
   yum search --enablerepo=epel jmol

- configure rpmbuild (as rpmbuild):

  - enable modules by adding to `~/.bashrc`::

     if [ -r "/home/camp/modulefiles.sh" ]; then
         source /home/camp/modulefiles.sh
     fi

  - set the `FYS_PLATFORM` variable::

     export FYS_PLATFORM=Intel-Nehalem-el5  # thul
     export FYS_PLATFORM=AMD-Opteron-el5    # fjorm

    **Note** that this variable will be set automatically by `/home/camp/modulefiles.sh` once the `environment-modules` package is installed.
- use the following `~rpmbuild/.rpmmacros`::

   %disttag el5.fys
   %packager rpmbuild@fysik.dtu.dk
   %distribution Fysik RPMS
   %vendor Fysik RPMS
   %_signature gpg
   %_gpg_path ~/.gnupg
   %_gpg_name Fysik RPMS
   %_topdir /home/camp/rpmbuild/%(echo $FYS_PLATFORM)
   %_rpmdir %{_topdir}/RPMS
   %_srcrpmdir %{_topdir}/SRPMS
   %_svndir /home/camp/rpmbuild/rpmbuild
   %_specdir %{_svndir}/SPECS
   %_sourcedir %{_svndir}/SOURCES
   %_rpmfilename %%{NAME}-%%{VERSION}-%%{RELEASE}.%%{ARCH}.rpm
   %_builddir /scratch/rpmbuild
   %_tmppath %{_topdir}/BUILD
   # no debuginfo
   %debug_package %{nil}
   # don't strip (this does not fully work)
   # https://bugzilla.redhat.com/show_bug.cgi?id=219731
   %define __strip /bin/true
   %niflheim 1

- as rpmbuild, create directories::

   mkdir -p ~/${FYS_PLATFORM}/RPMS
   mkdir -p ~/${FYS_PLATFORM}/SRPMS
   mkdir -p ~/${FYS_PLATFORM}/BUILD
   mkdir -p ~/${FYS_PLATFORM}/SPECS    # needed only by openmpi
   mkdir -p ~/${FYS_PLATFORM}/SOURCES  # needed only by openmpi

- install official packages, as rpmbuild::

   cd ~/${FYS_PLATFORM}/RPMS
   mkdir external; cd external
   yumdownloader --resolve gcc-gfortran gcc43-c++ gcc43-gfortran blas-devel lapack-devel python-devel
   yumdownloader --resolve gnuplot libXi-devel xorg-x11-fonts-100dpi pexpect tetex-latex tkinter qt-devel
   yumdownloader --resolve openmpi openmpi-devel openmpi-libs compat-dapl libibverbs librdmacm openib
   yumdownloader --resolve pygtk2-devel gtk2-devel tk-devel agg ghostscript libtiff-devel
   yumdownloader --resolve libX11-devel libXext-devel openmotif openmotif-devel gd-devel libXpm-devel
   yum localinstall *  # as root

- install `atrpms` packages, as rpmbuild (``vtk-python`` is currently unavailable, 16 Apr 2009)::

   cd ~/${FYS_PLATFORM}/RPMS/external
   yumdownloader --resolve --enablerepo=atrpms vtk-python arpack-devel graphviz
   wget http://ATrpms.net/RPM-GPG-KEY.atrpms
   rpm --import RPM-GPG-KEY.atrpms  # as root
   yum localinstall *  # as root

- install the packages from `epel`, as rpmbuild::

   cd ~/${FYS_PLATFORM}/RPMS/external
   yumdownloader --resolve --enablerepo=epel jmol gsl-devel
   yumdownloader --resolve --enablerepo=epel environment-modules suitesparse-devel
   wget http://download.fedora.redhat.com/pub/epel/RPM-GPG-KEY-EPEL
   rpm --import RPM-GPG-KEY-EPEL  # as root
   yum localinstall *  # as root

- remove the default openmpi::

   yum remove openmpi openmpi-libs

- edit ``/etc/yum.conf`` so it contains::

   exclude=netcdf-* netcdf3-* fftw-* fftw2-* fftw3-* python-numeric openmpi-*

- logout and login (as rpmbuild) again to activate the modules settings from `~/.bashrc`.

It's time to build custom RPMS
==============================

As rpmbuild::

   cd ~/rpmbuild/SPECS

Install the Open64 Compiler Suite RPM, and deploy the module file under
`~rpmbuild/modulefiles/${FYS_PLATFORM}/open64`. Install using the **--relocate** option::

   rpm -ivh --relocate "/opt/open64"="/opt/open64/4.2.1" open64-4.2.1-0.x86_64.rpm

Note that the module file is available as an svn checkout at https://svn.fysik.dtu.dk/projects/modulefiles/trunk.
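To verify the relocated installation and the deployed module file, something like the following can be used (a sketch; the ``module use`` path is an assumption and must point at wherever the modulefiles checkout actually lives)::

   rpm -q open64                                      # confirm the relocated package is installed
   module use ~rpmbuild/modulefiles/${FYS_PLATFORM}   # assumed location of the modulefiles checkout
   module avail open64
   module load open64
   which opencc openf90                               # the Open64 compilers should now be on PATH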
**Note**: the module file should contain at least::

   prepend-path PATH /opt/open64/4.2.1/lib/gcc-lib/x86_64-open64-linux/4.2.1
   prepend-path PATH /opt/open64/4.2.1/bin
   prepend-path LD_LIBRARY_PATH /opt/open64/4.2.1/lib/gcc-lib/x86_64-open64-linux/4.2.1

Install the icc/ifort compilers; install only the following packages from `l_cproc_p_11.1.046_intel64`::

   intel-cproc046-11.1-1.x86_64.rpm
   intel-cproidb046-11.1-1.x86_64.rpm
   intel-cprolib046-11.1-1.x86_64.rpm
   intel-cprolibdev046-11.1-1.x86_64.rpm
   intel-cprotbblib046-11.1-1.noarch.rpm
   intel-cprotbblibdev046-11.1-1.noarch.rpm
   intel-cprocsdk046-11.1-1.noarch.rpm
   intel-cproidbsdk046-11.1-1.noarch.rpm

and from `l_cprof_p_11.1.046_intel64`::

   intel-cprof046-11.1-1.x86_64.rpm
   intel-cproflib046-11.1-1.x86_64.rpm
   intel-cprofsdk046-11.1-1.noarch.rpm

Then enable them (only on **thul**)::

   . /opt/intel/Compiler/11.1/046/bin/intel64/ifortvars_intel64.sh
   . /opt/intel/Compiler/11.1/046/bin/intel64/iccvars_intel64.sh

Build the intel compatibility packages (only on **thul**)::

   rpmbuild -bb --with modules intel-redist.spec

**Note**: do not install the resulting RPMS on the login node. They need to be installed **only** on compute nodes.
On the login node only the module file needs to be deployed under `~rpmbuild/modulefiles`, as rpmbuild.
Note that this directory is available as an svn checkout at https://svn.fysik.dtu.dk/projects/modulefiles/trunk.
Here is what needs to be done for a new version of the package::

   mkdir ~rpmbuild/modulefiles/${FYS_PLATFORM}/ifort
   cp /scratch/rpmbuild/intel-11.1.046/11.1.046-1.intel64.el5.fys ~rpmbuild/modulefiles/${FYS_PLATFORM}/ifort
   cd ~rpmbuild/modulefiles/${FYS_PLATFORM}/ifort
   ln -s 11.1.046-1.intel64.el5.fys 11.1-1

**Note**: the above module file should contain at least::

   prepend-path LD_LIBRARY_PATH /opt/intel/Compiler/11.1/046/lib/intel64
   prepend-path PATH /opt/intel/Compiler/11.1/046/bin/intel64
   prepend-path MANPATH /opt/intel/Compiler/11.1/046/man

Install mkl, and build the mkl compatibility package (only on **thul**)::

   cd ~/rpmbuild/SOURCES
   export mkl=10.1.3.027
   mkdir ${mkl}
   cp -rp /opt/intel/mkl/${mkl}/lib ${mkl}
   cp -rp /opt/intel/mkl/${mkl}/doc ${mkl}
   tar zcf intel-redist-mkl-${mkl}.tar.gz ${mkl}
   cd ~/rpmbuild/SPECS
   rpmbuild -bb --with version1=1 --with version2=3 --with version3=027 --with modules intel-redist-mkl.spec

**Note**: do not install the resulting RPM on the login node. It needs to be installed **only** on compute nodes.
On the login node only the module file needs to be deployed under `~rpmbuild/modulefiles`, as rpmbuild.
Note that this directory is available as an svn checkout at https://svn.fysik.dtu.dk/projects/modulefiles/trunk.
Here is what needs to be done for a new version of the package::

   mkdir ~rpmbuild/modulefiles/${FYS_PLATFORM}/mkl
   cp /tmp/intel-mkl-${mkl}/${mkl}-1.el5.fys.em64t ~rpmbuild/modulefiles/${FYS_PLATFORM}/mkl
   cd ~rpmbuild/modulefiles/${FYS_PLATFORM}/mkl
   ln -s ${mkl}-1.el5.fys.em64t 10.1p-027

**Note**: the above module file should contain at least::

   prepend-path LD_LIBRARY_PATH /opt/intel/mkl/10.1.3.027/lib/em64t

**Moreover**: the `intel-*intel64*` RPMS need to be installed with *--nodeps*, so move them into a special directory::

   mkdir -p ~/${FYS_PLATFORM}/RPMS/nodeps
   mv ~/${FYS_PLATFORM}/RPMS/intel-*intel64* ~/${FYS_PLATFORM}/RPMS/nodeps

Build mkl/fftw (as root)::

   export mkl=10.1.3.027
   . /opt/intel/Compiler/11.0/083/bin/intel64/ifortvars_intel64.sh
   cd /opt/intel/mkl/${mkl}/interfaces/fftw2xf
   make libem64t
   cd /opt/intel/mkl/${mkl}/lib/em64t
   ln -s libfftw2xf_intel.a libfftw.a

Logout and login as **rpmbuild**.
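For reference, the module files referred to above are plain Tcl files read by `environment-modules`; a minimal sketch of the ifort module (the help text is illustrative and not taken from the svn repository) could look like::

   #%Module1.0
   ## ifort 11.1.046 - minimal sketch, not the file from the svn repository
   proc ModulesHelp { } {
       puts stderr "Adds the Intel 11.1.046 compilers to PATH, LD_LIBRARY_PATH and MANPATH"
   }
   prepend-path PATH            /opt/intel/Compiler/11.1/046/bin/intel64
   prepend-path LD_LIBRARY_PATH /opt/intel/Compiler/11.1/046/lib/intel64
   prepend-path MANPATH         /opt/intel/Compiler/11.1/046/man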
Build a custom openmpi, using torque support:: export rpmtopdir=${HOME}/${FYS_PLATFORM} # set this to _topdir value from ~/.rpmmacros wget http://www.open-mpi.org/software/ompi/v1.3/downloads/openmpi-1.3.3.tar.bz2 \ -O ~/rpmbuild/SOURCES/openmpi-1.3.3.tar.bz2 sh ./buildrpm-1.3.3-1.gfortran.sh ../SOURCES/openmpi-1.3.3.tar.bz2 2>&1 | tee buildrpm-1.3.3-1.gfortran.sh.log.${FYS_PLATFORM} sh ./buildrpm-1.3.3-1.gfortran43.sh ../SOURCES/openmpi-1.3.3.tar.bz2 2>&1 | tee buildrpm-1.3.3-1.gfortran43.sh.log.${FYS_PLATFORM} sh ./buildrpm-1.3.3-1.pathscale.sh ../SOURCES/openmpi-1.3.3.tar.bz2 2>&1 | tee buildrpm-1.3.3-1.pathscale.sh.log.${FYS_PLATFORM} sh ./buildrpm-1.3.3-1.ifort.sh ../SOURCES/openmpi-1.3.3.tar.bz2 2>&1 | tee buildrpm-1.3.3-1.ifort.sh.log.${FYS_PLATFORM} # thul only module load open64 # 24 June 2009: build fails on Nehalem with *configure: error: TM support requested but not found. Aborting* sh ./buildrpm-1.3.3-1.open64.sh ../SOURCES/openmpi-1.3.3.tar.bz2 2>&1 | tee buildrpm-1.3.3-1.open64.sh.log.${FYS_PLATFORM} **Note**: intel openmpi needs to be installed ignoring dependencies:: rpm -ivh --nodeps --oldpackage ~rpmbuild/${FYS_PLATFORM}/RPMS/openmpi-1.3.3-1.el5.fys.ifort.11.1.x86_64.rpm If scripts that contain **ALL** build/install/uninstall commands (``global_install.sh`` and ``global_uninstall.sh``) need to be created, every time after an RPM is successfully built, do:: grep -v "#\!" install.sh >> ~/${FYS_PLATFORM}/global_install.sh cat uninstall.sh ~/${FYS_PLATFORM}/global_uninstall.sh | grep -v "#\!" >> ~/${FYS_PLATFORM}/global_uninstall.sh.tmp && mv -f ~/${FYS_PLATFORM}/global_uninstall.sh.tmp ~/${FYS_PLATFORM}/global_uninstall.sh # ignore "cat: .../global_uninstall.sh: No such ..." error when running first time **Note** that `global_uninstall.sh` won't remove built RPM files, just will uninstall the packages. Build the following for dacapo: ------------------------------- - set the disttag variable for convenience:: export disttag="el5.fys" modules=1 - acml `AMD Core Math Library `_ contains optimized BLAS and LAPACK libraries for AMD opteron processors (work also on intel), and is available for download after registration. Build the following RPMS:: thread_mode=nomp version1=0 version2=1 compiler=gfortran version1=0 version2=1 compiler=pathscale version1=1 version2=0 compiler=gfortran43 # fjorm only version1=1 version2=0 compiler=pathscale version1=2 version2=0 compiler=gfortran43 version1=2 version2=0 compiler=pathscale version1=2 version2=0 compiler=ifort # thul only version1=3 version2=0 compiler=gfortran43 version1=3 version2=0 compiler=open64 # fjorm only version1=3 version2=0 compiler=ifort # thul only rpmbuild -bb --with compiler=${compiler} --with version1=${version1} --with version2=${version2} --with thread_mode=${thread_mode} \ --with modules=${modules} --with default_version acml.spec **Note** problems with `dgemm `_ acml 4.1.0 and 4.2.0 have been reported. 
Moreover, campos-dacapo_ built with **acml-4-1-0-pathscale-64bit.tgz** fails with "ZINSC 2 returned info= 2" for the following job::

   #!/usr/bin/env python
   from Dacapo import Calculator
   from ASE.IO.xyz import ReadXYZ
   from ASE.Dynamics.Langevin import Langevin

   prefix = 'D'
   atoms = ReadXYZ('32H2O.xyz')
   L = 9.8553729
   atoms.SetUnitCell([L, L, L], fix=True)
   atoms.SetBoundaryConditions(True)
   r = 1
   atoms = atoms.Repeat([r, r, r])
   n = 48 * r
   calc = Calculator(nbands=128 * r**3, planewavecutoff=350, densitycutoff=500, xc='PBE')
   atoms.SetCalculator(calc)
   from time import time
   atoms.GetPotentialEnergy()
   pos = atoms[0].GetCartesianPosition()
   atoms[0].SetCartesianPosition(pos + 0.005)
   t0 = time()
   atoms.GetPotentialEnergy()
   print time() - t0

Download 32H2O.xyz_

- goto (only on **fjorm**). GotoBLAS contains optimized BLAS libraries, and is available to the academic community after registration.
  Download **GotoBLAS-1.26.tar.gz** to ~/rpmbuild/SOURCES.
  The default build process searches your $PATH for available compilers; the priority order is PathScale, PGI, Intel, gfortran, g95 and g77, so the compiler must be set explicitly.
  Build the RPMS (on the **target** compute node!) using the following command::

   blas_version=1.26

   compiler=gfortran
   compiler_version=4.1.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin

   compiler=gfortran43
   compiler_version=4.3.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin

   compiler=pathscale
   compiler_version=3.2 compiler_libdir=/opt/pathscale/lib/3.2 compiler_bindir=/opt/pathscale/bin

   compiler=open64  # only on fjorm
   compiler_version=4.2.1 compiler_libdir=/opt/open64/4.2.1/lib/gcc-lib/x86_64-open64-linux/4.2.1 compiler_bindir=/opt/open64/4.2.1/bin

   rpmbuild --bb --with compiler=${compiler} --with compiler_version=${compiler_version} \
            --with compiler_libdir=${compiler_libdir} --with compiler_bindir=${compiler_bindir} \
            --with modules=${modules} --with default_version=1 \
            --with prefix=/opt/goto/${blas_version}/1.${disttag}.${compiler}.${compiler_version}.smp goto.spec

  **Note**: version **1.26** fails on Nehalem with::

   ../../../param.h:1195:21: error: division by zero in #if

- atlas

  Build the following RPMS::

   cd ~/rpmbuild/SOURCES && wget http://downloads.sourceforge.net/project/math-atlas/Stable/3.8.3/atlas3.8.3.tar.bz2
   cd ~/rpmbuild/SPECS

   blas_version=3.8.3

   compiler=gfortran
   compiler_version=4.1.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin

   compiler=gfortran43
   compiler_version=4.3.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin

   rpmbuild --bb --with compiler=${compiler} --with compiler_version=${compiler_version} \
            --with compiler_libdir=${compiler_libdir} --with compiler_bindir=${compiler_bindir} \
            --with modules=${modules} --with default_version=1 \
            --with prefix=/opt/atlas/${blas_version}/1.${disttag}.${compiler}.${compiler_version} atlas.spec

  **Note**: version **3.8.3** fails on Opteron with open64 with::

   /scratch/rpmbuild/ATLAS/RpmObjs/..//CONFIG/src/backend/comptestC.c:5: undefined reference to `__pathscale_malloc_alg'

- campos-dacapo-pseudopotentials ..
_campos-dacapo-pseudopotentials-1.tar.gz: attachment:campos-dacapo-pseudopotentials-1.tar.gz campos-dacapo-pseudopotentials-1.tar.gz_ Build the following RPMS:: cd ~/rpmbuild/SOURCES&& wget --no-check-certificate \ "https://wiki.fysik.dtu.dk/dacapo/Installation?action=AttachFile&do=get&target=campos-dacapo-pseudopotentials-1.tar.gz" \ -O campos-dacapo-pseudopotentials-1.tar.gz cd ~/rpmbuild/SPECS rpmbuild -bb --with modules --with default_version campos-dacapo-pseudopotentials.spec - rasmol **Note** that rasmol is built for 32bit architecture, so make sure that your system fulfills "BuildRequires" for both 64 and 32bit:: rpmbuild --nobuild RasMol.spec Build the following RPMS:: cd ~/rpmbuild/SOURCES&& wget http://downloads.sourceforge.net/openrasmol/RasMol_2.7.3.tar.gz cd ~/rpmbuild/SPECS rpmbuild -bb --with modules --with default_version --with prefix=/opt/RasMol/2.7.3/3.${disttag} RasMol.spec - cblas Build the following RPMS:: cd ~/rpmbuild/SOURCES&& wget http://www.netlib.org/blas/blast-forum/cblas.tgz cd ~/rpmbuild/SPECS compiler=gfortran compiler_version=4.1.2 blas=blas blas_version=3.0.37.el5 blasdir=/usr/lib64 compiler=gfortran compiler_version=4.1.2 blas=acml blas_version=4.0.1 blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib rpmbuild --bb --with compiler=${compiler} --with compiler_version=${compiler_version} \ --with blas=${blas} --with blas_version=${blas_version} --with blasdir=${blasdir} \ --with modules=${modules} --with default_version=1 \ --with prefix=/opt/cblas/2.23.3/1.${disttag}.${compiler}.${compiler_version}.${blas}.${blas_version} cblas.spec - python-setuptools Build the following RPMS:: cd ~/rpmbuild/SOURCES&& wget http://pypi.python.org/packages/source/s/setuptools/setuptools-0.6c9.tar.gz cd ~/rpmbuild/SPECS rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-setuptools/0.6c9/1.${disttag}.python2.4 python-setuptools.spec - python-nose Build the following RPMS:: cd ~/rpmbuild/SOURCES&& wget http://somethingaboutorange.com/mrl/projects/nose/nose-0.10.4.tar.gz cd ~/rpmbuild/SPECS rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-nose/0.10.4/1.${disttag}.python2.4 python-nose.spec module load python-setuptools module load python-nose - numpy **Note**: Standard (unoptimized) numpy for EL4 can be found at `Scientific Linux `_ (example given for i386):: wget ftp://ftp.scientificlinux.org/linux/scientific/4x/i386/SL/RPMS/numpy-1.0.4-1.i386.rpm wget --no-check-certificate https://www.scientificlinux.org/documentation/gpg/RPM-GPG-KEY-dawson rpm --import RPM-GPG-KEY-dawson yum localinstall numpy-1.0.4-1.i386.rpm and for EL5 at ``_. 
Build the following RPMS:: cd ~/rpmbuild/SOURCES&& wget http://downloads.sourceforge.net/project/numpy/NumPy/1.3.0/numpy-1.3.0.tar.gz cd ~/rpmbuild/SPECS compiler=gfortran # Aug 6 2009: causes problems to gpaw cg2.py test compiler_version=4.1.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin blas=blas blas_version=3.0.37.el5 blasdir=/usr/lib64 lapack=lapack lapack_version=3.0.37.el5 lapackdir=/usr/lib64 cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.${compiler}.${compiler_version}.${blas}.${blas_version} release=1.${disttag}.${compiler}.${compiler_version}.python2.4.${blas}.${blas_version}.${lapack}.${lapack_version} compiler=gfortran compiler_version=4.1.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin blas=acml blas_version=4.0.1 blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib lapack=acml lapack_version=4.0.1 lapackdir=/opt/${blas}/${blas_version}/${compiler}64/lib cblas_prefix=none # dotblas fails with acml release=1.${disttag}.${compiler}.${compiler_version}.python2.4.${blas}.${blas_version}.${lapack}.${lapack_version} compiler=gfortran # only on thul compiler_version=4.1.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin blas=mkl blas_version=10.1.3.027 blasdir=/opt/intel/mkl/10.1.3.027/lib/em64t lapack=mkl_lapack lapack_version=10.1.3.027 lapackdir=/opt/intel/mkl/10.1.3.027/lib/em64t cblas_prefix=/opt/intel/mkl/10.1.3.027 release=1.${disttag}.${compiler}.${compiler_version}.python2.4.${blas}.${blas_version}.${lapack}.${lapack_version} rpmbuild --bb --with cblas_prefix=${cblas_prefix} \ --with blas=${blas} --with blas_version=${blas_version} --with blasdir=${blasdir} \ --with compiler=${compiler} --with compiler_version=${compiler_version} \ --with compiler_bindir=${compiler_bindir} --with compiler_libdir=${compiler_libdir} \ --with lapack=${lapack} --with lapack_version=${lapack_version} --with lapackdir=${lapackdir} \ --with modules=${modules} --with default_version=1 \ --with prefix=/opt/numpy/1.3.0/${release} numpy.spec Test with:: module load cblas-blas-gfortran64/2.23.3-1.el5.fys.gfortran.4.1.2.blas.3.0.37.el5 module load numpy/1.3.0-1.el5.fys.gfortran.4.1.2.python2.4.blas.3.0.37.el5.lapack.3.0.37.el5 python -c "import numpy; from numpy.core.multiarray import dot; b = numpy.ones(13, numpy.complex); dot(b, b)" python -c "import numpy; numpy.test()" module unload numpy/1.3.0-1.el5.fys.gfortran.4.1.2.python2.4.blas.3.0.37.el5.lapack.3.0.37.el5 module load acml-gfortran64/4.0.1-1.el5.fys module load numpy/1.3.0-1.el5.fys.gfortran.4.1.2.python2.4.acml.4.0.1.acml.4.0.1 python -c "import numpy; from numpy.core.multiarray import dot; b = numpy.ones(13, numpy.complex); dot(b, b)" python -c "import numpy; numpy.test()" module unload numpy/1.3.0-1.el5.fys.gfortran.4.1.2.python2.4.acml.4.0.1.acml.4.0.1 module unload acml-gfortran64/4.0.1-1.el5.fys Load the default numpy:: module load numpy/1.3.0-1.el5.fys.gfortran.4.1.2.python2.4.acml.4.0.1.acml.4.0.1 - gnuplot-py Build the following RPMS:: cd ~/rpmbuild/SOURCES&& wget http://downloads.sourceforge.net/gnuplot-py/gnuplot-py-1.8.tar.gz cd ~/rpmbuild/SPECS rpmbuild -bb --with modules --with default_version --with prefix=/opt/gnuplot-py/1.8.1/1.${disttag}.python2.4 gnuplot-py.spec module load gnuplot-py - python-numeric We must install **24.2** version, and we keep the default version. 
Build the following RPMS:: cd ~/rpmbuild/SOURCES&& wget http://downloads.sourceforge.net/numpy/Numeric-24.2.tar.gz cd ~/rpmbuild/SPECS cd ~/${FYS_PLATFORM}/RPMS/external rpm -e --nodeps python-numeric # as root yumdownloader --resolve --disableexcludes=main python-numeric cd ~/rpmbuild/SPECS compiler=gfortran compiler_version=4.1.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin blas=blas blas_version=3.0.37.el5 blasdir=/usr/lib64 lapack=lapack lapack_version=3.0.37.el5 lapackdir=/usr/lib64 cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.${compiler}.${compiler_version}.${blas}.${blas_version} release=1.${disttag}.${compiler}.${compiler_version}.python2.4.${blas}.${blas_version}.${lapack}.${lapack_version} compiler=gfortran compiler_version=4.1.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin blas=acml blas_version=4.0.1 blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib lapack=acml lapack_version=4.0.1 lapackdir=/opt/${blas}/${blas_version}/${compiler}64/lib cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.${compiler}.${compiler_version}.${blas}.${blas_version} release=1.${disttag}.${compiler}.${compiler_version}.python2.4.${blas}.${blas_version}.${lapack}.${lapack_version} rpmbuild --bb --with cblas_prefix=${cblas_prefix} \ --with blas=${blas} --with blas_version=${blas_version} --with blasdir=${blasdir} \ --with compiler=${compiler} --with compiler_version=${compiler_version} \ --with compiler_bindir=${compiler_bindir} --with compiler_libdir=${compiler_libdir} \ --with lapack=${lapack} --with lapack_version=${lapack_version} --with lapackdir=${lapackdir} \ --with modules=${modules} --with default_version=1 \ --with prefix=/opt/python-numeric/24.2/${release} python-numeric.spec **Note**: (16 Apr 2009) currently Numeric's `test.py` results in (we ignore this error):: glibc detected *** python: free(): invalid next size (normal): 0x09aee970 *** After installing python-numeric make a very rough check:: module load cblas-blas-gfortran64/2.23.3-1.el5.fys.gfortran.4.1.2.blas.3.0.37.el5 module load python-numeric/24.2-4.el5.fys.gfortran.4.1.2.python2.4.blas.3.0.37.el5.lapack.3.0.37.el5 python -c "import lapack_lite" ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.4.1.2.python2.4.blas.3.0.37.el5.lapack.3.0.37.el5 | grep lapack_lite.so` ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.4.1.2.python2.4.blas.3.0.37.el5.lapack.3.0.37.el5 | grep _dotblas.so` module unload python-numeric/24.2-4.el5.fys.gfortran.4.1.2.python2.4.blas.3.0.37.el5.lapack.3.0.37.el5 module load acml-gfortran64/4.0.1-1.el5.fys module load cblas-acml-gfortran64/2.23.3-1.el5.fys.gfortran.4.1.2.acml.4.0.1 module load python-numeric/24.2-4.el5.fys.gfortran.4.1.2.python2.4.acml.4.0.1.acml.4.0.1 python -c "import lapack_lite" ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.4.1.2.python2.4.acml.4.0.1.acml.4.0.1 | grep lapack_lite.so` ldd `rpm -ql python-numeric-24.2-4.el5.fys.gfortran.4.1.2.python2.4.acml.4.0.1.acml.4.0.1 | grep _dotblas.so` module unload python-numeric/24.2-4.el5.fys.gfortran.4.1.2.python2.4.acml.4.0.1.acml.4.0.1 module unload acml-gfortran64/4.0.1-1.el5.fys and reinstall the default version:: rpm -ivh --oldpackage ~rpmbuild/${FYS_PLATFORM}/RPMS/external/python-numeric-23.*.rpm load the default `Numeric` version:: module load acml-gfortran64/4.0.1-1.el5.fys module load cblas-acml-gfortran64/2.23.3-1.el5.fys.gfortran.4.1.2.acml.4.0.1 module load python-numeric/24.2-4.el5.fys.gfortran.4.1.2.python2.4.acml.4.0.1.acml.4.0.1 - netcdf `NetCDF `_ (*network Common Data Form*) is an interface for 
array-oriented data access and a library that provides an implementation of the interface. The *NetCDF* library also defines a machine-independent format for representing scientific data. We prefer version `3.6.1 `_ or higher (note that version `3.6.2` has fortran interface in libnetcdff.a). Build the following RPMS:: cd ~/rpmbuild/SOURCES&& wget http://www.unidata.ucar.edu/downloads/netcdf/ftp/netcdf-4.0.1.tar.gz cd ~/rpmbuild/SPECS compiler=gfortran compiler_version=4.1.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin compiler=gfortran43 compiler_version=4.3.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin compiler=pathscale # only on fjorm compiler_version=3.2 compiler_libdir=/opt/pathscale/lib/3.2 compiler_bindir=/opt/pathscale/bin compiler=ifort # only on thul compiler_version=11.0 compiler_libdir=/opt/intel/Compiler/11.0/083/lib/intel64 compiler_bindir=/opt/intel/Compiler/11.0/083/bin/intel64 compiler=open64 # only on fjorm compiler_version=4.2.1 compiler_libdir=/opt/open64/4.2.1/lib/gcc-lib/x86_64-open64-linux/4.2.1 compiler_bindir=/opt/open64/4.2.1/bin rpmbuild --bb --with compiler=${compiler} --with compiler_version=${compiler_version} \ --with compiler_bindir=${compiler_bindir} \ --with compiler_libdir=${compiler_libdir} \ --with modules=${modules} --with default_version=1 \ --with prefix=/opt/netcdf4/4.0.1/1.${disttag}.${compiler}.${compiler_version} netcdf4.spec - ScientificPython Build the following RPMS:: cd ~/rpmbuild/SOURCES&& wget https://sourcesup.cru.fr/frs/download.php/2234/ScientificPython-2.6.2.tar.gz cd ~/rpmbuild/SOURCES&& wget https://sourcesup.cru.fr/frs/download.php/2234/ScientificPython-2.8.tar.gz cd ~/rpmbuild/SPECS compiler=gfortran compiler_version=4.1.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin netcdf=netcdf4 netcdf_includedir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/include netcdf_libdir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/lib64 openmpi_version=1.3.3 ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64 ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin rpmbuild --bb --with Numeric_includedir=none --with numpy=numpy \ --with compiler=${compiler} --with compiler_version=${compiler_version} \ --with netcdf=${netcdf} --with netcdf_includedir=${netcdf_includedir} --with netcdf_libdir=${netcdf_libdir} \ --with modules=${modules} --with default_version=1 \ --with prefix=/opt/ScientificPython/2.8/1.${disttag}.${compiler}.${compiler_version}.python2.4.serial_version.only.numpy ScientificPython.spec rpmbuild --bb --with numpy=numeric \ --with compiler=${compiler} --with compiler_version=${compiler_version} \ --with netcdf=${netcdf} --with netcdf_includedir=${netcdf_includedir} --with netcdf_libdir=${netcdf_libdir} \ --with modules=${modules} --with default_version=1 \ --with parallel=1 --with openmpi=openmpi --with openmpi_version=${openmpi_version} \ --with ompi_libdir=${ompi_libdir} --with ompi_includedir=${ompi_includedir} --with ompi_bindir=${ompi_bindir} \ --with prefix=/opt/ScientificPython/2.6.2/1.${disttag}.${compiler}.${compiler_version}.python2.4.openmpi.${openmpi_version}.numeric ScientificPython.spec - python-docutils Build the following RPMS:: cd ~/rpmbuild/SOURCES&& wget http://downloads.sourceforge.net/project/docutils/docutils/0.5/docutils-0.5.tar.gz cd ~/rpmbuild/SPECS rpmbuild -bb --with 
modules --with default_version --with prefix=/opt/python-docutils/0.5/1.${disttag}.python2.4 python-docutils.spec

- pytz

  Build the following RPMS::

   cd ~/rpmbuild/SOURCES && wget http://pypi.python.org/packages/source/p/pytz/pytz-2008g.tar.gz
   cd ~/rpmbuild/SPECS
   rpmbuild -bb --with modules --with default_version --with prefix=/opt/pytz/2008g/1.${disttag}.python2.4 pytz.spec

- python-dateutil

  Build the following RPMS::

   cd ~/rpmbuild/SOURCES && wget http://pypi.python.org/packages/source/p/python-dateutil/python-dateutil-1.4.1.tar.gz
   cd ~/rpmbuild/SPECS
   rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-dateutil/1.4.1/3.${disttag}.python2.4 python-dateutil.spec

- python-matplotlib

  Build the following RPMS::

   cd ~/rpmbuild/SOURCES && wget http://downloads.sourceforge.net/project/matplotlib/matplotlib/matplotlib-0.98.5/matplotlib-0.98.5.3.tar.gz
   cd ~/rpmbuild/SPECS
   module load pytz
   module load python-docutils
   module load python-dateutil
   rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-matplotlib/0.98.5.3/1.${disttag}.python2.4 python-matplotlib.spec

- campos-ase2

  Build the following RPMS::

   cd ~/rpmbuild/SOURCES && wget --no-check-certificate \
       "https://wiki.fysik.dtu.dk/ase2/Download?action=AttachFile&do=get&target=campos-ase-2.3.13.tar.gz" \
       -O campos-ase-2.3.13.tar.gz
   cd ~/rpmbuild/SPECS
   rpmbuild -bb --with modules --with default_version --with prefix=/opt/campos-ase2/2.3.13/1.${disttag}.python2.4 campos-ase2.spec

- campos-dacapo-python

  Build the following RPMS::

   cd ~/rpmbuild/SOURCES && wget --no-check-certificate \
       "https://wiki.fysik.dtu.dk/dacapo/Installation?action=AttachFile&do=get&target=Dacapo-0.9.4.tar.gz" \
       -O Dacapo-0.9.4.tar.gz
   cd ~/rpmbuild/SPECS
   rpmbuild -bb --with modules --with default_version --with prefix=/opt/campos-dacapo-python/0.9.4/1.${disttag}.python2.4 campos-dacapo-python.spec

- fftw2

  We use version 2.1.5. Build the following RPMS::

   cd ~/rpmbuild/SOURCES && wget http://www.fftw.org/fftw-2.1.5.tar.gz
   cd ~/rpmbuild/SPECS

   compiler=gfortran
   compiler_version=4.1.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin

   compiler=gfortran43
   compiler_version=4.3.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin

   compiler=pathscale  # only on fjorm
   compiler_version=3.2 compiler_libdir=/opt/pathscale/lib/3.2 compiler_bindir=/opt/pathscale/bin

   # Note: 24 June 2009: ifort fails to build static fftw2
   compiler=ifort  # only on thul
   compiler_version=11.0 compiler_libdir=/opt/intel/Compiler/11.0/083/lib/intel64 compiler_bindir=/opt/intel/Compiler/11.0/083/bin/intel64

   compiler=open64  # only on fjorm
   compiler_version=4.2.1 compiler_libdir=/opt/open64/4.2.1/lib/gcc-lib/x86_64-open64-linux/4.2.1 compiler_bindir=/opt/open64/4.2.1/bin

   rpmbuild --bb --with compiler=${compiler} --with compiler_version=${compiler_version} \
            --with compiler_bindir=${compiler_bindir} \
            --with compiler_libdir=${compiler_libdir} \
            --with modules=${modules} --with default_version=1 \
            --with prefix=/opt/fftw2/2.1.5/12.${disttag}.${compiler}.${compiler_version} fftw2.spec

- campos-dacapo

  Note that FC's fftw version 3 is incompatible with `dacapo`.
  If you build only a serial version, add "--without parallel" to the rpmbuild options.
  Another useful option is "--without default_version", which installs neither the `/etc/profile.d` scripts nor the modules under `/opt/modulefiles`, allowing the package to be relocated.
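  As an illustration only (not a command taken verbatim from these instructions), a serial, relocatable variant would combine the two options above with the same ``--with`` settings used for the parallel builds below, roughly::

   # sketch: serial, relocatable campos-dacapo build; reuse the compiler/blas/lapack/netcdf
   # variables defined for the parallel builds below
   rpmbuild -bb --without parallel --without default_version \
       --with compiler=${compiler} --with compiler_version=${compiler_version} \
       --with modules=${modules} \
       --with prefix=/opt/campos-dacapo/2.7.16/${release} campos-dacapo.spec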
**Warning**: currently (9 Oct 2008) packaging of openmpi on FC8/FC9 requires the following link to be made:: ln -s /usr/include/openmpi/1.2.4-gcc/32/mpif-config.h /usr/include/openmpi/1.2.4-gcc/mpif-config.h **Warning**: currently (9 Oct 2008) the default build fails for EL4 due to double underlined symbols in the netcdf from `ATrpms`, however custom build (i.e. with building your won netcdf should work). Build the following RPMS:: cd ~/rpmbuild/SOURCES&& wget --no-check-certificate \ "https://wiki.fysik.dtu.dk/dacapo/Installation?action=AttachFile&do=get&target=campos-dacapo-2.7.16.tar.gz" \ -O campos-dacapo-2.7.16.tar.gz cd ~/rpmbuild/SPECS compiler=gfortran43 compiler_version=4.3.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin blas=acml blas_version=4.3.0 blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib lapack=acml lapack_version=4.3.0 lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib fftw=fftw2 fftw_libdir=/opt/${fftw}/2.1.5/12.${disttag}.${compiler}.${compiler_version}/lib64 netcdf=netcdf4 netcdf_includedir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/include netcdf_libdir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/lib64 openmpi_version=1.3.3 ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64 ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin release=1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}.${fftw} compiler=pathscale compiler_version=3.2 compiler_libdir=/opt/pathscale/lib/3.2 compiler_bindir=/opt/pathscale/bin blas=acml blas_version=4.0.1 blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib lapack=acml lapack_version=4.0.1 lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib fftw=fftw2 fftw_libdir=/opt/${fftw}/2.1.5/12.${disttag}.${compiler}.${compiler_version}/lib64 netcdf=netcdf4 netcdf_includedir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/include netcdf_libdir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/lib64 openmpi_version=1.3.3 ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64 ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin release=1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}.${fftw} compiler=gfortran compiler_version=4.1.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin blas=blas blas_version=3.0.37.el5 blasdir=/usr/lib64 lapack=lapack lapack_version=3.0.37.el5 lapackdir=/usr/lib64 fftw=fftw2 fftw_libdir=/opt/${fftw}/2.1.5/12.${disttag}.${compiler}.${compiler_version}/lib64 netcdf=netcdf4 netcdf_includedir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/include netcdf_libdir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/lib64 openmpi_version=1.3.3 ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64 ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin 
release=1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}.${fftw} rpmbuild --bb --with blas=${blas} --with blas_version=${blas_version} --with blasdir=${blasdir} \ --with compiler=${compiler} --with compiler_version=${compiler_version} \ --with compiler_bindir=${compiler_bindir} --with compiler_libdir=${compiler_libdir} \ --with fftw=${fftw} --with fftw_libdir=${fftw_libdir} \ --with lapack=${lapack} --with lapack_version=${lapack_version} --with lapackdir=${lapackdir} \ --with netcdf=${netcdf} --with netcdf_includedir=${netcdf_includedir} --with netcdf_libdir=${netcdf_libdir} \ --with modules=${modules} --with default_version=1 \ --with parallel=1 --with openmpi=openmpi --with openmpi_version=${openmpi_version} \ --with ompi_libdir=${ompi_libdir} --with ompi_includedir=${ompi_includedir} --with ompi_bindir=${ompi_bindir} \ --with prefix=/opt/campos-dacapo/2.7.16/${release} campos-dacapo.spec Optional builds:: compiler=ifort # only on thul # multinode jobs fail compiler_version=11.0 compiler_libdir=/opt/intel/Compiler/11.0/083/lib/intel64 compiler_bindir=/opt/intel/Compiler/11.0/083/bin/intel64 blas=mkl blas_version=10.1.3.027 blasdir=/opt/intel/${blas}/${blas_version}/lib/em64t lapack=mkl_lapack lapack_version=${blas_version} lapackdir=${blasdir} fftw=mkl fftw_libdir=${blasdir} netcdf=netcdf4 netcdf_includedir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/include netcdf_libdir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/lib64 openmpi_version=1.3.3 ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64 ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin release=1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}.${fftw} compiler=gfortran43 # only on thul compiler_version=4.3.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin blas=mkl blas_version=10.1.3.027 blasdir=/opt/intel/${blas}/${blas_version}/lib/em64t lapack=mkl_lapack lapack_version=${blas_version} lapackdir=${blasdir} fftw=mkl fftw_libdir=${blasdir} netcdf=netcdf4 netcdf_includedir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/include netcdf_libdir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/lib64 openmpi_version=1.3.3 ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64 ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin release=1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}.${fftw} compiler=open64 # only on fjorm compiler_version=4.2.1 compiler_libdir=/opt/open64/4.2.1/lib/gcc-lib/x86_64-open64-linux/4.2.1 compiler_bindir=/opt/open64/4.2.1/bin blas=acml blas_version=4.3.0 blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib lapack=acml lapack_version=4.3.0 lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib fftw=fftw2 fftw_libdir=/opt/${fftw}/2.1.5/12.${disttag}.${compiler}.${compiler_version}/lib64 netcdf=netcdf4 netcdf_includedir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/include 
netcdf_libdir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/lib64 openmpi_version=1.3.3 ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64 ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin release=1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}.${fftw} compiler=gfortran43 compiler_version=4.3.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin blas=goto blas_version=1.26 blasdir=/opt/${blas}/${blas_version}/1.${disttag}.${compiler}.${compiler_version}.smp/lib64 lapack=acml lapack_version=4.3.0 lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib fftw=fftw2 fftw_libdir=/opt/${fftw}/2.1.5/12.${disttag}.${compiler}.${compiler_version}/lib64 netcdf=netcdf4 netcdf_includedir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/include netcdf_libdir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/lib64 openmpi_version=1.3.3 ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64 ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin release=1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}.${fftw} compiler=open64 # only on fjorm compiler_version=4.2.1 compiler_libdir=/opt/open64/4.2.1/lib/gcc-lib/x86_64-open64-linux/4.2.1 compiler_bindir=/opt/open64/4.2.1/bin blas=goto blas_version=1.26 blasdir=/opt/${blas}/${blas_version}/1.${disttag}.${compiler}.${compiler_version}.smp/lib64 lapack=acml lapack_version=4.3.0 lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib fftw=fftw2 fftw_libdir=/opt/${fftw}/2.1.5/12.${disttag}.${compiler}.${compiler_version}/lib64 netcdf=netcdf4 netcdf_includedir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/include netcdf_libdir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/lib64 openmpi_version=1.3.3 ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64 ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin release=1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}.${fftw} Build following for gpaw ------------------------ - fftw3 Build the following RPMS:: cd ~/rpmbuild/SOURCES&& wget ftp://ftp.fftw.org/pub/fftw/fftw-3.2.1.tar.gz cd ~/rpmbuild/SPECS compiler=gfortran compiler_version=4.1.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin compiler=gfortran43 compiler_version=4.3.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin rpmbuild --bb --with major_version=3 --with version1=2 --with version2=1 \ --with compiler=${compiler} --with compiler_version=${compiler_version} \ --with compiler_bindir=${compiler_bindir} \ --with compiler_libdir=${compiler_libdir} \ --with modules=${modules} --with default_version=1 \ --with prefix=/opt/fftw3/3.2.1/12.${disttag}.${compiler}.${compiler_version} fftw2.spec - scipy Build the following RPMS:: cd ~/rpmbuild/SOURCES&& wget 
http://downloads.sourceforge.net/project/scipy/scipy/0.7.0/scipy-0.7.0.tar.gz cd ~/rpmbuild/SPECS module load numpy/1.3.0-1.el5.fys.gfortran.4.1.2.python2.4.blas.3.0.37.el5.lapack.3.0.37.el5 compiler=gfortran compiler_version=4.1.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin blas=blas blas_version=3.0.37.el5 blasdir=/usr/lib64 lapack=lapack lapack_version=3.0.37.el5 lapackdir=/usr/lib64 fftw=fftw3 fftw_libdir=/opt/${fftw}/3.2.1/12.${disttag}.${compiler}.${compiler_version}/lib64 cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.${compiler}.${compiler_version}.${blas}.${blas_version} release=1.${disttag}.${compiler}.${compiler_version}.python2.4.${blas}.${blas_version}.${lapack}.${lapack_version} compiler=gfortran compiler_version=4.1.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin blas=acml blas_version=4.0.1 blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib lapack=acml lapack_version=4.0.1 lapackdir=/opt/${blas}/${blas_version}/${compiler}64/lib fftw=fftw3 fftw_libdir=/opt/${fftw}/3.2.1/12.${disttag}.${compiler}.${compiler_version}/lib64 cblas_prefix=/opt/cblas/2.23.3/1.${disttag}.${compiler}.${compiler_version}.${blas}.${blas_version} release=1.${disttag}.${compiler}.${compiler_version}.python2.4.${blas}.${blas_version}.${lapack}.${lapack_version} rpmbuild --bb --with cblas_prefix=${cblas_prefix} \ --with blas=${blas} --with blas_version=${blas_version} --with blasdir=${blasdir} \ --with compiler=${compiler} --with compiler_version=${compiler_version} \ --with compiler_bindir=${compiler_bindir} --with compiler_libdir=${compiler_libdir} \ --with lapack=${lapack} --with lapack_version=${lapack_version} --with lapackdir=${lapackdir} \ --with ufsparse_libdir=/usr/lib64 --with ufsparse_includedir=/usr/include/suitesparse \ --with fftw=${fftw} --with fftw_libdir=${fftw_libdir} \ --with modules=${modules} --with default_version=1 \ --with prefix=/opt/scipy/0.7.0/${release} scipy.spec - campos-gpaw-setups Build the following RPMS:: cd ~/rpmbuild/SOURCES&& wget --no-check-certificate "http://wiki.fysik.dtu.dk/stuff/gpaw-setups-0.5.3574.tar.gz" cd ~/rpmbuild/SPECS rpmbuild -bb --with default_version --with modules campos-gpaw-setups.spec - povray Build the following RPMS:: cd ~/rpmbuild/SOURCES&& wget http://www.povray.org/redirect/www.povray.org/ftp/pub/povray/Official/Unix/povray-3.6.tar.bz2 mv povray-3.6.tar.bz2 povray-3.6.1.tar.bz2 cd ~/rpmbuild/SPECS rpmbuild -bb --with modules --with default_version --with prefix=/opt/povray/3.6.1/3.${disttag} povray.spec - python-jinja2 Build the following RPMS:: cd ~/rpmbuild/SOURCES&& wget http://pypi.python.org/packages/source/J/Jinja2/Jinja2-2.1.1.tar.gz cd ~/rpmbuild/SPECS rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-jinja2/2.1.1/1.${disttag}.python2.4 python-jinja2.spec - python-pygments Build the following RPMS:: cd ~/rpmbuild/SOURCES&& wget http://pypi.python.org/packages/source/P/Pygments/Pygments-1.0.tar.gz cd ~/rpmbuild/SPECS rpmbuild -bb --with modules --with default_version --with prefix=/opt/python-pygments/1.0/1.${disttag}.python2.4 python-pygments.spec - babel Build the following RPMS:: cd ~/rpmbuild/SOURCES&& wget http://ftp.edgewall.com/pub/babel/Babel-0.9.4.tar.bz2 cd ~/rpmbuild/SPECS rpmbuild -bb --with modules --with default_version --with prefix=/opt/babel/0.9.4/1.${disttag}.python2.4 babel.spec - python-sphinx Build the following RPMS:: cd ~/rpmbuild/SOURCES&& wget http://pypi.python.org/packages/source/S/Sphinx/Sphinx-0.6.1.tar.gz cd ~/rpmbuild/SPECS module load python-jinja2 rpmbuild -bb 
--with modules --with default_version --with prefix=/opt/python-sphinx/0.6.1/1.${disttag}.python2.4 python-sphinx.spec - campos-ase3 Build the following RPMS:: cd ~/rpmbuild/SOURCES&& wget --no-check-certificate \ "https://wiki.fysik.dtu.dk/ase-files/python-ase-3.1.0.846.tar.gz" \ -O python-ase-3.1.0.846.tar.gz cd ~/rpmbuild/SPECS rpmbuild -bb --with modules --with default_version --with prefix=/opt/campos-ase3/3.1.0.846/1.${disttag}.python2.4 campos-ase3.spec Snapshot package is built with, e.g.:: version1=2 version2=0 version_svn=1066 release=3.${version1}.${version2}.${version_svn}/1.${disttag}.python2.4 rpmbuild -bb --with version1=${version1} --with version2=${version2} --with version_svn=${version_svn} --with keep_install=1 \ --with modules --with default_version \ --with prefix=/home/camp/rpmbuild/opt/Intel-Nehalem-el5/campos-ase3/${release} campos-ase3.spec rm /home/camp/rpmbuild/Intel-Nehalem-el5/RPMS/campos-ase3-3.2.0.1066-1.el5.fys.python2.4.x86_64.rpm dir=~/Intel-Nehalem-el5/BUILD/campos-ase3-3.${version1}.${version2}.${version_svn}-1.${disttag}.python2.4-root ln -s ${dir}/etc/modulefiles/campos-ase3/3.${version1}.${version2}.${version_svn}-1.${disttag}.python2.4 \ ~/Intel-Nehalem-el5/modulefiles.testing/campos-ase3 ln -s ${dir}/home/camp/rpmbuild/opt/Intel-Nehalem-el5/campos-ase3/3.${version1}.${version2}.${version_svn} ~/opt/Intel-Nehalem-el5/campos-ase3 - blacs `BLACS `_ is used by ScaLapack. Build the following RPMS:: cd ~/rpmbuild/SOURCES wget http://www.netlib.org/blacs/mpiblacs.tgz wget http://www.netlib.org/blacs/blacstester.tgz cd ~/rpmbuild/SPECS compiler=gfortran compiler_version=4.1.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin openmpi_version=1.3.3 ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64 ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin compiler=gfortran43 compiler_version=4.3.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin openmpi_version=1.3.3 ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64 ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin compiler=pathscale compiler_version=3.2 compiler_libdir=/opt/pathscale/lib/3.2 compiler_bindir=/opt/pathscale/bin openmpi_version=1.3.3 ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64 ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin compiler=open64 compiler_version=4.2.1 compiler_libdir=/opt/open64/4.2.1/lib/gcc-lib/x86_64-open64-linux/4.2.1 compiler_bindir=/opt/open64/4.2.1/bin openmpi_version=1.3.3 ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64 ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin rpmbuild --bb --with compiler=${compiler} --with compiler_version=${compiler_version} \ --with compiler_bindir=${compiler_bindir} --with compiler_libdir=${compiler_libdir} \ --with modules=${modules} --with default_version=1 \ --with parallel=1 --with 
openmpi=openmpi --with openmpi_version=${openmpi_version} \ --with ompi_libdir=${ompi_libdir} --with ompi_includedir=${ompi_includedir} --with ompi_bindir=${ompi_bindir} \ --with prefix=/opt/blacs/1.1/24.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version} blacs.spec - scalapack `SCALAPACK `_. Build the following RPMS:: cd ~/rpmbuild/SOURCES&& wget http://www.netlib.org/scalapack/scalapack-1.8.0.tgz cd ~/rpmbuild/SPECS compiler=gfortran43 compiler_version=4.3.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin blas=acml blas_version=4.3.0 blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib lapack=acml lapack_version=4.3.0 lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib openmpi_version=1.3.3 ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64 ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin release=1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version} compiler=gfortran43 compiler_version=4.3.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin blas=goto blas_version=1.26 blasdir=/opt/${blas}/${blas_version}/1.${disttag}.${compiler}.${compiler_version}.smp/lib64 lapack=acml lapack_version=4.3.0 lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib openmpi_version=1.3.3 ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64 ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin release=1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version} compiler=open64 compiler_version=4.2.1 compiler_libdir=/opt/open64/4.2.1/lib/gcc-lib/x86_64-open64-linux/4.2.1 compiler_bindir=/opt/open64/4.2.1/bin blas=acml blas_version=4.3.0 blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib lapack=acml lapack_version=4.3.0 lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib openmpi_version=1.3.3 ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64 ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin release=1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version} compiler=open64 compiler_version=4.2.1 compiler_libdir=/opt/open64/4.2.1/lib/gcc-lib/x86_64-open64-linux/4.2.1 compiler_bindir=/opt/open64/4.2.1/bin blas=goto blas_version=1.26 blasdir=/opt/${blas}/${blas_version}/1.${disttag}.${compiler}.${compiler_version}.smp/lib64 lapack=acml lapack_version=4.3.0 lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib openmpi_version=1.3.3 ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64 ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin release=1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version} compiler=pathscale compiler_version=3.2 
compiler_libdir=/opt/pathscale/lib/3.2 compiler_bindir=/opt/pathscale/bin blas=acml blas_version=4.0.1 blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib lapack=acml lapack_version=4.0.1 lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib openmpi_version=1.3.3 ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64 ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin release=1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version} compiler=gfortran43 compiler_version=4.3.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin blas=blas blas_version=3.0.37.el5 blasdir=/usr/lib64 lapack=lapack lapack_version=3.0.37.el5 lapackdir=/usr/lib64 openmpi_version=1.3.3 ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64 ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin release=1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version} compiler=gfortran compiler_version=4.1.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin blas=blas blas_version=3.0.37.el5 blasdir=/usr/lib64 lapack=lapack lapack_version=3.0.37.el5 lapackdir=/usr/lib64 openmpi_version=1.3.3 ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64 ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin release=1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version} rpmbuild --bb --with blacs=blacs --with blacsdir=/opt/blacs/1.1/24.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}/lib64 \ --with blas=${blas} --with blas_version=${blas_version} --with blasdir=${blasdir} \ --with compiler=${compiler} --with compiler_version=${compiler_version} \ --with compiler_bindir=${compiler_bindir} --with compiler_libdir=${compiler_libdir} \ --with lapack=${lapack} --with lapack_version=${lapack_version} --with lapackdir=${lapackdir} \ --with modules=${modules} --with default_version=1 \ --with parallel=1 --with openmpi=openmpi --with openmpi_version=${openmpi_version} \ --with ompi_libdir=${ompi_libdir} --with ompi_includedir=${ompi_includedir} --with ompi_bindir=${ompi_bindir} \ --with prefix=/opt/scalapack/1.8.0/${release} scalapack.spec - campos-gpaw Build the following RPMS:: cd ~/rpmbuild/SOURCES&& wget --no-check-certificate \ "https://wiki.fysik.dtu.dk/gpaw-files/gpaw-0.5.3667.tar.gz" \ -O gpaw-0.5.3667.tar.gz cd ~/rpmbuild/SPECS compiler=gfortran43 # only on fjorm compiler_version=4.3.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin blas=goto blas_version=1.26 blasdir=/opt/${blas}/${blas_version}/1.${disttag}.${compiler}.${compiler_version}.smp/lib64 lapack=acml lapack_version=4.3.0 lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib openmpi_version=1.3.3 ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64 ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include 
ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin blacs=blacs blacsdir=/opt/blacs/1.1/24.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}/lib64 scalapack=scalapack scalapackdir=/opt/scalapack/1.8.0/1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}/lib64 release=1.${disttag}.${compiler}.${compiler_version}.python2.4.without_ase.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version} compiler=pathscale # only on fjorm # note --with sl_second_underscore=1 is necessary for rpmbuild compiler_version=3.2 compiler_libdir=/opt/pathscale/lib/3.2 compiler_bindir=/opt/pathscale/bin blas=goto blas_version=1.26 blasdir=/opt/${blas}/${blas_version}/1.${disttag}.${compiler}.${compiler_version}.smp/lib64 lapack=acml lapack_version=4.0.1 lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib openmpi_version=1.3.3 ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64 ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin blacs=blacs blacsdir=/opt/blacs/1.1/24.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}/lib64 scalapack=scalapack #scalapackdir=/opt/scalapack/1.8.0/1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}/lib64 scalapackdir=/opt/scalapack/1.8.0/1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.acml.4.0.1.${lapack}.${lapack_version}/lib64 release=1.${disttag}.${compiler}.${compiler_version}.python2.4.without_ase.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version} compiler=gfortran43 compiler_version=4.3.2 compiler_libdir=/usr/lib64 compiler_bindir=/usr/bin blas=blas blas_version=3.0.37.el5 blasdir=/usr/lib64 lapack=lapack lapack_version=3.0.37.el5 lapackdir=/usr/lib64 openmpi_version=1.3.3 ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64 ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin blacs=blacs blacsdir=/opt/blacs/1.1/24.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}/lib64 scalapack=scalapack scalapackdir=/opt/scalapack/1.8.0/1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}/lib64 release=1.${disttag}.${compiler}.${compiler_version}.python2.4.without_ase.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version} rpmbuild --bb --with blas=${blas} --with blas_version=${blas_version} --with blasdir=${blasdir} \ --with compiler=${compiler} --with compiler_version=${compiler_version} \ --with compiler_bindir=${compiler_bindir} --with compiler_libdir=${compiler_libdir} \ --with lapack=${lapack} --with lapack_version=${lapack_version} --with lapackdir=${lapackdir} \ --with modules=${modules} --with default_version=1 \ --with parallel=1 --with openmpi=openmpi --with openmpi_version=${openmpi_version} \ --with ompi_libdir=${ompi_libdir} --with ompi_includedir=${ompi_includedir} --with ompi_bindir=${ompi_bindir} \ --with blacs=${blacs} --with blacsdir=${blacsdir} \ --with scalapack=${scalapack} --with 
    --with prefix=/opt/campos-gpaw/0.5.3667/${release} campos-gpaw.spec

**Note**: the open64 compiler fails with::

    /usr/bin/ld: /opt/open64/4.2.1/lib/gcc-lib/x86_64-open64-linux/4.2.1/libmv.a(vcos.o): relocation R_X86_64_32S
    against `a local symbol' can not be used when making a shared object; recompile with -fPIC
    /opt/open64/4.2.1/lib/gcc-lib/x86_64-open64-linux/4.2.1/libmv.a: could not read symbols: Bad value

Optional builds::

    compiler=gfortran
    compiler_version=4.1.2
    compiler_libdir=/usr/lib64
    compiler_bindir=/usr/bin
    blas=blas
    blas_version=3.0.37.el5
    blasdir=/usr/lib64
    lapack=lapack
    lapack_version=3.0.37.el5
    lapackdir=/usr/lib64
    openmpi_version=1.3.3
    ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64
    ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include
    ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin
    blacs=blacs
    blacsdir=/opt/blacs/1.1/24.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}/lib64
    scalapack=scalapack
    scalapackdir=/opt/scalapack/1.8.0/1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}/lib64
    release=1.${disttag}.${compiler}.${compiler_version}.python2.4.without_ase.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}

    compiler=gfortran43  # fails on thul
    compiler_version=4.3.2
    compiler_libdir=/usr/lib64
    compiler_bindir=/usr/bin
    blas=acml
    blas_version=4.3.0
    blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib
    lapack=acml
    lapack_version=4.3.0
    lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib
    openmpi_version=1.3.3
    ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64
    ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include
    ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin
    blacs=blacs
    blacsdir=/opt/blacs/1.1/24.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}/lib64
    scalapack=scalapack
    scalapackdir=/opt/scalapack/1.8.0/1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}/lib64
    release=1.${disttag}.${compiler}.${compiler_version}.python2.4.without_ase.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}

    compiler=gfortran
    compiler_version=4.1.2
    compiler_libdir=/usr/lib64
    compiler_bindir=/usr/bin
    blas=acml
    blas_version=4.0.1
    blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib
    lapack=acml
    lapack_version=4.0.1
    lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib
    openmpi_version=1.3.3
    ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64
    ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include
    ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin
    blacs=blacs
    blacsdir=/opt/blacs/1.1/24.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}/lib64
    scalapack=scalapack
    scalapackdir=/opt/scalapack/1.8.0/1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}/lib64
    release=1.${disttag}.${compiler}.${compiler_version}.python2.4.without_ase.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}

    compiler=ifort  # fails on thul
    compiler_version=11.0
    compiler_libdir=/opt/intel/Compiler/11.0/083/lib/intel64
    compiler_bindir=/opt/intel/Compiler/11.0/083/bin/intel64
    blas=mkl
    blas_version=10.1.3.027
    blasdir=/opt/intel/${blas}/${blas_version}/lib/em64t
    lapack=mkl_lapack
    lapack_version=10.1.3.027
    lapackdir=${blasdir}
    openmpi_version=1.3.3
    ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64
    ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include
    ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin
    blacs=mkl
    blacsdir=${blasdir}
    scalapack=mkl
    scalapackdir=${blasdir}
    release=1.${disttag}.${compiler}.${compiler_version}.python2.4.without_ase.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}

    compiler=gfortran43  # only on thul # fails on thul
    compiler_version=4.3.2
    compiler_libdir=/usr/lib64
    compiler_bindir=/usr/bin
    blas=mkl
    blas_version=10.1.3.027
    blasdir=/opt/intel/${blas}/${blas_version}/lib/em64t
    lapack=mkl_lapack
    lapack_version=10.1.3.027
    lapackdir=${blasdir}
    openmpi_version=1.3.3
    ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64
    ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include
    ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin
    blacs=mkl
    blacsdir=${blasdir}
    scalapack=mkl
    scalapackdir=${blasdir}
    release=1.${disttag}.${compiler}.${compiler_version}.python2.4.without_ase.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}

    compiler=gfortran43  # only on thul
    compiler_version=4.3.2
    compiler_libdir=/usr/lib64
    compiler_bindir=/usr/bin
    blas=mkl
    blas_version=10.2.1.017
    blasdir=/opt/intel/${blas}/${blas_version}/lib/em64t
    lapack=mkl_lapack
    lapack_version=10.2.1.017
    lapackdir=${blasdir}
    openmpi_version=1.3.3
    ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64
    ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include
    ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin
    blacs=mkl
    blacsdir=${blasdir}
    scalapack=mkl
    scalapackdir=${blasdir}
    release=1.${disttag}.${compiler}.${compiler_version}.python2.4.without_ase.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}

Build the following for Asap
----------------------------

- Asap

  Build the following RPMS::

    cd ~/rpmbuild/SOURCES && wget "http://dcwww.camp.dtu.dk/campos/download/Asap-3.2.0.tar.gz"
    cd ~/rpmbuild/SPECS

    module load ASE3

    compiler=pathscale  # only on fjorm
    compiler_version=3.2
    compiler_libdir=/opt/pathscale/lib/3.2
    compiler_bindir=/opt/pathscale/bin
    openmpi_version=1.3.3
    ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64
    ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include
    ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin
    release=1.${disttag}.${compiler}.${compiler_version}.python2.4.openmpi.${openmpi_version}

    compiler=open64  # only on fjorm # 24 Aug 2009 fails
    compiler_version=4.2.1
    compiler_libdir=/opt/open64/4.2.1/lib/gcc-lib/x86_64-open64-linux/4.2.1
    compiler_bindir=/opt/open64/4.2.1/bin
    openmpi_version=1.3.3
    ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64
    ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include
    ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin
    release=1.${disttag}.${compiler}.${compiler_version}.python2.4.openmpi.${openmpi_version}

    compiler=gfortran43
    compiler_version=4.3.2
    compiler_libdir=/usr/lib64
    compiler_bindir=/usr/bin
    openmpi_version=1.3.3
    ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64
    ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include
    ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin
    release=1.${disttag}.${compiler}.${compiler_version}.python2.4.openmpi.${openmpi_version}

    module load ifort/11.0-1

    compiler=ifort  # only thul
    compiler_version=11.0
    compiler_libdir=/opt/intel/Compiler/11.0/083/lib/intel64
    compiler_bindir=/opt/intel/Compiler/11.0/083/bin/intel64
    openmpi_version=1.3.3
    ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64
    ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include
    ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin
    release=1.${disttag}.${compiler}.${compiler_version}.python2.4.openmpi.${openmpi_version}

    rpmbuild --bb --with compiler=${compiler} --with compiler_version=${compiler_version} \
    --with compiler_bindir=${compiler_bindir} --with compiler_libdir=${compiler_libdir} \
    --with modules=${modules} --with default_version=1 \
    --with parallel=1 --with openmpi=openmpi --with openmpi_version=${openmpi_version} \
    --with ompi_libdir=${ompi_libdir} --with ompi_includedir=${ompi_includedir} --with ompi_bindir=${ompi_bindir} \
    --with prefix=/opt/campos-asap3/3.2.2/${release} campos-asap.spec

Build the following for abinit
------------------------------

.. _abinit-pseudopotentials-1.tar.gz: attachment:abinit-pseudopotentials-1.tar.gz

abinit-pseudopotentials-1.tar.gz_

- abinit pseudopotentials

  Build the following RPMS::

    cd ~/rpmbuild/SOURCES && wget --no-check-certificate \
    "https://wiki.fysik.dtu.dk/niflheim/Cluster_software_-_RPMS?action=AttachFile&do=get&target=abinit-pseudopotentials-1.tar.gz" \
    -O abinit-pseudopotentials-1.tar.gz
    cd ~/rpmbuild/SPECS

    rpmbuild -bb --with modules --with default_version abinit-pseudopotentials.spec

- abinit

  `abinit `_ is a Density Functional Theory (DFT) package with pseudopotentials and a planewave basis.
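  The abinit recipe below (like the espresso, yambo and octopus recipes further down) links against the fftw3 and netcdf4 trees built earlier. As a hedged sketch (not part of the original recipe, assuming only the variable names used below), once the ``*dir`` variables are set you can fail early on a missing dependency tree before calling rpmbuild::

    # hypothetical pre-flight check; the variable names match the recipe below
    for d in "${blasdir}" "${lapackdir}" "${fftw_libdir}" "${netcdf_libdir}" "${ompi_bindir}"; do
        test -d "${d}" || echo "missing dependency directory: ${d}"
    done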
  Build the following RPMS::

    cd ~/rpmbuild/SOURCES && wget ftp://ftp.abinit.org/pub/abinitio/ABINIT_v5.4.4/abinit-5.4.4p.tar.gz
    cd ~/rpmbuild/SPECS

    compiler=gfortran43
    compiler_version=4.3.2
    compiler_libdir=/usr/lib64
    compiler_bindir=/usr/bin
    blas=acml
    blas_version=4.3.0
    blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib
    lapack=acml
    lapack_version=4.3.0
    lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib
    fftw=fftw3
    fftw_libdir=/opt/${fftw}/3.2.1/12.${disttag}.${compiler}.${compiler_version}/lib64
    netcdf=netcdf4
    netcdf_includedir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/include
    netcdf_libdir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/lib64
    openmpi_version=1.3.3
    ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64
    ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include
    ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin
    release=1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}.${fftw}

    rpmbuild --bb --with blas=${blas} --with blas_version=${blas_version} --with blasdir=${blasdir} \
    --with compiler=${compiler} --with compiler_version=${compiler_version} \
    --with compiler_bindir=${compiler_bindir} --with compiler_libdir=${compiler_libdir} \
    --with fftw=${fftw} --with fftw_libdir=${fftw_libdir} \
    --with lapack=${lapack} --with lapack_version=${lapack_version} --with lapackdir=${lapackdir} \
    --with netcdf=${netcdf} --with netcdf_includedir=${netcdf_includedir} --with netcdf_libdir=${netcdf_libdir} \
    --with modules=${modules} --with default_version=1 \
    --with parallel=1 --with openmpi=openmpi --with openmpi_version=${openmpi_version} \
    --with ompi_libdir=${ompi_libdir} --with ompi_includedir=${ompi_includedir} --with ompi_bindir=${ompi_bindir} \
    --with prefix=/opt/abinit/5.4.4p/${release} abinit.spec

Build the following for espresso
--------------------------------

- espresso pseudopotentials

  Build the following RPMS::

    cd ~/rpmbuild/SOURCES && wget http://www.quantum-espresso.org/pseudo/espresso_pp.tar.gz
    cd ~/rpmbuild/SPECS

    rpmbuild -bb --with modules --with default_version espresso_pp.spec

- espresso

  Build the following RPMS::

    cd ~/rpmbuild/SOURCES && wget http://www.pwscf.org/downloads/PWversion/4.0.5/espresso-4.0.5.tar.gz
    cd ~/rpmbuild/SPECS

    compiler=gfortran43
    compiler_version=4.3.2
    compiler_libdir=/usr/lib64
    compiler_bindir=/usr/bin
    blas=acml
    blas_version=4.3.0
    blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib
    lapack=acml
    lapack_version=4.3.0
    lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib
    fftw=fftw3
    fftw_libdir=/opt/${fftw}/3.2.1/12.${disttag}.${compiler}.${compiler_version}/lib64
    netcdf=netcdf4
    netcdf_includedir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/include
    netcdf_libdir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/lib64
    openmpi_version=1.3.3
    ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64
    ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include
    ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin
    blacs=blacs
    blacsdir=/opt/blacs/1.1/24.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}/lib64
    scalapack=scalapack
    scalapackdir=/opt/scalapack/1.8.0/1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}/lib64
    release=1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}.${fftw}

    rpmbuild --bb --with blas=${blas} --with blas_version=${blas_version} --with blasdir=${blasdir} \
    --with compiler=${compiler} --with compiler_version=${compiler_version} \
    --with compiler_bindir=${compiler_bindir} --with compiler_libdir=${compiler_libdir} \
    --with fftw=${fftw} --with fftw_libdir=${fftw_libdir} \
    --with lapack=${lapack} --with lapack_version=${lapack_version} --with lapackdir=${lapackdir} \
    --with netcdf=${netcdf} --with netcdf_includedir=${netcdf_includedir} --with netcdf_libdir=${netcdf_libdir} \
    --with modules=${modules} --with default_version=1 \
    --with parallel=1 --with openmpi=openmpi --with openmpi_version=${openmpi_version} \
    --with ompi_libdir=${ompi_libdir} --with ompi_includedir=${ompi_includedir} --with ompi_bindir=${ompi_bindir} \
    --with blacs=${blacs} --with blacsdir=${blacsdir} \
    --with scalapack=${scalapack} --with scalapackdir=${scalapackdir} \
    --with prefix=/opt/espresso/4.0.5/${release} espresso.spec

Build the following for yambo
-----------------------------

- yambo

  `yambo `_ is a FORTRAN/C code for Many-Body calculations in solid state and molecular physics.
  Yambo relies on the Kohn-Sham wavefunctions generated by two public DFT codes: abinit and PWscf.
  This build creates interfaces to abinit and PWscf.

  Build the following RPMS::

    cd ~/rpmbuild/SOURCES && wget "http://www.yambo-code.org/counter/click.php?id=26"
    cd ~/rpmbuild/SPECS

    compiler=gfortran43
    compiler_version=4.3.2
    compiler_libdir=/usr/lib64
    compiler_bindir=/usr/bin
    blas=acml
    blas_version=4.3.0
    blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib
    lapack=acml
    lapack_version=4.3.0
    lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib
    fftw=fftw3
    fftw_libdir=/opt/${fftw}/3.2.1/12.${disttag}.${compiler}.${compiler_version}/lib64
    netcdf=netcdf4
    netcdf_includedir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/include
    netcdf_libdir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/lib64
    openmpi_version=1.3.3
    ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64
    ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include
    ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin
    blacs=blacs
    blacsdir=/opt/blacs/1.1/24.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}/lib64
    scalapack=scalapack
    scalapackdir=/opt/scalapack/1.8.0/1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}/lib64
    release=1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}.${fftw}

    rpmbuild --bb --with blas=${blas} --with blas_version=${blas_version} --with blasdir=${blasdir} \
    --with compiler=${compiler} --with compiler_version=${compiler_version} \
    --with compiler_bindir=${compiler_bindir} --with compiler_libdir=${compiler_libdir} \
    --with fftw=${fftw} --with fftw_libdir=${fftw_libdir} \
    --with lapack=${lapack} --with lapack_version=${lapack_version} --with lapackdir=${lapackdir} \
    --with netcdf=${netcdf} --with netcdf_includedir=${netcdf_includedir} --with netcdf_libdir=${netcdf_libdir} \
    --with modules=${modules} --with default_version=1 \
    --with parallel=1 --with openmpi=openmpi --with openmpi_version=${openmpi_version} \
    --with ompi_libdir=${ompi_libdir} --with ompi_includedir=${ompi_includedir} --with ompi_bindir=${ompi_bindir} \
    --with blacs=${blacs} --with blacsdir=${blacsdir} \
    --with scalapack=${scalapack} --with scalapackdir=${scalapackdir} \
    --with iotkdir=/opt/espresso/4.0.5/${release}/share/espresso/iotk \
    --with prefix=/opt/yambo/3.2.1.448/${release} yambo.spec

Build the following for elk
---------------------------

- elk species

  Build the following RPMS::

    cd ~/rpmbuild/SOURCES && wget http://downloads.sourceforge.net/project/elk/elk/0.9.262/elk-0.9.262.tgz
    cd ~/rpmbuild/SPECS

    rpmbuild -bb --with modules --with default_version elk-species.spec

- elk

  Build the following RPMS::

    cd ~/rpmbuild/SOURCES && wget http://downloads.sourceforge.net/project/elk/elk/0.9.262/elk-0.9.262.tgz
    cd ~/rpmbuild/SPECS

    thread_mode=nomp

    compiler=gfortran43
    compiler_version=4.3.2
    compiler_libdir=/usr/lib64
    compiler_bindir=/usr/bin
    blas=acml
    blas_version=4.3.0
    blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib
    lapack=acml
    lapack_version=4.3.0
    lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib
    release=1.${disttag}.${compiler}.${compiler_version}.${thread_mode}.${blas}.${blas_version}.${lapack}.${lapack_version}

    compiler=gfortran43
    compiler_version=4.3.2
    compiler_libdir=/usr/lib64
    compiler_bindir=/usr/bin
    blas=blas
    blas_version=3.0.37.el5
    blasdir=/usr/lib64
    lapack=lapack
    lapack_version=3.0.37.el5
    lapackdir=/usr/lib64
    release=1.${disttag}.${compiler}.${compiler_version}.${thread_mode}.${blas}.${blas_version}.${lapack}.${lapack_version}

    # multithreaded
    thread_mode=mp
    compiler=gfortran43
    compiler_version=4.3.2
    compiler_libdir=/usr/lib64
    compiler_bindir=/usr/bin
    blas=acml
    blas_version=4.3.0
    blasdir=/opt/${blas}/${blas_version}/${compiler}64_${thread_mode}/lib
    lapack=acml
    lapack_version=4.3.0
    lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64_${thread_mode}/lib
    release=1.${disttag}.${compiler}.${compiler_version}.${thread_mode}.${blas}.${blas_version}.${lapack}.${lapack_version}

    # multithreaded
    thread_mode=mp
    compiler=gfortran43  # only on fjorm # sometimes freezes in threaded mode
    compiler_version=4.3.2
    compiler_libdir=/usr/lib64
    compiler_bindir=/usr/bin
    blas=goto
    blas_version=1.26
    blasdir=/opt/${blas}/${blas_version}/1.${disttag}.${compiler}.${compiler_version}.smp/lib64
    lapack=acml
    lapack_version=4.3.0
    lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64_${thread_mode}/lib
    release=1.${disttag}.${compiler}.${compiler_version}.${thread_mode}.${blas}.${blas_version}.${lapack}.${lapack_version}

    rpmbuild --bb --with blas=${blas} --with blas_version=${blas_version} --with blasdir=${blasdir} \
    --with compiler=${compiler} --with compiler_version=${compiler_version} \
    --with compiler_bindir=${compiler_bindir} --with compiler_libdir=${compiler_libdir} \
    --with lapack=${lapack} --with lapack_version=${lapack_version} --with lapackdir=${lapackdir} \
    --with modules=${modules} --with default_version=1 \
    --with thread_mode=${thread_mode} --with parallel=1 \
    --with prefix=/opt/elk/0.9.262/${release} elk.spec

Build the following for vtk
---------------------------

- cmake

  Build the following RPMS::

    cd ~/rpmbuild/SOURCES && wget http://www.cmake.org/files/v2.6/cmake-2.6.4.tar.gz
    cd ~/rpmbuild/SOURCES && wget "http://downloads.sourceforge.net/project/xmlrpc-c/Xmlrpc-c Super Stable/1.06.35/xmlrpc-c-1.06.35.tgz"
    cd ~/rpmbuild/SPECS

    rpmbuild -bb --with bootstrap --without gui cmake.spec  # install the resulting RPM
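    # note: the cmake built above with --with bootstrap is temporary -- after
    # xmlrpc-c is built and installed below, that cmake is removed (rpm -e cmake)
    # and cmake is rebuilt without the bootstrap option (presumably so the final
    # build can pick up the freshly installed xmlrpc-c)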
    rpmbuild -bb xmlrpc-c.spec  # install the resulting RPM
    rpm -e cmake
    rpmbuild -bb --without gui cmake.spec  # install the resulting RPM

- vtkdata

  Build the following RPMS::

    cd ~/rpmbuild/SOURCES && wget http://www.vtk.org/files/release/5.4/vtkdata-5.4.2.tar.gz
    cd ~/rpmbuild/SPECS

    rpmbuild -bb --with version1=4 --with version2=2 --with modules=1 --with default_version=1 \
    --with prefix=/opt/vtkdata/5.4.2/1.${disttag} vtkdata.spec

- vtk

  Build the following RPMS::

    cd ~/rpmbuild/SOURCES && wget http://www.vtk.org/files/release/5.4/vtk-5.4.2.tar.gz
    cd ~/rpmbuild/SPECS

    compiler=gfortran
    compiler_version=4.1.2
    compiler_libdir=/usr/lib64
    compiler_bindir=/usr/bin
    openmpi_version=1.3.3
    ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64
    ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include
    ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin

    rpmbuild -bb --without qt4 --with version1=4 --with version2=2 \
    --with compiler=${compiler} --with compiler_version=${compiler_version} \
    --with compiler_bindir=${compiler_bindir} --with compiler_libdir=${compiler_libdir} \
    --with modules=${modules} --with default_version=1 \
    --with parallel=1 --with openmpi=openmpi --with openmpi_version=${openmpi_version} \
    --with ompi_libdir=${ompi_libdir} --with ompi_includedir=${ompi_includedir} --with ompi_bindir=${ompi_bindir} \
    --with prefix=/opt/vtk/5.4.2/1.${disttag}.${compiler}.${compiler_version}.python2.4.openmpi.${openmpi_version} vtk.spec

Build the following for octopus
-------------------------------

- octopus

  Build the following RPMS::

    cd ~/rpmbuild/SOURCES && wget "http://www.tddft.org/programs/octopus/down.php?file=3.1.0/octopus-3.1.0.tar.gz"
    cd ~/rpmbuild/SPECS

    compiler=gfortran43
    compiler_version=4.3.2
    compiler_libdir=/usr/lib64
    compiler_bindir=/usr/bin
    blas=acml
    blas_version=4.3.0
    blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib
    lapack=acml
    lapack_version=4.3.0
    lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib
    fftw=fftw3
    fftw_libdir=/opt/${fftw}/3.2.1/12.${disttag}.${compiler}.${compiler_version}/lib64
    netcdf=netcdf4
    netcdf_includedir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/include
    netcdf_libdir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/lib64
    openmpi_version=1.3.3
    ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/lib64
    ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/include
    ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag}.${compiler}.${compiler_version}/bin
    gsl_libdir=/usr/lib64
    gsl_includedir=/usr/include/gsl
    gsl_bindir=/usr/bin
    arpack=arpack
    arpackdir=/usr/lib64
    release=1.${disttag}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}.${fftw}

    rpmbuild --bb --with blas=${blas} --with blas_version=${blas_version} --with blasdir=${blasdir} \
    --with compiler=${compiler} --with compiler_version=${compiler_version} \
    --with compiler_bindir=${compiler_bindir} --with compiler_libdir=${compiler_libdir} \
    --with fftw=${fftw} --with fftw_libdir=${fftw_libdir} \
    --with lapack=${lapack} --with lapack_version=${lapack_version} --with lapackdir=${lapackdir} \
    --with netcdf=${netcdf} --with netcdf_includedir=${netcdf_includedir} --with netcdf_libdir=${netcdf_libdir} \
    --with modules=${modules} --with default_version=1 \
    --with parallel=1 --with openmpi=openmpi --with openmpi_version=${openmpi_version} \
    --with ompi_libdir=${ompi_libdir} --with ompi_includedir=${ompi_includedir} --with ompi_bindir=${ompi_bindir} \
    --with arpack=${arpack} --with arpackdir=${arpackdir} \
    --with gsl_libdir=${gsl_libdir} --with gsl_includedir=${gsl_includedir} --with gsl_bindir=${gsl_bindir} \
    --with prefix=/opt/octopus/3.1.0/${release} octopus.spec

Testing packages
================

Test the `dacapo` installation (as a **normal** user!). If you use modules::

    module load campos-dacapo
    # fulfill all dependencies requested by module
    ulimit -s 500000  # dacapo needs a large stack

Test with (make sure that `/scratch/$USER` exists)::

    cp -r `rpm -ql campos-dacapo-python | grep "share/campos-dacapo-python$"` /tmp
    cd /tmp/campos-dacapo-python/Tests
    python test.py 2>&1 | tee test.log

It can take up to 1 day. Please consider disabling these "long" tests in `test.py`::

    tests.remove('../Examples/Wannier-ethylene.py')
    tests.remove('../Examples/Wannier-Pt4.py')
    tests.remove('../Examples/Wannier-Ptwire.py')
    tests.remove('../Examples/Wannier-Fe-bcc.py')
    tests.remove('../Examples/transport_1dmodel.py')

**Note**: all `vtk`-related tests will fail.

Test the `gpaw` installation (as a **normal** user!). If you use modules::

    module load campos-gpaw
    # fulfill all dependencies requested by module

Test with::

    cp -r `rpm -ql campos-gpaw | grep "share/campos-gpaw/test$"` /tmp/test.gpaw.$$
    cd /tmp/test.gpaw.*
    python test.py 2>&1 | tee test.log

It takes about 20 minutes.

On "Golden Client"
------------------

Login, as root, to the "Golden Client"::

    ssh n001

Enable the NFS mount of the `server` home directory - follow 'Enable nfs mount on the "Golden Client"' from `configuring NFS `_. After this do::

    cd /home/dulak-server/rpm/campos
    rpm -ivh campos-dacapo-2*

If getting::

    package example_package.el5.i386 is already installed

remove these packages with::

    rpm -e --nodeps example_package

to allow the installation to proceed. Make sure that both python-numeric versions are installed::

    rpm -q python-numeric

This command will show a list of packages that need to be installed to fulfill `dacapo` dependencies. All these packages should already be under ``/home/dulak-server/rpm``.

Remember to test the `dacapo` and `gpaw` installations on the "Golden Client" too.

If you are installing a workstation only, your setup is ready for testing - go to `benchmarking and maintenance `_. If you are building a cluster, go **back** to `installing and configuring systemimager `_.
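As an optional final check on the "Golden Client", a loop along these lines reports which RPMs exported by the server are still missing on the node (a hedged sketch, not part of the original instructions; the repository path is the one used above, adjust it to your layout)::

    # hedged sketch: list RPMs under the server repository that are not yet
    # installed on this node (run as root on the Golden Client)
    for f in /home/dulak-server/rpm/*/*.rpm; do
        p=$(rpm -qp --queryformat '%{NAME}-%{VERSION}-%{RELEASE}.%{ARCH}' "$f" 2>/dev/null)
        rpm -q "$p" > /dev/null 2>&1 || echo "not installed: $f"
    done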