This page describes the necessary steps to build RPM software packages on the login nodes fjorm or thul.
Warning: campos-dacapo-python provides a python interface to the campos-dacapo Fortran code, through campos-ase2. On a 64-bit machine campos-ase2 works only with python <= 2.4, due to the unmaintained python-numeric package. Users on modern systems can instead use the campos-ase3 package and skip the installation of any packages related to campos-ase2.
Warning: on 64-bit machines el5 often installs only 32-bit versions of packages. Please verify that the 64-bit versions are installed; otherwise run the corresponding yum install package again.
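A quick way to check which architectures of a package are present (glibc is just an example; substitute the package in question):
rpm -q --queryformat '%{NAME}-%{VERSION}-%{RELEASE}.%{ARCH}\n' glibc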
Contents
- Configure external repositories
- Configure rpmbuild
- Install external packages
- It's time to build custom RPMS
- compilers and tools
- openmpi
- acml
- goto
- atlas
- rasmol
- cblas
- python-setuptools
- python-nose
- numpy
- gnuplot-py
- python-numeric
- netcdf
- ScientificPython 2.6.2
- ScientificPython 2.8 or later
- python-docutils
- pytz
- python-dateutil
- python-matplotlib
- campos-ase2
- fftw2
- campos-dacapo-pseudopotentials
- campos-dacapo-python
- campos-dacapo
- fftw3
- scipy
- povray
- babel
- python-jinja2
- python-pygments
- python-sphinx
- auctex
- campos-ase3
- blacs
- scalapack
- PDSYEVRnew
- campos-gpaw-setups
- campos-gpaw
- Asap
- abinit
- siesta
- espresso
- yambo
- elk
- vtk
- octopus
- TAU
- vasp
- exciting
- fleur
- gulp
- Testing packages
Configure external repositories
As root:
create yum repository definitions (do not enable them):
# atrpms
echo '[atrpms]' > /etc/yum.repos.d/atrpms.repo
echo 'name=CentOS $releasever - $basearch - ATrpms' >> /etc/yum.repos.d/atrpms.repo
echo 'baseurl=http://dl.atrpms.net/el$releasever-$basearch/atrpms/stable' >> /etc/yum.repos.d/atrpms.repo
echo '#baseurl=http://mirrors.ircam.fr/pub/atrpms/el$releasever-$basearch/atrpms/stable' >> /etc/yum.repos.d/atrpms.repo
echo 'gpgkey=http://ATrpms.net/RPM-GPG-KEY.atrpms' >> /etc/yum.repos.d/atrpms.repo
echo 'gpgcheck=1' >> /etc/yum.repos.d/atrpms.repo
echo 'enabled=0' >> /etc/yum.repos.d/atrpms.repo
# epel
echo '[epel]' > /etc/yum.repos.d/epel.repo
echo 'name=CentOS $releasever - $basearch - EPEL' >> /etc/yum.repos.d/epel.repo
echo 'baseurl=http://download.fedora.redhat.com/pub/epel/$releasever/$basearch' >> /etc/yum.repos.d/epel.repo
echo 'gpgkey=http://download.fedora.redhat.com/pub/epel/RPM-GPG-KEY-EPEL' >> /etc/yum.repos.d/epel.repo
echo 'gpgcheck=1' >> /etc/yum.repos.d/epel.repo
echo 'enabled=0' >> /etc/yum.repos.d/epel.repo
install, as root:
yum install yum-utils wget
# /var directories must be created
yum search --enablerepo=atrpms arpack-devel
yum search --enablerepo=epel jmol
Configure rpmbuild
create and retrieve the rpmbuild project (as rpmbuild user):
cd
svn co https://svn.fysik.dtu.dk/projects/rpmbuild/trunk rpmbuild
include the rpmbuild environment configuration script in ~/.bashrc:
copy the script:
cp ~/rpmbuild/SOURCES/.bashrc_rpmbuild ~/
and add the following to ~/.bashrc:
if [ -r "${HOME}/.bashrc_rpmbuild" ]; then
    . ${HOME}/.bashrc_rpmbuild
fi
Note: on Niflheim the variables:
export FYS_PLATFORM=Intel-Nehalem-el5 # thul
export FYS_PLATFORM=AMD-Opteron-el5 # fjorm
will be set automatically by /home/camp/modulefiles.sh if the environment-modules package is installed.
create temporary directory:
mkdir -p /scratch/$USER
apply settings from ~/.bashrc by:
. ~/.bashrc
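A quick sanity check that the environment is in place (assuming .bashrc_rpmbuild sets the FYS_PLATFORM and disttag variables used throughout this page):
echo ${FYS_PLATFORM} ${disttag}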
use the following ~rpmbuild/.rpmmacros:
cp ~/rpmbuild/SOURCES/.rpmmacros ~/
create directories:
mkdir -p ~/${FYS_PLATFORM}/{RPMS,SRPMS,BUILD}
mkdir -p ~/${FYS_PLATFORM}/{SPECS,SOURCES} # needed only by openmpi
Install external packages
download official packages, as rpmbuild:
cd ~/${FYS_PLATFORM}/RPMS
mkdir external; cd external
# packages from other distributions
yumdownloader --resolve texlive-latex emacs-auctex tex-preview
# el5 packages
yumdownloader --resolve emacs binutils-devel glib-devel libstdc++-devel
yumdownloader --resolve gcc-gfortran blas-devel lapack-devel python-devel
yumdownloader --resolve gnuplot libXi-devel xorg-x11-fonts-100dpi pexpect tetex-latex tkinter qt-devel
yumdownloader --resolve openmpi openmpi-devel openmpi-libs compat-dapl libibverbs librdmacm openib
yumdownloader --resolve pygtk2-devel gtk2-devel tk-devel agg ghostscript libtiff-devel xterm
yumdownloader --resolve libX11-devel libXext-devel openmotif openmotif-devel gd-devel libXpm-devel
yumdownloader --resolve gcc43-c++ gcc43-gfortran
yum localinstall * # as root
download atrpms packages, as rpmbuild (vtk-python is currently unavailable 16 Apr 2009):
cd ~/${FYS_PLATFORM}/RPMS/external
yumdownloader --resolve --enablerepo=atrpms vtk-python arpack-devel graphviz
wget http://ATrpms.net/RPM-GPG-KEY.atrpms
rpm --import RPM-GPG-KEY.atrpms # as root
yum localinstall * # as root
download the packages from epel, as rpmbuild:
cd ~/${FYS_PLATFORM}/RPMS/external
yumdownloader --resolve --enablerepo=epel jmol gsl-devel python-lxml
yumdownloader --resolve --enablerepo=epel environment-modules suitesparse-devel
wget http://download.fedora.redhat.com/pub/epel/RPM-GPG-KEY-EPEL
rpm --import RPM-GPG-KEY-EPEL # as root
yum localinstall * # as root
on Niflheim only: remove default openmpi:
yum remove openmpi openmpi-libs
edit /etc/yum.conf so it contains:
exclude=netcdf-* netcdf3-* fftw-* fftw2-* fftw3-* python-numeric openmpi-* libgfortran4* gcc4* libstdc++4* libgomp-*
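To confirm that the exclude line is active:
grep '^exclude=' /etc/yum.conf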
It's time to build custom RPMS
compilers and tools
open64
Install the Open64 Compiler Suite RPM (on the login and compute nodes), and deploy the module file under ~rpmbuild/modulefiles/${FYS_PLATFORM}/open64. Install using the --relocate option:
rpm -ivh --relocate "/opt/open64"="/opt/open64/4.2.1" open64-4.2.1-0.x86_64.rpm
The module file is available as svn checkout at https://svn.fysik.dtu.dk/projects/modulefiles/trunk. Note: the module file should contain at least:
prepend-path PATH /opt/open64/4.2.1/lib/gcc-lib/x86_64-open64-linux/4.2.1
prepend-path PATH /opt/open64/4.2.1/bin
prepend-path LD_LIBRARY_PATH /opt/open64/4.2.1/lib/gcc-lib/x86_64-open64-linux/4.2.1
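After deploying the module file, a quick smoke test (assuming the Open64 C driver opencc is on PATH via the module):
module load open64
opencc -v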
g95
Install the g95 RPM (on the login and compute nodes), and deploy the module file under ~rpmbuild/modulefiles/${FYS_PLATFORM}/g95. Install using the --relocate option:
rpm -ivh --relocate "/usr/local"="/opt/g95/0.91" g95-0.91-4.x86_64.rpm
The module file is available as svn checkout at https://svn.fysik.dtu.dk/projects/modulefiles/trunk. Note: the module file should contain at least (do not add the lib directory to LD_LIBRARY_PATH):
prepend-path PATH /opt/g95/0.91/lib/gcc-lib/x86_64-unknown-linux-gnu/4.0.3
prepend-path PATH /opt/g95/0.91/bin
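After deploying the module file, a quick smoke test:
module load g95
g95 --version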
intel
Install the icc/ifort compilers (only on the login nodes; packages are available under /home/data/Intel with restricted access). Install (using rpm -ivh) only the following packages from l_cproc_p_11.1.046_intel64:
intel-cproc046-11.1-1.x86_64.rpm
intel-cproidb046-11.1-1.x86_64.rpm
intel-cprolib046-11.1-1.x86_64.rpm
intel-cprolibdev046-11.1-1.x86_64.rpm
intel-cprotbblib046-11.1-1.noarch.rpm
intel-cprotbblibdev046-11.1-1.noarch.rpm
intel-cprocsdk046-11.1-1.noarch.rpm
intel-cproidbsdk046-11.1-1.noarch.rpm
and from l_cprof_p_11.1.046_intel64:
intel-cprof046-11.1-1.x86_64.rpm
intel-cproflib046-11.1-1.x86_64.rpm
intel-cprofsdk046-11.1-1.noarch.rpm
then enable them (only on thul):
. /opt/intel/Compiler/11.1/046/bin/intel64/ifortvars_intel64.sh
. /opt/intel/Compiler/11.1/046/bin/intel64/iccvars_intel64.sh
Build intel compatibility packages (only on thul):
cd ~/rpmbuild/SPECS
rpmbuild -bb ${modules_usage} intel-redist.spec
Note: do not install the RPMS generated in the last step on the login node; they need to be installed only on the compute nodes. On the login node only the module file needs to be deployed under ~rpmbuild/modulefiles, as rpmbuild. Note that this directory is available as an svn checkout at https://svn.fysik.dtu.dk/projects/modulefiles/trunk. Here is what needs to be done for a new version of the package:
mkdir ~rpmbuild/modulefiles/${FYS_PLATFORM}/ifort
cp /scratch/rpmbuild/intel-11.1.046/11.1.046-1.intel64.${disttag} ~rpmbuild/modulefiles/${FYS_PLATFORM}/ifort
cd ~rpmbuild/modulefiles/${FYS_PLATFORM}/ifort
ln -s 11.1.046-1.intel64.${disttag} 11.1-1
Note: the above module file should contain at least:
prepend-path LD_LIBRARY_PATH /opt/intel/Compiler/11.1/046/lib/intel64
prepend-path LD_LIBRARY_PATH /opt/intel/Compiler/11.1/046/idb/lib/intel64
prepend-path PATH /opt/intel/Compiler/11.1/046/bin/intel64
prepend-path MANPATH :
prepend-path MANPATH /opt/intel/Compiler/11.1/046/man
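A quick check that the compilers resolve through the module (using the 11.1-1 link created above):
module load ifort/11.1-1
which icc ifort
ifort -V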
Note: intel changes packaging quite often; here are the installation instructions for the 10.1.015 compilers:
install the RPMS (on the login and compute nodes) found in l_cc_p_10.1.015_intel64.tar.gz and l_fc_p_10.1.015_intel64.tar.gz manually (using rpm -ivh):
intel-icce101015-10.1.015-1.em64t.rpm
intel-iidbe101015-10.1.015-1.em64t.rpm
intel-isubhe101015-10.1.015-1.em64t.rpm
intel-iforte101015-10.1.015-1.em64t.rpm
replace the installation path in the scripts (action necessary only on the login node):
files=`grep "<INSTALLDIR>" -r /opt/intel/cce/ | cut -d":" -f 1 | uniq`
for file in $files; do
    echo $file
    sed -i 's#<INSTALLDIR>#/opt/intel/cce/10.1.015#g' $file
done
files=`grep "<INSTALLDIR>" -r /opt/intel/fce/ | cut -d":" -f 1 | uniq`
for file in $files; do
    echo $file
    sed -i 's#<INSTALLDIR>#/opt/intel/fce/10.1.015#g' $file
done
files=`grep "<INSTALLDIR>" -r /opt/intel/idbe/ | cut -d":" -f 1 | uniq`
for file in $files; do
    echo $file
    sed -i 's#<INSTALLDIR>#/opt/intel/idbe/10.1.015#g' $file
done
provide the module file ~rpmbuild/modulefiles/${FYS_PLATFORM}/ifort/10.1.015-1.intel64.${disttag}:
prepend-path LD_LIBRARY_PATH /opt/intel/cce/10.1.015/lib
prepend-path LD_LIBRARY_PATH /opt/intel/fce/10.1.015/lib
prepend-path PATH /opt/intel/cce/10.1.015/bin
prepend-path PATH /opt/intel/fce/10.1.015/bin
prepend-path PATH /opt/intel/idbe/10.1.015/bin
prepend-path MANPATH :
prepend-path MANPATH /opt/intel/cce/10.1.015/man
prepend-path MANPATH /opt/intel/fce/10.1.015/man
prepend-path MANPATH /opt/intel/idbe/10.1.015/man
and the link:
cd ~rpmbuild/modulefiles/${FYS_PLATFORM}/ifort
ln -s 10.1.015-1.intel64.${disttag} 10.1-1
intel mkl
Install mkl, and build mkl compatibility package (only on thul):
cd ~/rpmbuild/SOURCES
export mkl=10.1.3.027
mkdir ${mkl}
cp -rp /opt/intel/mkl/${mkl}/lib ${mkl}
cp -rp /opt/intel/mkl/${mkl}/doc ${mkl}
tar zcf intel-redist-mkl-${mkl}.tar.gz ${mkl}
cd ~/rpmbuild/SPECS
rpmbuild -bb --with version1=1 --with version2=3 --with version3=027 ${modules_usage} intel-redist-mkl.spec
Note: do not install the resulting RPM on the login node; it needs to be installed only on the compute nodes. On the login node only the module file needs to be deployed under ~rpmbuild/modulefiles, as rpmbuild. Note that this directory is available as an svn checkout at https://svn.fysik.dtu.dk/projects/modulefiles/trunk. Here is what needs to be done for a new version of the package:
mkdir ~rpmbuild/modulefiles/${FYS_PLATFORM}/mkl
cp /tmp/intel-mkl-${mkl}/${mkl}-1.${disttag}.em64t ~rpmbuild/modulefiles/${FYS_PLATFORM}/mkl
cd ~rpmbuild/modulefiles/${FYS_PLATFORM}/mkl
ln -s ${mkl}-1.${disttag}.em64t 10.1p-027
Note: the above module file should contain at least:
prepend-path LD_LIBRARY_PATH /opt/intel/mkl/10.1.3.027/lib/em64t
Moreover: intel-*intel64* RPMS need to be installed with --nodeps, so move them into a special directory:
mkdir -p ~/${FYS_PLATFORM}/RPMS/nodeps
mv ~/${FYS_PLATFORM}/RPMS/intel-*intel64* ~/${FYS_PLATFORM}/RPMS/nodeps
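On the compute nodes they can then be installed along these lines (a sketch):
rpm -ivh --nodeps ~/${FYS_PLATFORM}/RPMS/nodeps/intel-*intel64*.rpm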
Build mkl/fftw (as root):
export mkl=10.1.3.027
. /opt/intel/Compiler/11.0/083/bin/intel64/ifortvars_intel64.sh
cd /opt/intel/mkl/${mkl}/interfaces/fftw2xf
make libem64t
cd /opt/intel/mkl/${mkl}/lib/em64t
ln -s libfftw2xf_intel.a libfftw.a
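A quick check that the interface library was built and the link is in place:
ls -l /opt/intel/mkl/${mkl}/lib/em64t/libfftw2xf_intel.a /opt/intel/mkl/${mkl}/lib/em64t/libfftw.a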
Log out and log in as rpmbuild.
openmpi
Build a custom openmpi, using torque support:
wget http://www.open-mpi.org/software/ompi/v1.3/downloads/openmpi-1.3.3.tar.bz2 \
    -O ~/rpmbuild/SOURCES/openmpi-1.3.3.tar.bz2
sh ./buildrpm-1.3.3-1.gfortran.sh ../SOURCES/openmpi-1.3.3.tar.bz2 2>&1 | tee buildrpm-1.3.3-1.gfortran.sh.log.${FYS_PLATFORM}
sh ./buildrpm-1.3.3-1.gfortran43.sh ../SOURCES/openmpi-1.3.3.tar.bz2 2>&1 | tee buildrpm-1.3.3-1.gfortran43.sh.log.${FYS_PLATFORM}
sh ./buildrpm-1.3.3-1.pathscale.sh ../SOURCES/openmpi-1.3.3.tar.bz2 2>&1 | tee buildrpm-1.3.3-1.pathscale.sh.log.${FYS_PLATFORM}
sh ./buildrpm-1.3.3-1.ifort.sh ../SOURCES/openmpi-1.3.3.tar.bz2 2>&1 | tee buildrpm-1.3.3-1.ifort.sh.log.${FYS_PLATFORM} # thul only
module load g95
sh ./buildrpm-1.3.3-1.g95.sh ../SOURCES/openmpi-1.3.3.tar.bz2 2>&1 | tee buildrpm-1.3.3-1.g95.sh.log.${FYS_PLATFORM} # fjorm only
module unload g95
module load open64
# 24 June 2009: build fails on Nehalem with *configure: error: TM support requested but not found. Aborting*
sh ./buildrpm-1.3.3-1.open64.sh ../SOURCES/openmpi-1.3.3.tar.bz2 2>&1 | tee buildrpm-1.3.3-1.open64.sh.log.${FYS_PLATFORM}
module unload open64
Note: intel openmpi needs to be installed ignoring dependencies:
rpm -ivh --nodeps --oldpackage ~rpmbuild/${FYS_PLATFORM}/RPMS/openmpi-1.3.3-1.${disttag}.ifort.11.1.x86_64.rpm
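To confirm that torque (TM) support made it into a given build, a quick check after loading the corresponding openmpi module:
ompi_info | grep -i tm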
If scripts containing ALL build/install/uninstall commands (global_install.sh and global_uninstall.sh) need to be created, then every time an RPM is successfully built, do:
grep -v "#\!" install.sh >> ~/${FYS_PLATFORM}/global_install.sh
cat uninstall.sh ~/${FYS_PLATFORM}/global_uninstall.sh | grep -v "#\!" >> ~/${FYS_PLATFORM}/global_uninstall.sh.tmp && mv -f ~/${FYS_PLATFORM}/global_uninstall.sh.tmp ~/${FYS_PLATFORM}/global_uninstall.sh
# ignore "cat: .../global_uninstall.sh: No such ..." error when running first time
Note that global_uninstall.sh won't remove the built RPM files; it will just uninstall the packages.
acml
The AMD Core Math Library contains optimized BLAS and LAPACK libraries for AMD Opteron processors (they also work on Intel), and is available for download after registration.
Build the following RPMS:
. fys.set.Niflheim.gfortran.4.1.2.sh . fys.set.blas.acml.4.0.1.sh
#
. fys.set.Niflheim.pathscale.3.2.sh . fys.set.blas.acml.4.0.1.sh
#
. fys.set.Niflheim.gfortran43.4.3.2.sh . fys.set.blas.acml.4.3.0.sh
#
. fys.set.Niflheim.gfortran43.4.3.2.sh . fys.set.blas.acml.4.3.0_mp.sh
#
. fys.set.Niflheim.open64.4.2.1.sh . fys.set.blas.acml.4.3.0.sh
Note: problems with dgemm in acml 4.1.0 and 4.2.0 have been reported. Moreover, campos-dacapo built with acml-4-1-0-pathscale-64bit.tgz fails with "ZINSC 2 returned info= 2" for the following job:
#!/usr/bin/env python
from Dacapo import Calculator
from ASE.IO.xyz import ReadXYZ
from ASE.Dynamics.Langevin import Langevin
prefix = 'D'
atoms = ReadXYZ('32H2O.xyz')
L = 9.8553729
atoms.SetUnitCell([L, L, L], fix=True)
atoms.SetBoundaryConditions(True)
r = 1
atoms = atoms.Repeat([r, r, r])
n = 48 * r
calc = Calculator(nbands=128 * r**3, planewavecutoff=350, densitycutoff=500, xc='PBE')
atoms.SetCalculator(calc)
from time import time
atoms.GetPotentialEnergy()
pos = atoms[0].GetCartesianPosition()
atoms[0].SetCartesianPosition(pos + 0.005)
t0 = time()
atoms.GetPotentialEnergy()
print time() - t0
Download 32H2O.xyz
goto
GotoBLAS contains optimized BLAS libraries, and is available to the academic community after registration. It does not work on Intel Nehalem.
Download GotoBLAS-1.26.tar.gz to ~/rpmbuild/SOURCES. The default build process searches your $PATH for available compilers in the priority order PathScale, PGI, Intel, gfortran, g95 and g77, so the compilers must be set explicitly. Build RPMS (on the target compute node!) using the following command:
. fys.set.Niflheim.gfortran.4.1.2.sh . fys.build.goto.sh
#
. fys.set.Niflheim.gfortran43.4.3.2.sh . fys.build.goto.sh
#
. fys.set.Niflheim.pathscale.3.2.sh . fys.build.goto.sh
#
. fys.set.Niflheim.open64.4.2.1.sh . fys.build.goto.sh
Note: the 1.26 version fails on Nehalem with:
../../../param.h:1195:21: error: division by zero in #if
atlas
Build the following RPMS:
. fys.set.Niflheim.gfortran.4.1.2.sh . fys.build.atlas.sh
#
. fys.set.Niflheim.gfortran43.4.3.2.sh . fys.build.atlas.sh
Note: the 3.8.3 version fails on Opteron with open64 with:
/scratch/rpmbuild/ATLAS/RpmObjs/..//CONFIG/src/backend/comptestC.c:5: undefined reference to `__pathscale_malloc_alg'
rasmol
Required by campos-ase2.
Note that rasmol is built for the 32-bit architecture, so make sure that your system fulfills the "BuildRequires" for both 64- and 32-bit:
rpmbuild --nobuild RasMol.spec
Build the following RPMS:
. fys.build.RasMol.2.7.3.sh
cblas
Required by campos-ase2.
On a custom system build the following RPMS:
. fys.set.gfortran.default.sh . fys.set.blas.default.sh . fys.build.cblas.sh
On Niflheim build the following RPMS:
. fys.set.Niflheim.gfortran.4.1.2.sh . fys.set.blas.acml.4.0.1.sh . fys.build.cblas.sh
python-setuptools
Build the following RPMS:
. fys.build.setuptools.0.6c9.sh
If using modules:
module load python-setuptools
else:
. /etc/profile.d/python-setuptools.sh
python-nose
Build the following RPMS:
. fys.build.nose.0.10.4.sh
If using modules:
module load python-nose
else:
. /etc/profile.d/python-nose.sh
numpy
On a custom system build the following RPMS:
. fys.set.gfortran.default.sh . fys.set.blas.default.sh . fys.set.lapack.default.sh . fys.set.Niflheim.cblas.2.23.3.sh . fys.build.numpy.1.3.0.sh
On Niflheim build the following RPMS:
. fys.set.Niflheim.numpy.1.3.0.gfortran.4.1.2.acml.4.0.1.cblas.none.sh . fys.build.numpy.1.3.0.sh
# only on thul
. fys.set.Niflheim.numpy.1.3.0.gfortran.4.1.2.mkl.10.1.3.027.cblas.mkl.10.1.3.027.sh . fys.build.numpy.1.3.0.sh
# building using cblas/acml causes the test below to fail
. fys.set.Niflheim.numpy.1.3.0.gfortran.4.1.2.acml.4.0.1.cblas.2.23.3.sh . fys.build.numpy.1.3.0.sh
After installing numpy make a very rough check:
if using modules:
module load ${blas}-${compiler}64/${blas_version}-1.${disttag}
module load cblas-${blas}-${compiler}64/2.23.3-1.${disttag}.${compiler}.${compiler_version}.${blas}.${blas_version}
module load numpy/1.3.0-1.${disttag}.${compiler}.${compiler_version}.python${python_version}.${blas}.${blas_version}.${lapack}.${lapack_version}
python -c "import numpy; from numpy.core.multiarray import dot; b = numpy.ones(13, numpy.complex); dot(b, b)"
python -c "import numpy; numpy.test()"
gnuplot-py
Required by campos-ase2.
Build the following RPMS:
. fys.build.gnuplot-py.1.8.sh
If using modules:
module load gnuplot-py
else:
. /etc/profile.d/gnuplot-py.sh
python-numeric
Required by campos-ase2.
Warning: on a 64-bit machine it works only with python <= 2.4.
We must install the 24.2 version, while keeping the default version:
cd ~/${FYS_PLATFORM}/RPMS/external
rpm -e --nodeps python-numeric # as root
yumdownloader --resolve --disableexcludes=main python-numeric
cd ~/rpmbuild/SPECS
On a custom system build the following RPMS:
. fys.set.gfortran.default.sh . fys.set.blas.default.sh . fys.set.lapack.default.sh . fys.set.Niflheim.cblas.2.23.3.sh . fys.build.Numeric.24.2.sh
On Niflheim build the following RPMS:
. fys.set.Niflheim.Numeric.24.2.gfortran.4.1.2.acml.4.0.1.cblas.2.23.3.sh . fys.build.Numeric.24.2.sh
Note: (16 Apr 2009) currently Numeric's test.py results in (we ignore this error):
glibc detected *** python: free(): invalid next size (normal): 0x09aee970 ***
After installing python-numeric make a very rough check:
if using modules:
module load ${blas}-${compiler}64/${blas_version}-1.${disttag}
module load cblas-${blas}-${compiler}64/2.23.3-1.${disttag}.${compiler}.${compiler_version}.${blas}.${blas_version}
module load python-numeric/24.2-4.${disttag}.${compiler}.${compiler_version}.python${python_version}.${blas}.${blas_version}.${lapack}.${lapack_version}
python -c "import lapack_lite"
ldd `rpm -ql python-numeric-24.2-4.${disttag}.${compiler}.${compiler_version}.python${python_version}.${blas}.${blas_version}.${lapack}.${lapack_version} | grep lapack_lite.so`
ldd `rpm -ql python-numeric-24.2-4.${disttag}.${compiler}.${compiler_version}.python${python_version}.${blas}.${blas_version}.${lapack}.${lapack_version} | grep _dotblas.so`
else:
. /etc/profile.d/python-numeric.sh
python -c "import lapack_lite"
ldd `rpm -ql python-numeric | grep lapack_lite.so`
ldd `rpm -ql python-numeric | grep _dotblas.so`
and reinstall the default version:
rpm -ivh --oldpackage ~/${FYS_PLATFORM}/RPMS/external/python-numeric-23.*.rpm
netcdf
NetCDF (network Common Data Form) is an interface for array-oriented data access and a library that provides an implementation of the interface. The NetCDF library also defines a machine-independent format for representing scientific data.
We prefer version 3.6.1 or higher (note that version 3.6.2 has the Fortran interface in a separate libnetcdff.a).
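For reference, a minimal sketch of linking a Fortran program against such a netcdf (myprog.f90 is a placeholder; netcdf_includedir and netcdf_libdir stand for the installed include and lib directories, as used elsewhere on this page):
gfortran myprog.f90 -I${netcdf_includedir} -L${netcdf_libdir} -lnetcdff -lnetcdf -o myprog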
On a custom system build the following RPMS:
. fys.set.gfortran.default.sh . fys.build.netcdf4.4.0.1.sh
On Niflheim build the following RPMS:
. fys.set.Niflheim.gfortran.4.1.2.sh . fys.set.Niflheim.netcdf4.4.0.1.sh . fys.build.netcdf4.4.0.1.sh
#
. fys.set.Niflheim.gfortran43.4.3.2.sh . fys.set.Niflheim.netcdf4.4.0.1.sh . fys.build.netcdf4.4.0.1.sh
# only on fjorm
. fys.set.Niflheim.pathscale.3.2.sh . fys.set.Niflheim.netcdf4.4.0.1.sh . fys.build.netcdf4.4.0.1.sh
# only on thul
. fys.set.Niflheim.ifort.11.0.sh . fys.set.Niflheim.netcdf4.4.0.1.sh . fys.build.netcdf4.4.0.1.sh
# only on fjorm
. fys.set.Niflheim.open64.4.2.1.sh . fys.set.Niflheim.netcdf4.4.0.1.sh . fys.build.netcdf4.4.0.1.sh
ScientificPython 2.6.2
Warning: version 2.6.2 is required by campos-ase2; later versions are needed by campos-ase3.
On a custom system build the following RPMS:
. fys.set.ScientificPython.2.6.2.default.sh . fys.build.ScientificPython.sh
On Niflheim build the following RPMS:
. fys.set.Niflheim.ScientificPython.2.6.2.gfortran.4.1.2.openmpi.1.3.3.numeric.sh . fys.build.ScientificPython.sh
ScientificPython 2.8 or later
Required by campos-ase3.
On a custom system build the following RPMS:
. fys.set.ScientificPython.2.8.default.sh . fys.build.ScientificPython.sh
On Niflheim build the following RPMS:
. fys.set.Niflheim.ScientificPython.2.8.gfortran.4.1.2.serial_version.only.numpy.sh . fys.build.ScientificPython.sh
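After installing, a very rough import check (Scientific.IO.NetCDF also exercises the netcdf linkage):
python -c "import Scientific.IO.NetCDF"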
python-docutils
Build the following RPMS:
. fys.build.docutils.0.5.sh
If using modules:
module load python-docutils
else:
. /etc/profile.d/python-docutils.sh
pytz
Build the following RPMS:
. fys.build.pytz.2008g.sh
If using modules:
module load pytz
else:
. /etc/profile.d/pytz.sh
python-dateutil
Build the following RPMS:
. fys.build.python-dateutil.1.4.1.sh
If using modules:
module load python-dateutil
else:
. /etc/profile.d/python-dateutil.sh
python-matplotlib
On a custom system build the following RPMS:
. fys.build.python-matplotlib.0.99.1.2.sh
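After installing, a very rough check:
python -c "import matplotlib; print matplotlib.__version__"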
campos-ase2
Build the following RPMS:
. fys.build.ase2.2.3.13.sh
fftw2
We use version 2.1.5.
On a custom system build the following RPMS:
. fys.set.fftw2.default.sh . fys.build.fftw2.2.1.5.sh
On Niflheim build the following RPMS:
. fys.set.Niflheim.gfortran.4.1.2.sh . fys.build.fftw2.2.1.5.sh
#
. fys.set.Niflheim.gfortran43.4.3.2.sh . fys.build.fftw2.2.1.5.sh
#
. fys.set.Niflheim.pathscale.3.2.sh . fys.build.fftw2.2.1.5.sh
# only on fjorm
. fys.set.Niflheim.open64.4.2.1.sh . fys.build.fftw2.2.1.5.sh
# Note: 24 June 2009: ifort fails to build static fftw2
. fys.set.Niflheim.ifort.11.0.sh . fys.build.fftw2.2.1.5.sh
campos-dacapo-pseudopotentials
campos-dacapo-pseudopotentials-1.tar.gz
Build the following RPMS:
. fys.build.dacapo-pseudopotentials.1.sh
campos-dacapo-python
Requires campos-ase2.
Build the following RPMS:
. fys.build.dacapo-python.0.9.4.sh
campos-dacapo
Note that FC's fftw version 3 is incompatible with dacapo.
If you build only a serial version, add "--without parallel" to the rpmbuild options. Another useful option is "--without default_version", which installs neither /etc/profile.d scripts nor modules under /opt/modulefiles, allowing the package to be relocatable.
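For example, a serial, relocatable build could be invoked along these lines (a sketch; assuming the spec file is named campos-dacapo.spec):
rpmbuild -bb --without parallel --without default_version campos-dacapo.spec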
Warning: currently (9 Oct 2008) packaging of openmpi on FC8/FC9 requires the following link to be made:
ln -s /usr/include/openmpi/1.2.4-gcc/32/mpif-config.h /usr/include/openmpi/1.2.4-gcc/mpif-config.h
Warning: currently (9 Oct 2008) the default build fails for EL4 due to double-underscored symbols in the netcdf from ATrpms; however a custom build (i.e. building your own netcdf) should work.
On a custom system build the following RPMS:
. fys.set.dacapo.default.sh . fys.set.atlas.default.sh . fys.build.dacapo.sh
# optional build using netlib blas/lapack
. fys.set.dacapo.default.sh . fys.build.dacapo.sh
On Niflheim build the following RPMS:
. fys.set.Niflheim.dacapo.2.7.16.gfortran43.4.3.2.openmpi.1.3.3.acml.4.3.0.acml.4.3.0.fftw2.sh . fys.build.dacapo.sh
# only on fjorm
. fys.set.Niflheim.dacapo.2.7.16.open64.4.2.1.openmpi.1.3.3.acml.4.3.0.acml.4.3.0.fftw2.sh . fys.build.dacapo.sh
# only on thul
. fys.set.Niflheim.dacapo.2.7.16.gfortran43.4.3.2.openmpi.1.3.3.mkl.10.1.3.027.mkl.10.1.3.027.mkl.sh . fys.build.dacapo.sh
# optional
. fys.set.Niflheim.gfortran.4.1.2.sh . fys.set.Niflheim.openmpi.1.3.3.sh . fys.set.Niflheim.netcdf4.4.0.1.sh . fys.set.Niflheim.fftw2.2.1.5.sh . fys.set.blas.default.sh . fys.set.lapack.acml.sh . fys.build.dacapo.sh
# optional
. fys.set.Niflheim.gfortran43.4.3.2.sh . fys.set.Niflheim.openmpi.1.3.3.sh . fys.set.Niflheim.netcdf4.4.0.1.sh . fys.set.Niflheim.fftw2.2.1.5.sh . fys.set.blas.acml.4.3.0.sh . fys.set.lapack.acml.sh . fys.set.Niflheim.goto.1.26.sh . fys.set.lapack.acml.sh . fys.build.dacapo.sh
# optional
. fys.set.Niflheim.open64.4.2.1.sh . fys.set.Niflheim.openmpi.1.3.3.sh . fys.set.Niflheim.netcdf4.4.0.1.sh . fys.set.Niflheim.fftw2.2.1.5.sh . fys.set.blas.acml.4.3.0.sh . fys.set.lapack.acml.sh . fys.set.Niflheim.goto.1.26.sh . fys.set.lapack.acml.sh . fys.build.dacapo.sh
# optional # only on thul # multinode jobs fail
. fys.set.Niflheim.ifort.11.0.sh . fys.set.Niflheim.openmpi.1.3.3.sh . fys.set.Niflheim.netcdf4.4.0.1.sh . fys.set.Niflheim.mkl.10.1.3.027.sh . fys.build.dacapo.sh
fftw3
On a custom system build the following RPMS:
. fys.set.fftw3.default.sh . fys.build.fftw3.3.2.1.sh
On Niflheim build the following RPMS:
. fys.set.Niflheim.gfortran.4.1.2.sh . fys.build.fftw3.3.2.1.sh
#
. fys.set.Niflheim.gfortran43.4.3.2.sh . fys.build.fftw3.3.2.1.sh
scipy
On a custom system build the following RPMS:
. fys.set.gfortran.default.sh . fys.set.blas.default.sh . fys.set.lapack.default.sh . fys.set.fftw3.default.sh . fys.set.Niflheim.cblas.2.23.3.sh . fys.build.scipy.0.7.0.sh
On Niflheim build the following RPMS:
. fys.set.Niflheim.scipy.0.7.0.gfortran.4.1.2.blas.lapack.cblas.2.23.3.sh . fys.build.scipy.0.7.0.sh
# optional - not tested!
. fys.set.Niflheim.scipy.0.7.0.gfortran.4.1.2.acml.4.0.1.cblas.2.23.3.sh . fys.build.scipy.0.7.0.sh
povray
Build the following RPMS:
. fys.build.povray.sh
babel
Build the following RPMS:
. fys.build.Babel.0.9.4.sh
If using modules:
module load babel
else:
. /etc/profile.d/babel.sh
python-jinja2
Build the following RPMS:
. fys.build.Jinja2.2.1.1.sh
If using modules:
module load python-jinja2
else:
. /etc/profile.d/python-jinja2.sh
python-pygments
Build the following RPMS:
. fys.build.Pygments.1.0.sh
If using modules:
module load python-pygments
else:
. /etc/profile.d/python-pygments.sh
python-sphinx
Build the following RPMS:
. fys.build.Sphinx.0.6.1.sh
If using modules:
module load python-sphinx
else:
. /etc/profile.d/python-sphinx.sh
auctex
Build the following RPMS:
cd ~/rpmbuild/SOURCES && wget http://ftp.gnu.org/pub/gnu/auctex/auctex-11.85.tar.gz
cd ~/rpmbuild/SPECS
rpmbuild -bb auctex.spec
campos-ase3
Build the following RPMS:
. fys.build.ase3.3.1.1390.sh
A snapshot package is built with, e.g.:
major_version=3
version1=2
version2=0
version_svn=1066
release=${major_version}.${version1}.${version2}.${version_svn}/1.${disttag}.python${python_version}
rpmbuild -bb --with major_version=${major_version} --with version1=${version1} --with version2=${version2} --with version_svn=${version_svn} --with keep_install=1 \
    ${modules_usage} --with default_version --with system=${rpm_platform} \
    --with python_version=${python_version} \
    --with prefix=/home/camp/rpmbuild/opt/Intel-Nehalem-el5/campos-ase3/${release} campos-ase3.spec
rm /home/camp/rpmbuild/Intel-Nehalem-el5/RPMS/campos-ase3-${major_version}.${version1}.${version2}.${version_svn}-1.${disttag}.python${python_version}.x86_64.rpm
dir=~/Intel-Nehalem-el5/BUILD/campos-ase3-${major_version}.${version1}.${version2}.${version_svn}-1.${disttag}.python${python_version}-root
ln -s ${dir}/etc/modulefiles/campos-ase3/${major_version}.${version1}.${version2}.${version_svn}-1.${disttag}.python${python_version} \
    ~/Intel-Nehalem-el5/modulefiles.testing/campos-ase3
ln -s ${dir}/home/camp/rpmbuild/opt/Intel-Nehalem-el5/campos-ase3/${major_version}.${version1}.${version2}.${version_svn} ~/opt/Intel-Nehalem-el5/campos-ase3
blacs
BLACS is used by ScaLAPACK.
On a custom system build the following RPMS:
. fys.set.gfortran.default.sh . fys.set.openmpi.default.sh . fys.build.blacs.sh
On Niflheim build the following RPMS:
. fys.set.Niflheim.gfortran.4.1.2.sh . fys.set.Niflheim.openmpi.1.3.3.sh . fys.build.blacs.sh
#
. fys.set.Niflheim.gfortran43.4.3.2.sh . fys.set.Niflheim.openmpi.1.3.3.sh . fys.build.blacs.sh
#
. fys.set.Niflheim.pathscale.3.2.sh . fys.set.Niflheim.openmpi.1.3.3.sh . fys.build.blacs.sh
#
. fys.set.Niflheim.open64.4.2.1.sh . fys.set.Niflheim.openmpi.1.3.3.sh . fys.build.blacs.sh
scalapack
ScaLAPACK.
On a custom system build the following RPMS:
. fys.set.gfortran.default.sh . fys.set.openmpi.default.sh . fys.set.blas.default.sh . fys.set.lapack.default.sh . fys.set.Niflheim.blacs.1.1.sh . fys.build.scalapack.sh
On Niflheim build the following RPMS:
. fys.set.Niflheim.gfortran.4.1.2.sh . fys.set.Niflheim.openmpi.1.3.3.sh . fys.set.blas.default.sh . fys.set.lapack.default.sh . fys.set.Niflheim.blacs.1.1.sh . fys.build.scalapack.sh
#
. fys.set.Niflheim.gfortran43.4.3.2.sh . fys.set.Niflheim.openmpi.1.3.3.sh . fys.set.blas.default.sh . fys.set.lapack.default.sh . fys.set.Niflheim.blacs.1.1.sh . fys.build.scalapack.sh
#
. fys.set.Niflheim.gfortran43.4.3.2.sh . fys.set.Niflheim.openmpi.1.3.3.sh . fys.set.blas.acml.4.3.0.sh . fys.set.lapack.acml.sh . fys.set.Niflheim.blacs.1.1.sh . fys.build.scalapack.sh
#
. fys.set.Niflheim.gfortran43.4.3.2.sh . fys.set.Niflheim.openmpi.1.3.3.sh . fys.set.blas.acml.4.3.0.sh . fys.set.lapack.acml.sh . fys.set.Niflheim.goto.1.26.sh . fys.set.lapack.acml.sh . fys.set.Niflheim.blacs.1.1.sh . fys.build.scalapack.sh
#
. fys.set.Niflheim.open64.4.2.1.sh . fys.set.Niflheim.openmpi.1.3.3.sh . fys.set.blas.acml.4.3.0.sh . fys.set.lapack.acml.sh . fys.set.Niflheim.blacs.1.1.sh . fys.build.scalapack.sh
#
. fys.set.Niflheim.gfortran43.4.3.2.sh . fys.set.Niflheim.openmpi.1.3.3.sh . fys.set.Niflheim.atlas.3.8.3.sh . fys.set.Niflheim.blacs.1.1.sh . fys.build.scalapack.sh
PDSYEVRnew
PDSYEVRnew is ScaLAPACK's new parallel MRRR algorithm for computing eigenpairs of large real symmetric or complex Hermitian matrices.
On Niflheim build the following RPMS:
. fys.set.Niflheim.gfortran43.4.3.2.sh . fys.set.Niflheim.openmpi.1.3.3.sh . fys.set.blas.acml.4.3.0.sh . fys.set.lapack.acml.sh . fys.set.Niflheim.blacs.1.1.sh . fys.set.Niflheim.scalapack.1.8.0.sh . fys.build.PDSYEVRnew.sh
#
. fys.set.Niflheim.gfortran43.4.3.2.sh . fys.set.Niflheim.openmpi.1.3.3.sh . fys.set.blas.acml.4.3.0.sh . fys.set.lapack.acml.sh . fys.set.Niflheim.goto.1.26.sh . fys.set.Niflheim.blacs.1.1.sh . fys.set.Niflheim.scalapack.1.8.0.sh . fys.build.PDSYEVRnew.sh
campos-gpaw-setups
Build the following RPMS:
. fys.build.gpaw-setups.0.5.3574.sh
campos-gpaw
On a custom system build the following RPMS:
# March 3 2010: results in "orthogonalization failed" with EPEL's atlas on el5.x86_64
. fys.set.gpaw.default.sh . fys.set.atlas.default.sh . fys.build.gpaw.sh
# optional build using netlib blas/lapack
. fys.set.gpaw.default.sh . fys.build.gpaw.sh
On Niflheim build the following RPMS:
. fys.set.Niflheim.gpaw.0.6.5147.gfortran43.4.3.2.openmpi.1.3.3.acml.4.3.0.acml.4.3.0.sh . fys.build.gpaw.sh
#
. fys.set.Niflheim.gpaw.0.6.5147.gfortran43.4.3.2.openmpi.1.3.3.acml.4.3.0.acml.4.3.0.sh . fys.build.gpaw.sh
compiler=gfortran43 # only on fjorm
compiler_version=4.3.2
compiler_libdir=/usr/lib64
compiler_bindir=/usr/bin
blas=goto
blas_version=1.26
blasdir=/opt/${blas}/${blas_version}/1.${disttag}.${compiler}.${compiler_version}.smp/lib64
lapack=acml
lapack_version=4.3.0
lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib
openmpi_version=1.3.3
ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/lib64
ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/include
ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/bin
blacs=blacs
blacsdir=/opt/blacs/1.1/24.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}/lib64
scalapack=scalapack
scalapackdir=/opt/scalapack/1.8.0/1.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}/lib64
release=1.${disttag_network}.${compiler}.${compiler_version}.python${python_version}.without_ase.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}
!rpmbuild # as rpmbuild above
compiler=gfortran43
compiler_version=4.3.2
compiler_libdir=/usr/lib64
compiler_bindir=/usr/bin
blas=blas
blas_version=3.0.37.el5
blasdir=/usr/lib64
lapack=lapack
lapack_version=3.0.37.el5
lapackdir=/usr/lib64
openmpi_version=1.3.3
ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/lib64
ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/include
ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/bin
blacs=blacs
blacsdir=/opt/blacs/1.1/24.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}/lib64
scalapack=scalapack
scalapackdir=/opt/scalapack/1.8.0/1.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}/lib64
release=1.${disttag_network}.${compiler}.${compiler_version}.python${python_version}.without_ase.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}
!rpmbuild # as rpmbuild above
Note: the open64 compiler fails with:
/usr/bin/ld: /opt/open64/4.2.1/lib/gcc-lib/x86_64-open64-linux/4.2.1/libmv.a(vcos.o): relocation R_X86_64_32S against `a local symbol' can not be used when making a shared object; recompile with -fPIC
/opt/open64/4.2.1/lib/gcc-lib/x86_64-open64-linux/4.2.1/libmv.a: could not read symbols: Bad value
Optional builds:
# default installation using atlas - gpaw-test fails for diagonalization on el5 x86_64 18 Jan 2010
compiler=gfortran
compiler_version=`rpm -q --queryformat='%{RPMTAG_VERSION}\n' libgfortran | head -1`
dir=`rpm -ql libgfortran | grep "libgfortran\.so" | head -1`
compiler_libdir=`dirname ${dir}`
dir=`which gfortran`
compiler_bindir=`dirname ${dir}`
blas=atlas
blas_version=`rpm -q --queryformat='%{RPMTAG_VERSION}\n' atlas | head -1 | sed 's/-/./g'`
dir=`rpm -ql atlas | grep "libatlas\.so" | head -1`
blasdir=`dirname ${dir}`
lapack=lapack
lapack_version=`rpm -q --queryformat='%{RPMTAG_VERSION}\n' atlas | head -1 | sed 's/-/./g'`
lapackdir=${blasdir}
openmpi_version=`rpm -q --queryformat='%{RPMTAG_VERSION}\n' openmpi | head -1`
dir=`rpm -ql openmpi | grep mpiexec | head -1`
ompi_bindir=`dirname ${dir}`
dir=`rpm -ql openmpi-devel | grep "libmpi\.so" | head -1`
if test -z $dir; then dir=`rpm -ql openmpi | grep "libmpi\.so" | head -1`; fi
ompi_libdir=`dirname ${dir}`
dir=`rpm -ql openmpi-devel | grep "mpif\.h" | head -1`
if test -z $dir; then dir=`rpm -ql openmpi | grep "mpif\.h" | head -1`; fi
ompi_includedir=`dirname ${dir}`
blacs=none
scalapack=none
release=1.${disttag_network}.${compiler}.${compiler_version}.python${python_version}.without_ase.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}
!rpmbuild # as rpmbuild above
compiler=pathscale # only on fjorm
# note --with sl_second_underscore=1 is necessary for rpmbuild
compiler_version=3.2
compiler_libdir=/opt/pathscale/lib/3.2
compiler_bindir=/opt/pathscale/bin
blas=goto
blas_version=1.26
blasdir=/opt/${blas}/${blas_version}/1.${disttag}.${compiler}.${compiler_version}.smp/lib64
lapack=acml
lapack_version=4.0.1
lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib
openmpi_version=1.3.3
ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/lib64
ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/include
ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/bin
blacs=blacs
blacsdir=/opt/blacs/1.1/24.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}/lib64
scalapack=scalapack
#scalapackdir=/opt/scalapack/1.8.0/1.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}/lib64
scalapackdir=/opt/scalapack/1.8.0/1.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.acml.4.0.1.${lapack}.${lapack_version}/lib64
release=1.${disttag_network}.${compiler}.${compiler_version}.python${python_version}.without_ase.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}
!rpmbuild # as rpmbuild above
compiler=gfortran43 # problems reported on thul
compiler_version=4.3.2
compiler_libdir=/usr/lib64
compiler_bindir=/usr/bin
blas=atlas
blas_version=3.8.3
blasdir=/opt/${blas}/${blas_version}/1.${disttag}.${compiler}.${compiler_version}/lib64
lapack=atlas
lapack_version=3.8.3
lapackdir=/opt/${lapack}/${lapack_version}/1.${disttag}.${compiler}.${compiler_version}/lib64
openmpi_version=1.3.3
ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/lib64
ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/include
ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/bin
blacs=blacs
blacsdir=/opt/blacs/1.1/24.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}/lib64
scalapack=scalapack
scalapackdir=/opt/scalapack/1.8.0/1.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}/lib64
release=1.${disttag_network}.${compiler}.${compiler_version}.python${python_version}.without_ase.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}
!rpmbuild # as rpmbuild above
compiler=gfortran
compiler_version=4.1.2
compiler_libdir=/usr/lib64
compiler_bindir=/usr/bin
blas=blas
blas_version=3.0.37.el5
blasdir=/usr/lib64
lapack=lapack
lapack_version=3.0.37.el5
lapackdir=/usr/lib64
openmpi_version=1.3.3
ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/lib64
ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/include
ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/bin
blacs=blacs
blacsdir=/opt/blacs/1.1/24.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}/lib64
scalapack=scalapack
scalapackdir=/opt/scalapack/1.8.0/1.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}/lib64
release=1.${disttag_network}.${compiler}.${compiler_version}.python${python_version}.without_ase.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}
!rpmbuild # as rpmbuild above
compiler=gfortran
compiler_version=4.1.2
compiler_libdir=/usr/lib64
compiler_bindir=/usr/bin
blas=acml
blas_version=4.0.1
blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib
lapack=acml
lapack_version=4.0.1
lapackdir=/opt/${blas}/${lapack_version}/${compiler}64/lib
openmpi_version=1.3.3
ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/lib64
ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/include
ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/bin
blacs=blacs
blacsdir=/opt/blacs/1.1/24.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}/lib64
scalapack=scalapack
scalapackdir=/opt/scalapack/1.8.0/1.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}/lib64
release=1.${disttag_network}.${compiler}.${compiler_version}.python${python_version}.without_ase.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}
!rpmbuild # as rpmbuild above
compiler=ifort # fails on thul
compiler_version=11.0
compiler_libdir=/opt/intel/Compiler/11.0/083/lib/intel64
compiler_bindir=/opt/intel/Compiler/11.0/083/bin/intel64
blas=mkl
blas_version=10.1.3.027
blasdir=/opt/intel/${blas}/${blas_version}/lib/em64t
lapack=mkl_lapack
lapack_version=10.1.3.027
lapackdir=${blasdir}
openmpi_version=1.3.3
ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/lib64
ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/include
ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/bin
blacs=mkl
blacsdir=${blasdir}
scalapack=mkl
scalapackdir=${blasdir}
release=1.${disttag_network}.${compiler}.${compiler_version}.python${python_version}.without_ase.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}
!rpmbuild # as rpmbuild above
compiler=gfortran43 # only on thul # fails on thul
compiler_version=4.3.2
compiler_libdir=/usr/lib64
compiler_bindir=/usr/bin
blas=mkl
blas_version=10.1.3.027
blasdir=/opt/intel/${blas}/${blas_version}/lib/em64t
lapack=mkl_lapack
lapack_version=10.1.3.027
lapackdir=${blasdir}
openmpi_version=1.3.3
ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/lib64
ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/include
ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/bin
blacs=mkl
blacsdir=${blasdir}
scalapack=mkl
scalapackdir=${blasdir}
release=1.${disttag_network}.${compiler}.${compiler_version}.python${python_version}.without_ase.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}
!rpmbuild # as rpmbuild above
compiler=gfortran43 # only on thul
compiler_version=4.3.2
compiler_libdir=/usr/lib64
compiler_bindir=/usr/bin
blas=mkl
blas_version=10.2.1.017
blasdir=/opt/intel/${blas}/${blas_version}/lib/em64t
lapack=mkl_lapack
lapack_version=10.2.1.017
lapackdir=${blasdir}
openmpi_version=1.3.3
ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/lib64
ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/include
ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/bin
blacs=mkl
blacsdir=${blasdir}
scalapack=mkl
scalapackdir=${blasdir}
release=1.${disttag_network}.${compiler}.${compiler_version}.python${python_version}.without_ase.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}
!rpmbuild # as rpmbuild above
Asap
On a custom system build the following RPMS:
. fys.set.Asap.default.sh . fys.build.Asap.sh
On Niflheim build the following RPMS:
# only on fjorm
. fys.set.Niflheim.Asap.3.2.6.pathscale.3.2.openmpi.1.3.3.sh . fys.build.Asap.sh
# only on thul
. fys.set.Niflheim.Asap.3.2.6.ifort.11.0.openmpi.1.3.3.sh . fys.build.Asap.sh
abinit
abinit-pseudopotentials-1.tar.gz
abinit pseudopotentials
Build the following RPMS:
. fys.build.abinit-pseudopotentials.1.sh
abinit
abinit is a Density Functional Theory (DFT) package with pseudopotentials and a planewave basis.
On a custom system build the following RPMS:
. fys.set.abinit.5.4.4p.default.sh . fys.set.atlas.default.sh . fys.build.abinit.5.4.4p.sh
On Niflheim build the following RPMS:
. fys.set.Niflheim.Asap.3.2.6.pathscale.3.2.openmpi.1.3.3.sh . fys.build.abinit.5.4.4p.sh
siesta
siesta-pseudopotentials-2.tar.gz
siesta pseudopotentials
Build the following RPMS:
cd ~/rpmbuild/SOURCES && wget --no-check-certificate \
    "https://wiki.fysik.dtu.dk/niflheim/Cluster_software_-_RPMS?action=AttachFile&do=get&target=siesta-pseudopotentials-2.tar.gz" \
    -O siesta-pseudopotentials-2.tar.gz
cd ~/rpmbuild/SPECS
rpmbuild -bb ${modules_usage} --with default_version siesta-pseudopotentials.spec
siesta
siesta (Spanish Initiative for Electronic Simulations with Thousands of Atoms) is both a method and its computer program implementation, used to perform electronic structure calculations and ab initio molecular dynamics simulations of molecules and solids.
Build the following RPMS:
cd ~/rpmbuild/SPECS
major_version=2
version1=0
compiler=gfortran43
compiler_version=4.3.2
compiler_libdir=/usr/lib64
compiler_bindir=/usr/bin
blas=acml
blas_version=4.3.0
blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib
lapack=acml
lapack_version=4.3.0
lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib
netcdf=netcdf4
netcdf_includedir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/include
netcdf_libdir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/lib64
openmpi_version=1.3.3
ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/lib64
ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/include
ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/bin
blacs=blacs
blacsdir=/opt/blacs/1.1/24.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}/lib64
scalapack=scalapack
scalapackdir=/opt/scalapack/1.8.0/1.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}/lib64
release=1.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}
rpmbuild -bb --with major_version=${major_version} --with version1=${version1} \
    --with blas=${blas} --with blas_version=${blas_version} --with blasdir=${blasdir} \
    --with compiler=${compiler} --with compiler_version=${compiler_version} \
    --with compiler_bindir=${compiler_bindir} --with compiler_libdir=${compiler_libdir} \
    --with lapack=${lapack} --with lapack_version=${lapack_version} --with lapackdir=${lapackdir} \
    --with netcdf=${netcdf} --with netcdf_includedir=${netcdf_includedir} --with netcdf_libdir=${netcdf_libdir} \
    --with parallel=1 --with openmpi=openmpi --with openmpi_version=${openmpi_version} \
    --with ompi_libdir=${ompi_libdir} --with ompi_includedir=${ompi_includedir} --with ompi_bindir=${ompi_bindir} \
    --with blacs=${blacs} --with blacsdir=${blacsdir} \
    --with scalapack=${scalapack} --with scalapackdir=${scalapackdir} \
    ${modules_usage} --with default_version=1 \
    --with system=${rpm_platform} \
    --with prefix=/opt/siesta${major_version}/${major_version}.${version1}/${release} siesta.spec
espresso
espresso pseudopotentials
Build the following RPMS:
. fys.build.espresso_pp.sh
espresso
Note March 04 2010: version 4.0.5 does not compile with gfortran >= 4.3 (?)
Note March 04 2010: version 4.1.2 results in Segmentation fault for atom-*in tests and "STOP 1" status for all (?) other tests
On a custom system build the following RPMS:
. fys.set.espresso.4.0.5.default.sh . fys.set.atlas.default.sh . fys.build.espresso.sh
On Niflheim build the following RPMS:
# version used with yambo 3.2.1.426M
. fys.set.Niflheim.espresso.4.0.5.gfortran43.4.3.2.openmpi.1.3.3.acml.4.3.0.acml.4.3.0.sh . fys.build.espresso.sh
yambo
yambo
yambo is a FORTRAN/C code for Many-Body calculations in solid state and molecular physics. Yambo relies on the Kohn-Sham wavefunctions generated by two public DFT codes: abinit and PWscf. This build creates interfaces to abinit and PWscf.
On a custom system build the following RPMS:
. fys.set.yambo.3.2.1.448.default.sh . fys.set.atlas.default.sh . fys.build.yambo.sh
On Niflheim build the following RPMS:
. fys.set.Niflheim.yambo.3.2.1.448.gfortran43.4.3.2.openmpi.1.3.3.acml.4.3.0.acml.4.3.0.sh . fys.build.yambo.sh
# special yambo 3.2.1.426M
. fys.set.Niflheim.yambo.3.2.1.426M.gfortran43.4.3.2.openmpi.1.3.3.acml.4.3.0.acml.4.3.0.sh . fys.build.yambo.sh
elk
elk species
Build the following RPMS:
. fys.build.elk-species.1.0.0.sh
elk
On a custom system build the following RPMS:
. fys.set.gfortran.default.sh . fys.set.atlas.default.sh thread_mode=mp . fys.build.elk.1.0.0.sh
On Niflheim build the following RPMS:
. fys.set.Niflheim.gfortran43.4.3.2.sh . fys.set.blas.acml.4.3.0_mp.sh . fys.set.lapack.acml.sh thread_mode=mp . fys.build.elk.1.0.0.sh
vtk
cmake
Build the following RPMS:
cd ~/rpmbuild/SOURCES && wget http://www.cmake.org/files/v2.6/cmake-2.6.4.tar.gz
cd ~/rpmbuild/SOURCES && wget "http://downloads.sourceforge.net/project/xmlrpc-c/Xmlrpc-c Super Stable/1.06.35/xmlrpc-c-1.06.35.tgz"
cd ~/rpmbuild/SPECS
rpmbuild -bb --with bootstrap --without gui cmake.spec
# install the resulting RPM
rpmbuild -bb xmlrpc-c.spec
# install the resulting RPM
rpm -e cmake
rpmbuild -bb --without gui cmake.spec
# install the resulting RPM
vtkdata
Build the following RPMS:
cd ~/rpmbuild/SOURCES && wget http://www.vtk.org/files/release/5.4/vtkdata-5.4.2.tar.gz
cd ~/rpmbuild/SPECS
rpmbuild -bb --with version1=4 --with version2=2 ${modules_usage} --with default_version=1 \
    --with prefix=/opt/vtkdata/5.4.2/1.${disttag} vtkdata.spec
vtk
Build the following RPMS:
cd ~/rpmbuild/SOURCES && wget http://www.vtk.org/files/release/5.4/vtk-5.4.2.tar.gz
cd ~/rpmbuild/SPECS
compiler=gfortran
compiler_version=4.1.2
compiler_libdir=/usr/lib64
compiler_bindir=/usr/bin
openmpi_version=1.3.3
ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/lib64
ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/include
ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/bin
rpmbuild -bb --without qt4 --with version1=4 --with version2=2 \
    --with compiler=${compiler} --with compiler_version=${compiler_version} \
    --with compiler_bindir=${compiler_bindir} --with compiler_libdir=${compiler_libdir} \
    --with parallel=1 --with openmpi=openmpi --with openmpi_version=${openmpi_version} \
    --with ompi_libdir=${ompi_libdir} --with ompi_includedir=${ompi_includedir} --with ompi_bindir=${ompi_bindir} \
    ${modules_usage} --with default_version=1 \
    --with system=${rpm_platform} \
    --with python_version=${python_version} \
    --with prefix=/opt/vtk/5.4.2/1.${disttag_network}.${compiler}.${compiler_version}.python${python_version}.openmpi.${openmpi_version} vtk.spec
octopus
octopus
Build the following RPMS:
cd ~/rpmbuild/SOURCES && wget "http://www.tddft.org/programs/octopus/down.php?file=3.1.0/octopus-3.1.0.tar.gz"
cd ~/rpmbuild/SPECS
compiler=gfortran43
compiler_version=4.3.2
compiler_libdir=/usr/lib64
compiler_bindir=/usr/bin
blas=acml
blas_version=4.3.0
blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib
lapack=acml
lapack_version=4.3.0
lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib
fftw=fftw3
fftw_libdir=/opt/${fftw}/3.2.1/12.${disttag}.${compiler}.${compiler_version}/lib64
netcdf=netcdf4
netcdf_includedir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/include
netcdf_libdir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/lib64
openmpi_version=1.3.3
ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/lib64
ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/include
ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/bin
gsl_libdir=/usr/lib64
gsl_includedir=/usr/include/gsl
gsl_bindir=/usr/bin
arpack=arpack
arpackdir=/usr/lib64
release=1.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}.${fftw}
rpmbuild -bb --with blas=${blas} --with blas_version=${blas_version} --with blasdir=${blasdir} \
    --with compiler=${compiler} --with compiler_version=${compiler_version} \
    --with compiler_bindir=${compiler_bindir} --with compiler_libdir=${compiler_libdir} \
    --with fftw=${fftw} --with fftw_libdir=${fftw_libdir} \
    --with lapack=${lapack} --with lapack_version=${lapack_version} --with lapackdir=${lapackdir} \
    --with netcdf=${netcdf} --with netcdf_includedir=${netcdf_includedir} --with netcdf_libdir=${netcdf_libdir} \
    --with parallel=1 --with openmpi=openmpi --with openmpi_version=${openmpi_version} \
    --with ompi_libdir=${ompi_libdir} --with ompi_includedir=${ompi_includedir} --with ompi_bindir=${ompi_bindir} \
    --with arpack=${arpack} --with arpackdir=${arpackdir} \
    --with gsl_libdir=${gsl_libdir} --with gsl_includedir=${gsl_includedir} --with gsl_bindir=${gsl_bindir} \
    ${modules_usage} --with default_version=1 \
    --with system=${rpm_platform} \
    --with prefix=/opt/octopus/3.1.0/${release} octopus.spec
TAU
pdtoolkit
Program Database Toolkit is a framework for analyzing source code written in several programming languages and for making rich program knowledge accessible to developers of static and dynamic analysis tools. PDT is required by TAU.
Build the following RPMS:
cd ~/rpmbuild/SOURCES && wget http://www.cs.uoregon.edu/research/paracomp/pdtoolkit/Download/pdt_latest.tar.gz
mv pdt_latest.tar.gz `tar ztf pdt_latest.tar.gz | head -1 | cut -d "/" -f 2`.tar.gz
cd ~/rpmbuild/SPECS
major_version=3
version1=14
version2=1
compiler=gfortran
compiler_version=4.1.2
compiler_libdir=/usr/lib64
compiler_bindir=/usr/bin
rpmbuild -bb --with version1=${version1} --with version2=${version2} \
    --with compiler=${compiler} --with compiler_version=${compiler_version} \
    --with compiler_bindir=${compiler_bindir} --with compiler_libdir=${compiler_libdir} \
    ${modules_usage} --with default_version=1 \
    --with system=${rpm_platform} \
    --with python_version=${python_version} \
    --with prefix=/opt/pdtoolkit/${major_version}.${version1}.${version2}/1.${disttag}.${compiler}.${compiler_version}.python${python_version} pdtoolkit.spec
compiler=gfortran43
compiler_version=4.3.2
compiler_libdir=/usr/lib64
compiler_bindir=/usr/bin
!rpmbuild # as rpmbuild above
tau
TAU Performance System® is a portable profiling and tracing toolkit for performance analysis of parallel programs written in Fortran, C, C++, Java, and Python.
Build the following RPMS:
cd ~/rpmbuild/SOURCES && wget http://www.cs.uoregon.edu/research/paracomp/tau/tauprofile/dist/tau_latest.tar.gz
mv tau_latest.tar.gz `tar ztf tau_latest.tar.gz | head -1 | cut -d "/" -f 1`.tar.gz
cd ~/rpmbuild/SPECS
major_version=2
version1=18
version2=2p4
module load PDTOOLKIT/3.14.1-1.el5.fys.gfortran.4.1.2.python${python_version}
pdt_rootdir=${PDTROOT}
compiler=gfortran
compiler_version=4.1.2
compiler_libdir=/usr/lib64
compiler_bindir=/usr/bin
openmpi_version=1.3.3
ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/lib64
ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/include
ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/bin
release=1.${disttag_network}.${compiler}.${compiler_version}.python${python_version}.openmpi.${openmpi_version}
rpmbuild -bb --with version1=${version1} --with version2=${version2} \
    --with compiler=${compiler} --with compiler_version=${compiler_version} \
    --with compiler_bindir=${compiler_bindir} --with compiler_libdir=${compiler_libdir} \
    --with pdt_rootdir=${pdt_rootdir} \
    --with parallel=1 --with openmpi=openmpi --with openmpi_version=${openmpi_version} \
    --with ompi_libdir=${ompi_libdir} --with ompi_includedir=${ompi_includedir} --with ompi_bindir=${ompi_bindir} \
    ${modules_usage} --with default_version=1 \
    --with system=${rpm_platform} \
    --with python_version=${python_version} \
    --with prefix=/opt/tau/${major_version}.${version1}.${version2}/${release} tau.spec
module load PDTOOLKIT/3.14.1-1.el5.fys.gfortran43.4.3.2.python${python_version}
pdt_rootdir=${PDTROOT}
compiler=gfortran43
compiler_version=4.3.2
compiler_libdir=/usr/lib64
compiler_bindir=/usr/bin
openmpi_version=1.3.3
ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/lib64
ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/include
ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/bin
release=1.${disttag_network}.${compiler}.${compiler_version}.python${python_version}.openmpi.${openmpi_version}
!rpmbuild # as rpmbuild above
vasp
Installed with restricted access under the vasp UNIX group (needs to be created in advance).
vasp potentials
Download potpaw/potcar.date.tar, potpaw_GGA/potcar.date.tar, and potpaw_PBE/potcar.date.tar to ~/rpmbuild/SOURCES. Create links to the downloaded files so that the link names contain the literal string date instead of the actual release dates, for example:
ls -al ~/rpmbuild/SOURCES/potpaw_*potcar.date.tar
lrwxrwxrwx 1 rpmbuild campnone 29 Sep 25 17:15 ../SOURCES/potpaw_GGA_potcar.date.tar -> potpaw_GGA/potcar.06Feb03.tar
lrwxrwxrwx 1 rpmbuild campnone 29 Sep 25 17:14 ../SOURCES/potpaw_PBE_potcar.date.tar -> potpaw_PBE/potcar.05Feb03.tar
lrwxrwxrwx 1 rpmbuild campnone 25 Sep 25 17:14 ../SOURCES/potpaw_potcar.date.tar -> potpaw/potcar.05Feb03.tar
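The links shown above can be created along these lines (the dates are those from the listing; substitute the dates of the tar files actually downloaded):

cd ~/rpmbuild/SOURCES
ln -s potpaw/potcar.05Feb03.tar potpaw_potcar.date.tar
ln -s potpaw_GGA/potcar.06Feb03.tar potpaw_GGA_potcar.date.tar
ln -s potpaw_PBE/potcar.05Feb03.tar potpaw_PBE_potcar.date.tar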
Build the following RPMS:
cd ~/rpmbuild/SPECS
rpmbuild -bb ${modules_usage} --with default_version vasp-potpaw.spec
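To confirm where the potentials were installed, a quick query (the exact package name is an assumption derived from the spec file name):

rpm -qa | grep -i potpaw
rpm -ql `rpm -qa | grep -i potpaw | head -1` | head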
vasp
vasp. Download vasp.5.X.tar.gz and vasp.5.lib.tar.gz to ~/rpmbuild/SOURCES. Optionally, also download vtstcode.tar.gz from http://theory.cm.utexas.edu/vtsttools/downloads/.
Build the following RPMS:
cd ~/rpmbuild/SPECS
major_version=5
version1=2
# Compile with VASP TST library: http://theory.cm.utexas.edu/vtsttools/downloads/
tst=false # or true
if [ "$tst" == "false" ]; then
    vasp_tst='.'
else
    vasp_tst='.tst.'
fi
compiler=gfortran
compiler_version=4.1.2
compiler_libdir=/usr/lib64
compiler_bindir=/usr/bin
blas=acml
blas_version=4.0.1
blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib
lapack=acml
lapack_version=4.0.1
lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib
fftw=fftw3
fftw_libdir=/opt/${fftw}/3.2.1/12.${disttag}.${compiler}.${compiler_version}/lib64
netcdf=netcdf4
netcdf_includedir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/include
netcdf_libdir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/lib64
openmpi_version=1.3.3
ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/lib64
ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/include
ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/bin
blacs=blacs
blacsdir=/opt/blacs/1.1/24.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}/lib64
scalapack=scalapack
#scalapackdir=/opt/scalapack/1.8.0/1.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}/lib64
scalapackdir=/opt/scalapack/1.8.0/1.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.blas.3.0.37.el5.lapack.3.0.37.el5/lib64
release=1.${disttag_network}${vasp_tst}${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}.${fftw}
rpmbuild -bb --with tst=${tst} --with major_version=${major_version} --with version1=${version1} \
 --with blas=${blas} --with blas_version=${blas_version} --with blasdir=${blasdir} \
 --with compiler=${compiler} --with compiler_version=${compiler_version} \
 --with compiler_bindir=${compiler_bindir} --with compiler_libdir=${compiler_libdir} \
 --with fftw=${fftw} --with fftw_libdir=${fftw_libdir} \
 --with lapack=${lapack} --with lapack_version=${lapack_version} --with lapackdir=${lapackdir} \
 --with netcdf=${netcdf} --with netcdf_includedir=${netcdf_includedir} --with netcdf_libdir=${netcdf_libdir} \
 --with parallel=1 --with openmpi=openmpi --with openmpi_version=${openmpi_version} \
 --with ompi_libdir=${ompi_libdir} --with ompi_includedir=${ompi_includedir} --with ompi_bindir=${ompi_bindir} \
 --with blacs=${blacs} --with blacsdir=${blacsdir} \
 --with scalapack=${scalapack} --with scalapackdir=${scalapackdir} \
 ${modules_usage} --with default_version=1 \
 --with system=${rpm_platform} \
 --with prefix=/opt/vasp/${major_version}.${version1}/${release} vasp.spec
compiler=gfortran43 # 25 Sep 2009: segmentation fault when compiling dipol.f90
compiler_version=4.3.2
compiler_libdir=/usr/lib64
compiler_bindir=/usr/bin
blas=acml
blas_version=4.3.0
blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib
lapack=acml
lapack_version=4.3.0
lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib
fftw=fftw3
fftw_libdir=/opt/${fftw}/3.2.1/12.${disttag}.${compiler}.${compiler_version}/lib64
netcdf=netcdf4
netcdf_includedir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/include
netcdf_libdir=/opt/${netcdf}/4.0.1/1.${disttag}.${compiler}.${compiler_version}/lib64
openmpi_version=1.3.3
ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/lib64
ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/include
ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/bin
blacs=blacs
blacsdir=/opt/blacs/1.1/24.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}/lib64
scalapack=scalapack
scalapackdir=/opt/scalapack/1.8.0/1.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}/lib64
release=1.${disttag_network}${vasp_tst}${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}.${fftw}
!rpmbuild # as rpmbuild above
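Once the RPM is installed, a hedged sanity check that the binary picked up the intended libraries; the module name and the binary name vasp are assumptions:

module load vasp  # assumed module name
ldd `which vasp` | egrep 'acml|mpi|fftw'  # the chosen BLAS/LAPACK, Open MPI and FFTW should appear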
exciting
exciting species
Build the following RPMS:
. fys.build.exciting-species.9.10.sh
exciting
Note: currently (24 Nov 2009) perl-XML-SAX-0.14-5 is broken on EL5 (https://bugzilla.redhat.com/show_bug.cgi?id=438291); if exciting's tests complain about:
could not find ParserDetails.ini
do as root:
perl -MXML::SAX -e "XML::SAX->add_parser(q(XML::SAX::PurePerl))->save_parsers()"
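To verify that a fallback parser got registered, the following should list XML::SAX::PurePerl among the known parsers (XML::SAX->parsers() returns the list of registered parsers):

perl -MXML::SAX -e 'print map { $_->{Name} . "\n" } @{XML::SAX->parsers()}'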
On a custom system build the following RPMS:
. fys.set.gfortran.default.sh
. fys.set.openmpi.default.sh
. fys.set.atlas.default.sh
thread_mode=nomp
. fys.build.exciting.9.10.sh
On Niflheim build the following RPMS:
# only on fjorm
. fys.set.Niflheim.g95.0.91.sh
. fys.set.Niflheim.openmpi.1.3.3.sh
. fys.set.blas.acml.4.3.0_mp.sh
. fys.set.lapack.acml.sh
thread_mode=nomp
. fys.build.exciting.9.10.sh
# only on thul
. fys.set.Niflheim.ifort.11.0.sh
. fys.set.Niflheim.openmpi.1.3.3.sh
. fys.set.Niflheim.mkl.10.1.3.027.sh
thread_mode=nomp
. fys.build.exciting.9.10.sh
fleur
fleur
fleur. Download the source to ~/rpmbuild/SOURCES.
Build the following RPMS:
cd ~/rpmbuild/SPECS
major_version=25
version1=g
if [ "$version1" == "None" ]; then
    fleur_version=${major_version}
else
    fleur_version=${major_version}.${version1}
fi
compiler=gfortran43
compiler_version=4.3.2
compiler_libdir=/usr/lib64
compiler_bindir=/usr/bin
blas=acml
blas_version=4.3.0
blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib
lapack=acml
lapack_version=4.3.0
lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib
openmpi_version=1.3.3
ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/lib64
ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/include
ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/bin
blacs=blacs
blacsdir=/opt/blacs/1.1/24.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}/lib64
scalapack=scalapack
scalapackdir=/opt/scalapack/1.8.0/1.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}/lib64
release=1.${disttag_network}.${compiler}.${compiler_version}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}
rpmbuild -bb --with major_version=${major_version} --with version1=${version1} \
 --with blas=${blas} --with blas_version=${blas_version} --with blasdir=${blasdir} \
 --with compiler=${compiler} --with compiler_version=${compiler_version} \
 --with compiler_bindir=${compiler_bindir} --with compiler_libdir=${compiler_libdir} \
 --with lapack=${lapack} --with lapack_version=${lapack_version} --with lapackdir=${lapackdir} \
 --with parallel=1 --with openmpi=openmpi --with openmpi_version=${openmpi_version} \
 --with ompi_libdir=${ompi_libdir} --with ompi_includedir=${ompi_includedir} --with ompi_bindir=${ompi_bindir} \
 --with blacs=${blacs} --with blacsdir=${blacsdir} \
 --with scalapack=${scalapack} --with scalapackdir=${scalapackdir} \
 ${modules_usage} --with default_version=1 \
 --with system=${rpm_platform} \
 --with prefix=/opt/fleur/${fleur_version}/${release} fleur.spec
module load g95 # only on fjorm
compiler=g95 # only on fjorm
compiler_version=0.91
compiler_libdir=/opt/g95/0.91/lib/gcc-lib/x86_64-unknown-linux-gnu/4.0.3
compiler_bindir=/opt/g95/0.91/bin
blas=blas
blas_version=3.0.37.el5
blasdir=/usr/lib64
lapack=lapack
lapack_version=${blas_version}
lapackdir=/usr/lib64
openmpi_version=1.3.3
ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/lib64
ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/include
ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/bin
blacs=none
scalapack=none
release=1.${disttag_network}.${compiler}.${compiler_version}.${thread_mode}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}
!rpmbuild # as rpmbuild above
module load ifort # only on thul
compiler=ifort # only on thul
compiler_version=11.0
compiler_libdir=/opt/intel/Compiler/11.0/083/lib/intel64
compiler_bindir=/opt/intel/Compiler/11.0/083/bin/intel64
blas=mkl
blas_version=10.1.3.027
blasdir=/opt/intel/${blas}/${blas_version}/lib/em64t
lapack=mkl_lapack
lapack_version=${blas_version}
lapackdir=${blasdir}
openmpi_version=1.3.3
ompi_libdir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/lib64
ompi_includedir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/include
ompi_bindir=/opt/openmpi/${openmpi_version}-1.${disttag_network}.${compiler}.${compiler_version}/bin
blacs=mkl
blacsdir=${blasdir}
scalapack=mkl
scalapackdir=${blasdir}
release=1.${disttag_network}.${compiler}.${compiler_version}.${thread_mode}.openmpi.${openmpi_version}.${blas}.${blas_version}.${lapack}.${lapack_version}
!rpmbuild # as rpmbuild above
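A hedged post-install check analogous to the vasp one above; the module and binary name fleur are assumptions:

module load fleur  # assumed module name
ldd `which fleur` | egrep 'mkl|acml|blas|lapack'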
gulp
Installed with restricted access under the gulp UNIX group (needs to be created in advance).
gulp
gulp. Download gulp.3.4.source.tar.gz to ~/rpmbuild/SOURCES.
Build the following RPMS:
major_version=3
version1=4
cd ~/rpmbuild/SPECS
thread_mode=nomp
compiler=gfortran43
compiler_version=4.3.2
compiler_libdir=/usr/lib64
compiler_bindir=/usr/bin
blas=acml
blas_version=4.3.0
blasdir=/opt/${blas}/${blas_version}/${compiler}64/lib
lapack=acml
lapack_version=4.3.0
lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64/lib
release=1.${disttag}.${compiler}.${compiler_version}.${thread_mode}.${blas}.${blas_version}.${lapack}.${lapack_version}
rpmbuild -bb --with major_version=${major_version} --with version1=${version1} \
 --with blas=${blas} --with blas_version=${blas_version} --with blasdir=${blasdir} \
 --with compiler=${compiler} --with compiler_version=${compiler_version} \
 --with compiler_bindir=${compiler_bindir} --with compiler_libdir=${compiler_libdir} \
 --with lapack=${lapack} --with lapack_version=${lapack_version} --with lapackdir=${lapackdir} \
 --with thread_mode=${thread_mode} --with parallel=1 \
 ${modules_usage} --with default_version=1 \
 --with system=${rpm_platform} \
 --with prefix=/opt/gulp/${major_version}.${version1}/${release} gulp.spec
# multithreaded
thread_mode=mp
compiler=gfortran43
compiler_version=4.3.2
compiler_libdir=/usr/lib64
compiler_bindir=/usr/bin
blas=acml
blas_version=4.3.0
blasdir=/opt/${blas}/${blas_version}/${compiler}64_${thread_mode}/lib
lapack=acml
lapack_version=4.3.0
lapackdir=/opt/${lapack}/${lapack_version}/${compiler}64_${thread_mode}/lib
release=1.${disttag}.${compiler}.${compiler_version}.${thread_mode}.${blas}.${blas_version}.${lapack}.${lapack_version}
!rpmbuild # as rpmbuild above
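For the multithreaded variant, a minimal smoke-test sketch; the module name and the input file name are hypothetical (gulp conventionally reads its input from stdin):

module load gulp  # assumed module name
ldd `which gulp` | grep acml  # the _mp ACML should appear for the mp build
export OMP_NUM_THREADS=4
gulp < example.gin > example.got  # example.gin is a hypothetical input file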
Testing packages
Test dacapo installation (as normal user!).
If you use modules:
module load campos-dacapo # fulfill all dependencies requested by module
ulimit -s 500000 # dacapo needs a large stack
Make sure that /scratch/$USER exists, creating it if necessary:
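mkdir -p /scratch/$USER

Then test with: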
cp -r `rpm -ql campos-dacapo-python | grep "share/campos-dacapo-python$"` /tmp
cd /tmp/campos-dacapo-python/Tests
python test.py 2>&1 | tee test.log
It can take up to 1 day. Please consider disabling these "long" tests in test.py:
tests.remove('../Examples/Wannier-ethylene.py')
tests.remove('../Examples/Wannier-Pt4.py')
tests.remove('../Examples/Wannier-Ptwire.py')
tests.remove('../Examples/Wannier-Fe-bcc.py')
tests.remove('../Examples/transport_1dmodel.py')
Note: all vtk-related tests will fail.
Test gpaw installation (as normal user!):
If you use modules:
module load campos-gpaw # fulfill all dependencies requested by module
Test with:
cp -r `rpm -ql campos-gpaw | grep "share/campos-gpaw/test$"` /tmp/test.gpaw.$$
cd /tmp/test.gpaw.*
python test.py 2>&1 | tee test.log
It takes about 20 minutes.
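The suite can also be exercised in parallel through GPAW's gpaw-python MPI interpreter, assuming the parallel interpreter was built into the RPM (a hedged sketch; adjust the process count to the machine):

mpirun -np 4 gpaw-python test.py 2>&1 | tee test.parallel.log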