Installing BLAS and LAPACK

BLAS

The first thing you must have on your system is a BLAS implementation. "BLAS" stands for "Basic Linear Algebra Subprograms," and is a standard interface for operations like matrix multiplication. It is designed as a building block for other linear-algebra applications, and is used both directly by our code and in LAPACK (see below). By using it, we can take advantage of many highly optimized implementations of these operations that have been written to the BLAS interface. (Note that you will need implementations of BLAS levels 1-3.)

You can find more BLAS information, as well as a basic implementation, on the BLAS Homepage (http://www.netlib.org/blas/). Once you get things working with the basic BLAS implementation, it might be a good idea to try to find a more optimized BLAS code for your hardware. Vendor-optimized BLAS implementations are available as part of the Compaq CXML, IBM ESSL, SGI sgimath, and other libraries. Recently, there has also been work on self-optimizing BLAS implementations that can achieve performance competitive with vendor-tuned codes; see the ATLAS homepage (http://math-atlas.sourceforge.net/). ATLAS works well, but it does take some time to compile.
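
If you do install ATLAS, note that it typically produces several separate static libraries (with names like libatlas.a and libf77blas.a) rather than a single libblas.a. As a rough, hypothetical sketch, one way to point our configure script at such a build (see the --with-blas option described below) might look like the following; the install prefix, library names, and exact configure invocation all depend on your ATLAS version, so check them against ./configure --help:

ATLAS_DIR=/usr/local/atlas    # hypothetical install prefix; adjust for your system
ls $ATLAS_DIR/lib             # typically libatlas.a, libf77blas.a, libcblas.a, ...
# Use ATLAS's Fortran BLAS interface in place of the default -lblas; depending on
# how ATLAS was built, its core library may also need to appear on the link line
# (e.g. via LIBS="-latlas").
env LDFLAGS="-L$ATLAS_DIR/lib" ./configure --with-blas=f77blas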

Note that the generic BLAS does not come with a Makefile; compile it with something like:

mkdir blas && cd blas # the BLAS archive does not create its own directory
wget http://www.netlib.org/blas/blas.tgz
gunzip blas.tgz
tar xf blas.tar
f77 -c -O3 *.f   # compile all of the .f files to produce .o files
ar rv libblas.a *.o    #  combine the .o files into a library
su -c "cp libblas.a /usr/local/lib"   # switch to root and install

(Replace -O3 with your favorite optimization options. On Linux, I use g77 -O3 -fomit-frame-pointer -funroll-loops, adding -malign-double -mcpu=i686 on a Pentium II.) Note that our software looks for the standard BLAS library with -lblas, so the library file should be called libblas.a and reside in a standard directory like /usr/local/lib. (See also below for the --with-blas=lib option to our configure script, to manually specify a library location.)
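
As a quick sanity check, you can verify that the installed library really contains the Fortran BLAS routines, and, if it lives somewhere nonstandard, tell our configure script where to look. The /opt/blas path and the ESSL name below are only placeholders:

nm /usr/local/lib/libblas.a | grep -i dgemm   # should list symbols such as dgemm_
env LDFLAGS="-L/opt/blas/lib" ./configure     # add a nonstandard directory to the link path
./configure --with-blas=essl                  # or link a vendor BLAS installed as libessl.a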

LAPACK

LAPACK, the Linear Algebra PACKage, is a standard collection of routines, built on BLAS, for more-complicated (dense) linear algebra operations like matrix inversion and diagonalization. You can download LAPACK from the LAPACK Home Page (http://www.netlib.org/lapack/).
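
The reference LAPACK is built much like the reference BLAS above, except that it comes with a Makefile driven by a make.inc configuration file. Something like the following should work, though the archive name, make.inc template, and make targets vary between LAPACK releases, so treat this as a sketch:

wget http://www.netlib.org/lapack/lapack.tgz
gunzip lapack.tgz
tar xf lapack.tar
cd LAPACK                            # the LAPACK archive does create its own directory
cp INSTALL/make.inc.LINUX make.inc   # pick the template closest to your system
# edit make.inc: set the Fortran compiler/flags and point BLASLIB at your libblas.a
make lapacklib                       # builds the library, e.g. lapack_LINUX.a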

Note that our software looks for LAPACK by linking with -llapack. This means that the library must be called liblapack.a and be installed in a standard directory like /usr/local/lib (alternatively, you can specify another directory via the LDFLAGS environment variable as described earlier). (See also below for the --with-lapack=lib option to our configure script, to manually specify a library location.)
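
For example, if the reference build above produced lapack_LINUX.a (the exact name depends on the PLAT setting in your make.inc), you could install it under the name that -llapack expects:

su -c "cp lapack_LINUX.a /usr/local/lib/liblapack.a"   # install under the expected name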
