matrix-toolkits-java

MTJ is a high-performance library for developing linear algebra applications.

MTJ is based on BLAS and LAPACK for its dense and structured sparse computations, and on the Templates project for unstructured sparse operations.

MTJ uses the netlib-java project as a backend, which automatically uses machine-optimised natives when they are available. Please read the netlib-java documentation for the extra steps needed to ensure that you are getting the best performance for your system.

Performance Compared to Other Libraries

The java-matrix-benchmark clearly shows MTJ to be the most performant Java library for large matrices:

[Chart: relative performance of Java matrix libraries]

A more complete breakdown is available: MTJ with system-optimised natives wins almost every benchmark.

We recommend commons-math for small matrix requirements, as it provides a wide variety of mathematical features, and EJML if performance on small matrices is more important than features.

Sparse Storage

A variety of sparse matrix / vector storage classes are available:

The LinkedSparseMatrix storage type is a novel storage type developed under this project. Each element maintains two links: one to the next matrix element in row order and one to the next in column order. References into each row and column are also kept, making multiplication and transpose multiplication very fast.
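
Usage follows the standard MTJ Matrix interface. A minimal sketch (the matrix size and entries here are arbitrary, chosen only for illustration):

import no.uib.cipr.matrix.DenseMatrix;
import no.uib.cipr.matrix.Matrices;
import no.uib.cipr.matrix.Matrix;
import no.uib.cipr.matrix.sparse.LinkedSparseMatrix;

// build an n x n sparse matrix and set a few non-zero entries
Matrix A = new LinkedSparseMatrix(1000, 1000);
A.set(0, 0, 1.0);
A.set(42, 7, 3.14);
A.set(999, 999, -2.0);

// multiply into a dense matrix, as in the benchmarks below
Matrix B = Matrices.random(1000, 1000);
Matrix C = new DenseMatrix(1000, 1000);
A.mult(B, C);

// transpose multiplication (D = A' * B) is also fast, thanks to the column links
Matrix D = new DenseMatrix(1000, 1000);
A.transAmult(B, D);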

The following charts compare LinkedSparseMatrix against DenseMatrix for increasing matrix size (n x n) and number of non-zero elements, m. Rainbow lines indicate m varying from 10,000 to 100,000; solid lines are for the dense matrix, dashed lines for the sparse matrix.

The following is time to initialise the matrix:

[Chart: initialisation time]

The following is the memory consumption:

[Chart: memory consumption]

The following is the time to perform a multiplication with a dense matrix and output into a dense matrix:

[Chart: multiplication time]

Sparse Solvers

MTJ provides an ARPACK solver for very large symmetric matrices in ArpackSym (see the example usage in ArpackSymTest). ARPACK computes an arbitrary number of eigenvalue/eigenvector pairs.
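
A minimal sketch of ArpackSym usage, modeled on ArpackSymTest (the diagonal test matrix here is illustrative only; check ArpackSymTest for the authoritative usage):

import java.util.Map;

import no.uib.cipr.matrix.DenseVectorSub;
import no.uib.cipr.matrix.Matrix;
import no.uib.cipr.matrix.sparse.ArpackSym;
import no.uib.cipr.matrix.sparse.LinkedSparseMatrix;

// a symmetric sparse matrix (a simple diagonal, for illustration)
Matrix A = new LinkedSparseMatrix(500, 500);
for (int i = 0; i < 500; i++)
    A.set(i, i, i + 1.0);

// request the 10 eigenvalues of largest magnitude (Ritz.LM)
ArpackSym solver = new ArpackSym(A);
Map<Double, DenseVectorSub> results = solver.solve(10, ArpackSym.Ritz.LM);

// the map is keyed by eigenvalue, with the matching eigenvector as value
for (Map.Entry<Double, DenseVectorSub> e : results.entrySet())
    System.out.println("eigenvalue = " + e.getKey());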

In addition, implementations of the netlib Templates iterative solvers are available in the no.uib.cipr.matrix.sparse package.
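
For example, the conjugate gradients solver CG applies to symmetric positive definite systems. A minimal sketch (the tridiagonal system is illustrative only):

import no.uib.cipr.matrix.DenseVector;
import no.uib.cipr.matrix.Vector;
import no.uib.cipr.matrix.sparse.CG;
import no.uib.cipr.matrix.sparse.FlexCompRowMatrix;

// assemble a symmetric positive definite tridiagonal system
int n = 100;
FlexCompRowMatrix A = new FlexCompRowMatrix(n, n);
for (int i = 0; i < n; i++) {
    A.set(i, i, 2.0);
    if (i > 0) A.set(i, i - 1, -1.0);
    if (i < n - 1) A.set(i, i + 1, -1.0);
}
Vector b = new DenseVector(n);
for (int i = 0; i < n; i++)
    b.set(i, 1.0);

// CG takes a template vector to size its internal work vectors;
// solve throws IterativeSolverNotConvergedException on failure
Vector x = new DenseVector(n);
CG cg = new CG(x);
cg.solve(A, b, x); // x now holds the solution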

Users may wish to look at Sparse Eigensolvers for Java for another solver.

Legal

  • Copyright (C) 2003-2006 Bjørn-Ove Heimsund
  • Copyright (C) 2006-2014 Samuel Halliday

History

This project was originally written by Bjørn-Ove Heimsund, who has taken a step back due to other commitments.

Installation

Releases are distributed on Maven central:

<dependency>
    <groupId>com.googlecode.matrix-toolkits-java</groupId>
    <artifactId>mtj</artifactId>
    <version>1.0.1</version>
</dependency>

NOTE: There is a bug in the version of netlib-java required by MTJ 1.0.1. To work around it, also depend on

<dependency>
  <groupId>com.github.fommil.netlib</groupId>
  <artifactId>all</artifactId>
  <version>1.1.2</version>
  <type>pom</type>
</dependency>

Unofficial single-jar builds may be available from java-matrix-benchmark for laggards who don't have 5 minutes to learn Maven.

Snapshots are distributed on Sonatype's Snapshot Repository:

<dependency>
  <groupId>com.googlecode.matrix-toolkits-java</groupId>
  <artifactId>mtj</artifactId>
  <version>1.0.2-SNAPSHOT</version>
</dependency>

Donations

Please consider supporting the maintenance of this open source project by starring the repository and with a donation:

Donate via Paypal

Contributing

Contributors are encouraged to fork this repository and issue pull requests. Contributors implicitly agree to assign an unrestricted licence to Sam Halliday, but retain the copyright of their code (this means we both have the freedom to update the licence for those contributions).
