
Taco: This New Code From MIT Offers 100-Times Speed Increase

When data are sparse, analytic algorithms end up performing enormous numbers of additions and multiplications by zero. Picture a giant table of Amazon customers against all of Amazon's products: each entry is 1 if the customer bought that product and 0 if not. To avoid wasting work on all those zeroes, programmers write custom code, which is complex and usually applies to only a narrow class of problems.
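To make the wasted work concrete, here is a minimal Python sketch (not Taco's own code, and using made-up numbers): the dense version multiplies every zero, while a nonzero-only representation of the same row skips them entirely.

```python
# Illustrative sketch (not Taco itself): a customer-by-product row where most
# entries are 0 ("no purchase"). A dense dot product multiplies every zero;
# a sparse representation keeps only the nonzero entries and skips the rest.

# Hypothetical data: one customer's row across ten products (1 = bought).
dense_row = [0, 0, 1, 0, 0, 0, 1, 0, 0, 0]
weights   = [0.2, 0.5, 0.9, 0.1, 0.3, 0.7, 0.4, 0.6, 0.8, 0.05]

# Dense version: 10 multiplications, 8 of them by zero.
dense_score = sum(v * w for v, w in zip(dense_row, weights))

# Sparse version: store only the nonzero entries (column -> value).
sparse_row = {2: 1, 6: 1}
sparse_score = sum(v * weights[j] for j, v in sparse_row.items())

assert dense_score == sparse_score  # same result, far fewer operations
```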

To address this issue, a team of MIT researchers, in collaboration with Adobe Research and the French Alternative Energies and Atomic Energy Commission, has created a new system named the Tensor Algebra Compiler (Taco). In computer science, a tensor is the higher-dimensional analogue of a matrix.

The new Taco code offers up to a 100-fold speed increase over existing, non-optimized software packages, and its performance rivals the hand-optimized code mentioned above for handling sparse data. That means programmers have far less work to do on their end.

By removing the need for customized sparse-matrix code, the team has delivered the "ability to generate code for any tensor-algebra expression when the matrices are sparse," according to Saman Amarasinghe, an MIT professor of electrical engineering and computer science.

With Taco, a programmer simply specifies the size of a tensor and the location of the file from which it should import its values. Taco then builds a hierarchical map that discards the zero entries and speeds up the computation.
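The following is a rough sketch, in plain Python, of what such a hierarchical, nonzero-only index can look like for a two-dimensional tensor. It only illustrates the idea; it is not the code Taco actually generates, and the function names and values are hypothetical.

```python
# A minimal sketch of the "hierarchical map" idea for a 2-D tensor: each row
# is compressed down to its nonzero (column, value) pairs, similar in spirit
# to the classic compressed-row layouts sparse compilers target.

def build_compressed_rows(dense_matrix):
    """Keep, for each row, only the (column, value) pairs that are nonzero."""
    index = []
    for row in dense_matrix:
        index.append([(j, v) for j, v in enumerate(row) if v != 0])
    return index

def sparse_matrix_vector(index, vector):
    """Multiply the compressed matrix by a dense vector, skipping zeros."""
    return [sum(v * vector[j] for j, v in row) for row in index]

matrix = [
    [0, 0, 3, 0],
    [0, 0, 0, 0],
    [5, 0, 0, 2],
]
idx = build_compressed_rows(matrix)
print(sparse_matrix_vector(idx, [1, 1, 1, 1]))  # [3, 0, 7]
```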

Moreover, Taco uses an efficient indexing scheme to store only the nonzero values of sparse tensors. A publicly released Amazon tensor that would take up about 107 exabytes with its zeroes included takes up only 13 GB without them.
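As a rough back-of-the-envelope sketch (using invented dimensions and counts, not the Amazon tensor's actual ones), the saving comes from paying per nonzero entry instead of per entry:

```python
# Hypothetical storage comparison: a dense layout stores every entry, while a
# coordinate (nonzero-only) layout stores one index tuple plus one value per
# nonzero entry. All numbers below are illustrative, not real measurements.

def dense_bytes(dims, bytes_per_value=8):
    total = 1
    for d in dims:
        total *= d
    return total * bytes_per_value

def coordinate_bytes(num_nonzeros, ndims, bytes_per_index=4, bytes_per_value=8):
    return num_nonzeros * (ndims * bytes_per_index + bytes_per_value)

dims = (1_000_000, 1_000_000)      # hypothetical customers x products
nnz = 50_000_000                   # hypothetical number of purchases

print(dense_bytes(dims))                 # 8,000,000,000,000 bytes (~8 TB) with zeroes
print(coordinate_bytes(nnz, len(dims)))  # 800,000,000 bytes (~0.8 GB) without them
```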

You can read more about this advance in sparse-matrix computation on the MIT News website.
