After reading Conal Elliott’s recent Reimagining Matrices post on defining matrices as inductive linear maps, I was inspired to try out denotational design too. Daniel Rowlands pointed out that Conal’s approach is tensor-inspired and further generalizable, so I wanted to find out what that meant.

I set out to define tensors as nicely as I could. I didn’t end up needing to do equational reasoning, but the notion of having a type-level representation and a semantic function fit the domain well.

The definition I’m using is that, for a given vector space over a field F (say, the real numbers), a tensor on the space is a multilinear map from some number of vectors and dual vectors to F. This gives rise to a notion of tensor type, where a tensor of type (n, m) accepts **m** vectors and n dual vectors.
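As a minimal sketch of this definition (using `Vector[Double]` as a stand-in vector space; the object and value names here are illustrative, not from the post's code):

```scala
object TensorAsMap {
  type V    = Vector[Double]
  type Dual = V => Double // a dual vector: a linear map from V to the field

  // A tensor of type (n, m): a multilinear map taking m vectors and n dual
  // vectors to a field value. Arities are only checked dynamically here.
  type Tensor = (Seq[V], Seq[Dual]) => Double

  // Example: the (1, 1) "evaluation" tensor T(v; w) = w(v), which is
  // linear in v (because w is linear) and linear in w.
  val eval: Tensor = { case (Seq(v), Seq(w)) => w(v) }
}
```

For instance, `TensorAsMap.eval(Seq(Vector(1.0, 2.0)), Seq(_.sum))` evaluates the dual vector "sum of components" on the vector (1, 2), giving 3.0.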

A little bit of algebra shows that such tensors can be represented by **n** vectors and m dual vectors. Applying a dual vector to a vector yields a value in the field, so to evaluate the tensor you pair each stored vector with an argument dual vector (and each stored dual vector with an argument vector) and take the product of the resulting values.
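A sketch of that representation and its pairing-and-multiplying evaluation (again with illustrative names; this represents a simple, i.e. decomposable, tensor):

```scala
object TensorAsData {
  type V    = Vector[Double]
  type Dual = V => Double

  // A tensor of type (n, m), represented by n stored vectors and m stored
  // dual vectors.
  final case class Tensor(vectors: Seq[V], duals: Seq[Dual])

  // Semantic function: pair each stored vector with an argument dual vector,
  // each stored dual vector with an argument vector, and multiply the
  // resulting field values.
  def eval(t: Tensor)(argVectors: Seq[V], argDuals: Seq[Dual]): Double = {
    require(argDuals.length == t.vectors.length &&
            argVectors.length == t.duals.length)
    val fromVectors = t.vectors.zip(argDuals).map { case (v, w) => w(v) }
    val fromDuals   = t.duals.zip(argVectors).map { case (w, v) => w(v) }
    (fromVectors ++ fromDuals).product
  }
}
```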

This representation is meaningful mathematically (it corresponds to taking tensor products of vector spaces and dual spaces), but for our purposes what matters is that it gives us an inductive definition of tensors: a TUnit wrapper that builds a tensor from a vector, a TCoUnit wrapper that builds a tensor from a dual vector, and a TProduct that combines two tensors.
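The inductive definition and its semantic function can be sketched like this (the evaluation strategy shown, threading leftover arguments through the product, is one possible choice, not necessarily the post's):

```scala
object InductiveTensor {
  type V    = Vector[Double]
  type Dual = V => Double

  // Inductive definition: a tensor is a wrapped vector, a wrapped dual
  // vector, or a product of two tensors.
  sealed trait Tensor
  final case class TUnit(v: V)                    extends Tensor // type (1, 0)
  final case class TCoUnit(w: Dual)               extends Tensor // type (0, 1)
  final case class TProduct(a: Tensor, b: Tensor) extends Tensor

  // Semantic function: interpret a tensor as a multilinear map. Each TUnit
  // consumes one argument dual vector, each TCoUnit one argument vector;
  // unconsumed arguments are passed along to the rest of the product.
  private def go(t: Tensor, vs: List[V], ws: List[Dual]): (Double, List[V], List[Dual]) =
    t match {
      case TUnit(v)   => (ws.head(v), vs, ws.tail)
      case TCoUnit(w) => (w(vs.head), vs.tail, ws)
      case TProduct(a, b) =>
        val (x, vs1, ws1) = go(a, vs, ws)
        val (y, vs2, ws2) = go(b, vs1, ws1)
        (x * y, vs2, ws2)
    }

  def eval(t: Tensor)(vs: List[V], ws: List[Dual]): Double = go(t, vs, ws)._1
}
```

For example, `eval(TProduct(TUnit(Vector(1.0, 2.0)), TCoUnit(_.sum)))(List(Vector(3.0, 4.0)), List(_.head))` multiplies head((1, 2)) = 1 by sum((3, 4)) = 7, giving 7.0.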

My first version leaves the tensor type implicit in the structure of the value, but since only the classical tensor operations are used, the tensor type can be computed at compile time. So I also pieced together a version using the shapeless library, with its natural-number types and sized collections. With help from Travis Brown on StackOverflow, I got it to work. I used Double as the field for simplicity and V as an arbitrary vector space over Double.
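To show the compile-time idea without pulling in shapeless, here is a rough dependency-free sketch using hand-rolled Peano naturals as phantom types (shapeless's `Nat` works the same way); all names are illustrative and this is not the post's actual code:

```scala
object TypedTensor {
  type V    = Vector[Double]
  type Dual = V => Double

  // Hand-rolled type-level naturals (shapeless provides these as Nat).
  sealed trait Nat
  sealed trait Z extends Nat
  sealed trait S[N <: Nat] extends Nat

  // Type-level addition, computed by implicit resolution.
  trait Sum[A <: Nat, B <: Nat] { type Out <: Nat }
  object Sum {
    type Aux[A <: Nat, B <: Nat, C <: Nat] = Sum[A, B] { type Out = C }
    implicit def zero[B <: Nat]: Aux[Z, B, B] =
      new Sum[Z, B] { type Out = B }
    implicit def succ[A <: Nat, B <: Nat](implicit s: Sum[A, B]): Aux[S[A], B, S[s.Out]] =
      new Sum[S[A], B] { type Out = S[s.Out] }
  }

  // A tensor of type (N, M), carrying its arities as phantom type parameters.
  final case class Tensor[N <: Nat, M <: Nat](vectors: List[V], duals: List[Dual]) {
    // Tensor product: arities add, and the compiler computes the result type.
    def *[N2 <: Nat, M2 <: Nat](that: Tensor[N2, M2])(
        implicit sn: Sum[N, N2], sm: Sum[M, M2]): Tensor[sn.Out, sm.Out] =
      Tensor(vectors ++ that.vectors, duals ++ that.duals)
  }

  def unit(v: V): Tensor[S[Z], Z]      = Tensor(List(v), Nil)
  def counit(w: Dual): Tensor[Z, S[Z]] = Tensor(Nil, List(w))
}
```

Here `unit(v) * counit(w)` is inferred to have type `Tensor[S[Z], S[Z]]`, i.e. a (1, 1) tensor, with no run-time bookkeeping of the arities.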

The code is below and includes a simple test for the typed version.