Tensors are certainly useful. However, these particular references seem difficult and rather specialized. There must be some elementary reference or cheat sheet on tensors for the classical theory of rigid-body motion. The classic works on mechanics are by Lagrange and Hamilton.

Here's a text that may be a good systematic starting point for someone already trained to read math:

Especially chapters 1, 2, and 10 (Elasticity).

The top-level clarification I'd like to make is the following:

A vector space V of dimension n is a set equipped with two operations: vector addition + and scalar multiplication. Any such space is isomorphic to Euclidean n-dimensional space. (I'll assume you can look that up.)

A tensor T is a map from a product of vector spaces to the scalar field, say the real numbers R.

(For simplicity let's say the vector spaces are all the same V.):

T: V x V x V x … x V ---> R

(k times)

The degree of the tensor is the number of arguments -- the number of vector spaces in the product domain.

(In our case, degree T = k because there are k copies of V.) Each argument is a vector; hence __each__ argument has a number of components equal to the dimension of the vector space V.

T(v1, v2, v3, …, vk) is a scalar number.
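As a minimal concrete sketch (the array A below is an arbitrary example, not from the text): a degree-2 map on V = R^3 can be stored as a 3x3 array of components, eating two vectors and spitting out one scalar.

```python
import numpy as np

# Components of a degree-2 map T on V = R^3 (an arbitrary example array).
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [4.0, 0.0, 1.0]])

def T(u, v):
    """T(u, v) = sum over i, j of A_ij * u_i * v_j -- a single scalar."""
    return u @ A @ v

u = np.array([1.0, 0.0, 2.0])
v = np.array([0.0, 1.0, 1.0])
print(T(u, v))  # prints 4.0 -- one real number, as the text says
```

Each argument here has 3 components, matching dim V = 3, while the degree (number of arguments) is 2.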

So far this is not yet a tensor but a general map. A __tensor__ has two basic features:

(1) It is linear in each argument:

Say __in the i-th argument, for each i__:

T(… , u + v, …) = T(… , u , …) + T(… , v, …) for all vectors u, v in V

and __in the i-th argument for each i__

T(… , s * v , …) = s * T(… , v , …)

for all scalars s in R, and all vectors v in V
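Property (1) can be checked numerically. A sketch, again using an arbitrary 3x3 component array A for a degree-2 map: both identities hold in each argument separately.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))  # arbitrary example components

def T(u, v):
    return u @ A @ v

u, v, w = rng.standard_normal((3, 3))  # three random vectors in R^3
s = 2.5

# Linearity in the first argument:
assert np.isclose(T(u + v, w), T(u, w) + T(v, w))
assert np.isclose(T(s * u, w), s * T(u, w))

# ... and in the second argument:
assert np.isclose(T(w, u + v), T(w, u) + T(w, v))
assert np.isclose(T(w, s * u), s * T(w, u))
```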

(2) The value of T does not change when you change the coordinate basis of V according to which you represent the vectors in V.

The way this is written out in classical books on tensors is in terms of the orthogonal matrices that transform a vector's coordinates with respect to one basis into that same geometric "arrow"'s coordinates with respect to another basis. It's messy, but systematic.
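The classical bookkeeping can be sketched numerically. Assuming a degree-2 map with an arbitrary example component array A: under an orthogonal change of basis Q, the vector coordinates and the tensor components both change, but the scalar value T produces does not.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))  # components of T in the old basis (example)
u = rng.standard_normal(3)       # coordinates of two vectors, old basis
v = rng.standard_normal(3)

# A random orthogonal matrix: its columns are the new basis vectors
# written in the old coordinates.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

# Coordinates of the SAME geometric "arrows" in the new basis:
u_new = Q.T @ u
v_new = Q.T @ v

# Components of T transform with one copy of Q per argument:
A_new = Q.T @ A @ Q

# The scalar value is invariant -- property (2):
assert np.isclose(u @ A @ v, u_new @ A_new @ v_new)
```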

Elegant modern mathematics writes tensors in "coordinate-free" notation, but that hides some of the gritty intuition and permits some basic confusions, such as confusing vectors with tensors.

Tensors "eat" vectors and spit out scalar numbers. And they can't be fooled by camouflage: changing the basis with respect to which the vectors are represented changes their numeric coordinates, but not the value of the tensor.

PS. By definition there are maps: V x V x V … x V --> R called "forms" that eat vectors multi-linearly and produce scalars. But they

__do__ change value when the basis is changed. Spaces of linear forms on a vector space V have as their bases the duals to the basis vectors of V. If the basis of V is, say, an orthonormal set of vectors v1, v2, …, vn, the dual forms dx1, dx2, …, dxn are defined by:

dxi ( vj ) = δij

where δij is the Kronecker delta function.
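The defining relation dxi(vj) = δij can be checked directly. A sketch, assuming for concreteness that the orthonormal basis of V = R^3 is the standard basis: each dual form dx_i simply reads off the i-th coordinate.

```python
import numpy as np

# Orthonormal basis of V = R^3: the standard basis (an assumed example).
basis = np.eye(3)

def dx(i):
    """The i-th dual form: dx_i(v) = <v_i basis vector, v>."""
    return lambda v: basis[i] @ v

# Verify dx_i(v_j) = Kronecker delta:
for i in range(3):
    for j in range(3):
        assert np.isclose(dx(i)(basis[j]), 1.0 if i == j else 0.0)
```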