Understanding Tensor Dimensions: From Scalars to High-Dimensional Tensors
In the realm of data science and machine learning, the concept of tensors is fundamental. A tensor can be thought of as a generalization of scalars, vectors, and matrices to an arbitrary number of dimensions. This blog post will take you through the hierarchy of tensors, starting from the simplest form, order 0, and progressing to higher dimensions.
Order 0 Tensors: Scalars
An order 0 tensor is the simplest form of tensor, often referred to as a scalar. Scalars are single numerical values and do not have any dimensions. They are just plain numbers, like 5, -3.14, or 42.
Example:
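As a minimal illustration in Python (assuming NumPy is available; the value 5 is arbitrary):

```python
import numpy as np

# A scalar: a single number with no dimensions
a = 5

# NumPy represents a scalar as a 0-dimensional array
a_tensor = np.array(a)
print(a_tensor.ndim)   # 0
print(a_tensor.shape)  # ()
```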
Here, a is a scalar.
Scalars are fundamental in mathematical operations, providing the simplest building block for more complex tensor structures.
Order 1 Tensors: Vectors
When we move to order 1 tensors, we encounter vectors. A vector is a one-dimensional array of numbers. Vectors have magnitude and direction, making them useful in a variety of applications, from physics to machine learning.
Example:
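For instance, a three-element vector can be written as a plain Python list or as a NumPy array (the values here are just illustrative):

```python
import numpy as np

# A vector as a plain Python list
v = [1, 2, 3]

# The same vector as a NumPy array (an order 1 tensor)
v_tensor = np.array(v)
print(v_tensor.ndim)   # 1
print(v_tensor.shape)  # (3,)
```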
Here, v is a vector with three elements, whether written as a plain list or as a NumPy array.
A vector can be thought of as a list of scalars arranged in a specific order, and it is often used to represent a data point in space. In data-structure terms, a vector is analogous to a one-dimensional array.
Order 2 Tensors: Matrices
Order 2 tensors are more commonly known as matrices. A matrix is a two-dimensional array of numbers arranged in rows and columns. Matrices are ubiquitous in various fields, especially in linear algebra, computer graphics, and machine learning.
Example:
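A 3x3 matrix could be built with NumPy as follows (the values are chosen only for illustration):

```python
import numpy as np

# A 3x3 matrix: a two-dimensional array with 3 rows and 3 columns
A = np.array([
    [1, 2, 3],
    [4, 5, 6],
    [7, 8, 9],
])
print(A.ndim)   # 2
print(A.shape)  # (3, 3)
```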
Here, A is a 3x3 matrix.
Matrices enable the representation and manipulation of linear transformations, making them indispensable in solving systems of linear equations and performing operations like rotations and translations.
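As a sketch of this idea, the snippet below applies a 2D rotation matrix to a point; the 90-degree angle and the point are arbitrary choices for illustration:

```python
import numpy as np

# 2D rotation matrix for an angle theta (90 degrees here)
theta = np.pi / 2
R = np.array([
    [np.cos(theta), -np.sin(theta)],
    [np.sin(theta),  np.cos(theta)],
])

# Applying the linear transformation via matrix-vector multiplication
p = np.array([1.0, 0.0])
rotated = R @ p
print(np.round(rotated, 3))  # [0. 1.]
```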


