What is a Tensor?

If you're new to machine learning, you've almost certainly seen the word "tensor." Tensors are common data structures in machine learning and deep learning (Google's open-source software library for machine learning is even called TensorFlow). But what is a tensor, exactly?

In simple terms, a tensor is a multi-dimensional data structure. Vectors are one-dimensional data structures and matrices are two-dimensional data structures. Tensors are superficially similar to these, but the difference is that they can exist in any number of dimensions from zero to n. The number of dimensions is called the tensor's rank: a rank-1 tensor is one-dimensional, a rank-2 tensor is two-dimensional, and so on.
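As a quick illustration (a minimal sketch using NumPy, which reports a tensor's rank as `ndim`), here is the same data structure at ranks zero through three:

```python
import numpy as np

# Rank-0 tensor: a single scalar
scalar = np.array(5.0)

# Rank-1 tensor: a vector
vector = np.array([1.0, 2.0, 3.0])

# Rank-2 tensor: a matrix
matrix = np.array([[1.0, 2.0],
                   [3.0, 4.0]])

# Rank-3 tensor: e.g. a small RGB image (height x width x color channels)
image = np.zeros((32, 32, 3))

print(scalar.ndim, vector.ndim, matrix.ndim, image.ndim)  # 0 1 2 3
```

The same `np.array` container handles every rank; only the number of axes changes.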

This surface similarity is often what makes tensors difficult for people to grasp at first. For instance, we can represent rank-2 tensors as matrices. The stress on "can" is important, because tensors have properties that not all matrices share. Using the logic of "all squares are rectangles, but not all rectangles are squares," Steven Steinke explains that "any rank-2 tensor can be represented as a matrix, but not every matrix is a rank-2 tensor."

Tensor vs Matrix - What is the Difference?

The critical difference that sets tensors apart from matrices is that tensors are dynamic: a tensor obeys specific transformation rules as part of the mathematical structure it inhabits. When the coordinate system of that structure transforms in a regular way, the tensor's components transform along with it, according to the rules established for the system. A matrix is simply a grid of numbers with no such obligation, which is why not every matrix is a rank-2 tensor.
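To make the transformation rule concrete, here is an illustrative sketch (my own example, not from the original text): a rank-2 tensor in two dimensions transforms under a rotation R as R T Rᵀ, and quantities like its trace stay invariant, which is exactly the kind of coordinated behavior an arbitrary grid of numbers need not have.

```python
import numpy as np

# A rotation of the coordinate system by 30 degrees
theta = np.pi / 6
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# A symmetric rank-2 tensor (e.g. a 2-D stress tensor)
T = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# The rank-2 transformation rule: T' = R T R^T
T_rotated = R @ T @ R.T

# The individual components change, but the trace is invariant
print(np.trace(T), np.trace(T_rotated))
```

The components of `T_rotated` differ from those of `T`, yet both describe the same underlying object viewed in different coordinates.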

Why are Tensors Used in Machine Learning?

Now that we have a working definition for tensors, why are they so popular in machine learning? Well, computers need data to learn, and tensors are a more natural, intuitive way of processing many kinds of data, especially big and complex data sets. Here's an example:

Video is a series of images correlated over time. We can use tensors to represent that correlation better and more intuitively than by flattening it into two-dimensional matrices. A rank-3 tensor can encode all the aspects of each image (height, width, and color), while a rank-4 tensor can also hold the time or ordering of the frames.
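The video example can be sketched as follows (a hypothetical clip with made-up dimensions, just to show how the axes map onto a rank-4 tensor):

```python
import numpy as np

# A hypothetical one-second clip: 24 frames of 64x64 RGB images
frames, height, width, channels = 24, 64, 64, 3
video = np.zeros((frames, height, width, channels), dtype=np.uint8)

print(video.ndim)    # 4  (a rank-4 tensor)
print(video.shape)   # (24, 64, 64, 3)

# Slicing along the time axis recovers a single image: a rank-3 tensor
single_frame = video[0]          # shape (64, 64, 3)

# Slicing a color channel from that image gives a rank-2 tensor
red_channel = single_frame[:, :, 0]  # shape (64, 64)
```

Each slice drops one axis, stepping down through the ranks described above.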

Thus, tensors allow powerful computers to solve big data problems more quickly and allow deep learning devices and neural networks (which usually require hundreds or thousands of dimensions of data) to process the data more intuitively.

