## Tensor index contraction

Superscripts are used for contravariant indices and subscripts for covariant indices. Example 1.2.1 forms tensor products with g or g∗ followed by a contraction; in this case several indices may be involved. Numerically, the basic cases are single-index contractions involving all the possible configurations of second-order and third-order tensors; efficient implementations map these onto matrix-multiplication libraries available for nearly all systems. In general, this requires a layout transformation of the tensors into a form where all contracted indices are grouped together.

Sometimes we refer to this summation as index contraction, because the summed indices disappear at the end. This is also why summed indices are called dummy indices.

Computer-algebra systems support these operations directly; for example, an algorithm riemann_to_ricci converts contractions of Riemann tensors to Ricci tensors, and asym anti-symmetrises or symmetrises an expression in the indicated indices.

In index notation, a scalar is written a, a vector a has components ai, and a tensor A has components Aij. In either notation, the contraction (or trace) of a tensor is the sum of its diagonal terms.

Contraction: if two free indices are set equal, they are turned into dummy indices, and the rank of the tensor is decreased by two.

Tensor-network methods in quantum simulation rely heavily on such contractions; choosing a good contraction scheme and index-set arrangement for a network of several tensors is itself a nontrivial problem.
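A minimal sketch in NumPy (the library choice is an assumption; the text above is library-agnostic) of how setting two free indices equal and summing lowers the rank by two:

```python
import numpy as np

# A rank-2 tensor (matrix): setting its two free indices equal and summing
# yields the trace, a rank-0 tensor (scalar): sum_i A[i, i].
A = np.arange(9.0).reshape(3, 3)
trace = np.einsum('ii->', A)          # contract i with i
assert trace == np.trace(A)           # same as the sum of diagonal terms

# A rank-3 tensor contracted on its first two indices leaves a rank-1 tensor:
# v_k = sum_i T[i, i, k]. The rank drops by two, as stated above.
T = np.arange(27.0).reshape(3, 3, 3)
v = np.einsum('iik->k', T)
assert v.shape == (3,)
```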


Figure 3.1 shows a tensor of contravariant rank 2 and covariant rank 3 (i.e. total rank 5): Bαβµνφ. A similar tensor, Cαµνφβ, is also of contravariant rank 2 and covariant rank 3. Typically, when tensor mathematics is applied, the meaning of each index has been defined beforehand: the first index means this, the second means that, etc. The diagrammatic tensor notation is useful for describing networks composed of multiple tensors. An index shared by two tensors denotes a contraction (or summation) over this index.
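The shared-index rule can be checked numerically; a sketch in NumPy (assumed here purely for illustration):

```python
import numpy as np

# Two tensors sharing the index j: C_ik = sum_j A_ij B_jk.
# In a diagram, A and B would be nodes joined by a single leg labelled j.
rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 4))

C = np.einsum('ij,jk->ik', A, B)   # contract over the shared index j
assert np.allclose(C, A @ B)       # for rank-2 tensors this is a matrix product
```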

$\epsilon$ is a completely antisymmetric tensor; the negative sign comes from the exchange of a pair of indices. $\sigma$ is a Pauli matrix, and the dots indicate that the index refers to a conjugated spinor. The dots are not essential here, however: dotted indices can simply be regarded as indices distinct from their dotless counterparts.
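A quick numerical check of that antisymmetry, using a hand-built rank-3 Levi-Civita symbol in NumPy (both the construction and the library choice are illustrative assumptions):

```python
import numpy as np

# Build eps[i, j, k]: +1 on even permutations of (0, 1, 2), -1 on odd ones,
# 0 whenever two indices coincide.
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0   # even permutations
    eps[i, k, j] = -1.0  # odd permutations

# Exchanging any pair of indices negates every component,
# i.e. eps is completely antisymmetric.
assert np.array_equal(np.swapaxes(eps, 0, 1), -eps)
```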

The indices of the tensors are represented by index “strings” that are pre-generated and then looped over to form the final product. Tensor indices are contracted with the Einstein summation convention; an index can be in contravariant or in covariant form. In high-performance implementations, which are the main focus and motivation for this work, a contraction is represented as a tensor index reordering plus matrix–matrix multiplications (GEMMs), with the loop order and any batched index chosen to match the GEMM indices m, n, k.
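A sketch of this transpose-then-GEMM idea in NumPy (the shapes and index names are illustrative assumptions, not taken from any of the works quoted above):

```python
import numpy as np

# TTGT sketch: a contraction such as C_abc = sum_k T_akb M_kc can be done by
# (1) permuting T so the contracted index k is last, (2) flattening the free
# indices into one matrix dimension, (3) a single GEMM, (4) reshaping back.
rng = np.random.default_rng(1)
T = rng.standard_normal((2, 5, 3))   # indices a, k, b
M = rng.standard_normal((5, 4))      # indices k, c

Tt = np.transpose(T, (0, 2, 1))      # a, b, k  (layout transformation)
C = (Tt.reshape(2 * 3, 5) @ M).reshape(2, 3, 4)   # GEMM over the k index

assert np.allclose(C, np.einsum('akb,kc->abc', T, M))
```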


In simple terms, tensor contraction is the process of summing over a pair of repeated indices; it reduces the order of a tensor by 2. Contraction can be applied to any tensor, or product of tensors, with one upper and one lower index free: it is simply the sum over all tensor components for which these indices take the same value.
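For example (a NumPy sketch; the component values are arbitrary):

```python
import numpy as np

# Contraction applied to a product: w^i = T^i_j v^j sums over the repeated
# index j (one upper, one lower), leaving a single free index i.
T = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # components T^i_j
v = np.array([5.0, 6.0])     # components v^j

# The explicit sum over all components where the paired indices take
# the same value:
w = np.array([sum(T[i, j] * v[j] for j in range(2)) for i in range(2)])

assert np.array_equal(w, np.einsum('ij,j->i', T, v))   # the same contraction
```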

The index notation is very powerful and can be used to represent many complex equations concisely. The remainder of this section presents additional definitions and examples to illustrate the power of the indicial notation. This notation is then employed to define tensor components and the associated operations on tensors.

The contraction of a tensor with respect to any pair of upper and lower indices is defined similarly. The p-fold contraction of a tensor that is p-times covariant and p-times contravariant is an invariant. Thus, the contraction of the tensor with components $a^i_j$ is an invariant $a^i_i$, called the trace of the tensor.

2.1. Vector and tensor components. Let x be a (three-dimensional) vector and let S be a second-order tensor. Let {e1, e2, e3} be a Cartesian basis. Denote the components of x in this basis by (x1, x2, x3), and denote the components of S by Sij.

I'm really confused by the notation for raising and lowering indices in tensors when it is mixed with Einstein summation notation and references to the metric tensor. I need help separating several conflicting concepts and notations into a single framework that I can work with for reading the literature.

Efficient tensor contraction in Python: I have a list L of tensors (ndarray objects), each with several indices. I need to contract these indices according to a graph of connections. The connections are encoded in a list of tuples of the form ((m, i), (n, j)), signifying “contract the i-th index of the tensor L[m] with the j-th index of the tensor L[n]”.
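One possible sketch of such a routine, assuming NumPy and the simplifying restriction that each connection joins two distinct tensors and each tensor takes part in at most one connection (the function name `contract_pairs` is hypothetical; a general solution would generate `einsum` subscripts from the graph instead):

```python
import numpy as np

def contract_pairs(L, connections):
    """Contract axis i of L[m] with axis j of L[n] for each ((m, i), (n, j)).

    Simplified sketch: assumes each tensor appears in at most one connection,
    so axis numbering never shifts under our feet.
    """
    L = list(L)
    for (m, i), (n, j) in connections:
        L[m] = np.tensordot(L[m], L[n], axes=(i, j))  # pairwise contraction
        L[n] = None   # L[n] has been absorbed into L[m]
    return [t for t in L if t is not None]

# Usage: contracting index 1 of A with index 0 of B is a matrix product.
A = np.random.default_rng(2).standard_normal((2, 3))
B = np.random.default_rng(3).standard_normal((3, 4))
(result,) = contract_pairs([A, B], [((0, 1), (1, 0))])
assert np.allclose(result, A @ B)
```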