Some understanding of Grassmann numbers out of intuition
This article (except the introduction and the afterwords) is my answer to one of the homework problems that I did when I took a quantum field theory course. The original problem asked us to verify the formula for a linear change of variables in integration. It was originally written on 2024-02-06.
Introduction
Although Grassmann numbers are a purely mathematical concept, like most people I was introduced to them in a physics class. I then had a natural question: how does one formally define Grassmann numbers? In a homework assignment in my QFT course, I found that I had to answer this question in order to solve one of the problems in a way that I was satisfied with.
Numbers
Let $\Lambda_0$ and $\Lambda_1$ be two abelian groups, and form their direct sum $\Lambda := \Lambda_0 \oplus \Lambda_1$. For convenience, for any $x \in \Lambda_j$ (with $j \in \{0, 1\}$), define $|x| := j$. Define a multiplication on $\Lambda$ such that
- multiplication is associative, non-degenerate (for any non-zero $x \in \Lambda$, there is some $y \in \Lambda$ such that $xy \ne 0$), and distributive over addition;
- elements of $\Lambda_0$ are commuting numbers and elements of $\Lambda_1$ are anticommuting numbers: $$xy = (-1)^{|x||y|}\,yx \quad \text{for any } x, y \in \Lambda_0 \cup \Lambda_1;$$
- and there is a unity $1$ such that $$1\,(x_1 + x_2 + \cdots + x_m) = x_1 + x_2 + \cdots + x_m$$ for any finite number of summands $x_1, \ldots, x_m \in \Lambda_0 \cup \Lambda_1$.
We then have to have $$\Lambda_0 \Lambda_0 \subseteq \Lambda_0, \qquad \Lambda_0 \Lambda_1 = \Lambda_1 \Lambda_0 \subseteq \Lambda_1, \qquad \Lambda_1 \Lambda_1 \subseteq \Lambda_0,$$ and, taking $x = y = \theta \in \Lambda_1$ in the commutation rule, $\theta^2 = -\theta^2$, i.e. $\theta^2 = 0$. Therefore, $\Lambda_0$ is a commutative ring (which I will further require to have characteristic zero), and $\Lambda_1$ is a $\Lambda_0$-module. We can then define linear functions with this structure: functions between $\Lambda_0$-modules that preserve addition and multiplication by elements of $\Lambda_0$. In this sense, the multiplication restricted to $\Lambda_1 \times \Lambda_1 \to \Lambda_0$ defines a symplectic bilinear form on $\Lambda_1$.
These are not enough to define every property we need for $\Lambda_0$ and $\Lambda_1$. I will introduce more properties as axioms later.
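Before adding those axioms, it may help to have something concrete to play with. The following is a minimal computational sketch, assuming the usual construction with finitely many anticommuting generators (essentially the exterior algebra, which, as noted in the afterwords, I do not prove to be a model of all the axioms); all names in it (`Grassmann`, `gen`) are mine, invented for this illustration. It checks the commutation rule and $\theta^2 = 0$ on a few homogeneous elements.

```python
from itertools import product

class Grassmann:
    """An element of Lambda: a real linear combination of products of generators."""
    def __init__(self, terms=None):
        # keys: sorted tuples of distinct generator indices; values: coefficients
        self.terms = {k: v for k, v in (terms or {}).items() if v != 0}

    def __add__(self, other):
        out = dict(self.terms)
        for k, v in other.terms.items():
            out[k] = out.get(k, 0) + v
        return Grassmann(out)

    def __mul__(self, other):
        out = {}
        for (k1, c1), (k2, c2) in product(self.terms.items(), other.terms.items()):
            if set(k1) & set(k2):   # a repeated generator gives zero (theta^2 = 0)
                continue
            merged, sign = list(k1 + k2), 1
            for i in range(len(merged)):            # bubble sort, flipping the
                for j in range(len(merged) - 1 - i):  # sign for each transposition
                    if merged[j] > merged[j + 1]:
                        merged[j], merged[j + 1] = merged[j + 1], merged[j]
                        sign = -sign
            key = tuple(merged)
            out[key] = out.get(key, 0) + sign * c1 * c2
        return Grassmann(out)

    def parity(self):
        """0 for commuting (Lambda_0), 1 for anticommuting (Lambda_1)."""
        degrees = {len(k) % 2 for k in self.terms} or {0}
        assert len(degrees) == 1, "not homogeneous"
        return degrees.pop()

    def __eq__(self, other):
        return (self + Grassmann({k: -v for k, v in other.terms.items()})).terms == {}

def gen(i):
    """The i-th anticommuting generator."""
    return Grassmann({(i,): 1.0})

t1, t2, t3 = gen(0), gen(1), gen(2)
even = t1 * t2             # a commuting number in Lambda_0
odd = t3 + t1 * t2 * t3    # an anticommuting number in Lambda_1
for x, y in [(t1, t2), (even, odd), (odd, odd), (even, even)]:
    s = (-1) ** (x.parity() * y.parity())
    assert x * y == Grassmann({k: s * v for k, v in (y * x).terms.items()})
assert t1 * t1 == Grassmann()   # theta^2 = 0
print("commutation rule and theta^2 = 0 hold in this model")
```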
Tensors
It seems that we need this property as an axiom: for any linear function $f\colon \Lambda_1 \to \Lambda_0$, there exists a unique $\eta \in \Lambda_1$ such that $$f(\theta) = \eta\,\theta \quad \text{for any } \theta \in \Lambda_1.$$ I call this property the first representation property, in analogy to the Riesz representation theorem. I will call linear functions that map objects to $\Lambda_0$ linear functionals, and define the dual space of a $\Lambda_0$-module as the set of all linear functionals on it.
With the first representation property, we can identify $\Lambda_1$ with its dual space so that any multilinear map (tensor) has well-defined components. From now on, fix a positive integer $n$ and consider $n$-tuples $\theta = (\theta_1, \ldots, \theta_n) \in \Lambda_1^n$ of anticommuting numbers. For any $p$-linear map $T$ on $\Lambda_1^n$ (or alternatively called a rank-$p$ tensor on $\Lambda_1^n$), we can write it uniquely in the form $$T(\theta^{(1)}, \ldots, \theta^{(p)}) = T_{j_1 \cdots j_p}\,\theta^{(1)}_{j_1} \cdots \theta^{(p)}_{j_p},$$ where the components $T_{j_1 \cdots j_p} \in \Lambda_0$, and the dummy indices are summed from $1$ to $n$. Denote the set of all rank-$p$ tensors on $\Lambda_1^n$ as $\mathcal{T}_p$.
Similarly, we can define $p$-linear maps on $\Lambda_0^n$ (or rank-$p$ tensors on $\Lambda_0^n$), whose components are also in $\Lambda_0$, and denote the set of all of them as $\tilde{\mathcal{T}}_p$. Tensors from $\mathcal{T}_p$ and those from $\tilde{\mathcal{T}}_q$ can be multiplied and contracted together without any problems. However, the result of these operations may not be in $\mathcal{T}_r$ or $\tilde{\mathcal{T}}_r$, but some tensor that takes arguments from both $\Lambda_1^n$ and $\Lambda_0^n$.
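As a small worked example of this mixing (the symbols $S$ and $U$ are chosen just for this illustration): take a rank-$1$ tensor $S \in \mathcal{T}_1$ and a rank-$1$ tensor $U \in \tilde{\mathcal{T}}_1$, $$S(\theta) = S_j\,\theta_j, \qquad U(x) = U_j\,x_j.$$ Their tensor product $$(S \otimes U)(\theta, x) = S_j\,U_k\,\theta_j\,x_k, \qquad \theta \in \Lambda_1^n,\ x \in \Lambda_0^n,$$ is a perfectly good bilinear map, but it lies in neither $\mathcal{T}_2$ nor $\tilde{\mathcal{T}}_2$, because its two arguments come from different spaces.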
Linear endomorphisms
Here we will need another property as an axiom: for any linear function $f\colon \Lambda_1 \to \Lambda_1$, there exists a unique $c \in \Lambda_0$ such that $$f(\theta) = c\,\theta \quad \text{for any } \theta \in \Lambda_1.$$ I call this property the second representation property. This is very similar to the first representation property, but it covers linear endomorphisms on $\Lambda_1$ instead of linear functionals on $\Lambda_1$.
With the second representation property, we can prove that any possible linear endomorphism $M$ on $\Lambda_1^n$ can be written as a unique matrix in $\Lambda_0^{n \times n}$ acting on the components of the argument: $$(M\theta)_j = M_{jk}\,\theta_k,$$ where $M_{jk} \in \Lambda_0$ are called the components of the linear endomorphism $M$. From now on, we do not need to distinguish between matrices in $\Lambda_0^{n \times n}$ and linear endomorphisms on $\Lambda_1^n$.
For a matrix $M$, we can define its determinant as $$\det M := \epsilon_{j_1 \cdots j_n}\,M_{j_1 1}\,M_{j_2 2} \cdots M_{j_n n},$$ where $\epsilon$ is the Levi-Civita symbol, which is a completely antisymmetric rank-$n$ tensor on $\Lambda_1^n$ whose components take values in $\{-1, 0, 1\} \subseteq \Lambda_0$ (the images of the integers $-1, 0, 1$ under the natural ring homomorphism from $\mathbb{Z}$ to $\Lambda_0$), normalized by $\epsilon_{1 2 \cdots n} = 1$.
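The determinant defined this way obeys the contraction identity $\epsilon_{j_1 \cdots j_n}\,M_{j_1 k_1} \cdots M_{j_n k_n} = \det M\;\epsilon_{k_1 \cdots k_n}$, which is exactly what the change of variables in the last section will rely on. Below is a small numerical check of this identity in Python, for an ordinary real matrix standing in for a matrix in $\Lambda_0^{n \times n}$; the function names are mine and the example matrix is arbitrary.

```python
from itertools import permutations, product

def levi_civita(idx):
    """Sign of idx as a permutation of range(n); 0 if an index repeats."""
    if len(set(idx)) != len(idx):
        return 0
    sign, idx = 1, list(idx)
    for i in range(len(idx)):
        while idx[i] != i:      # cycle sort, flipping the sign per swap
            j = idx[i]
            idx[i], idx[j] = idx[j], idx[i]
            sign = -sign
    return sign

def det(M):
    """det M = eps_{j1...jn} M_{j1,1} M_{j2,2} ... M_{jn,n} (0-indexed here)."""
    n = len(M)
    total = 0
    for js in permutations(range(n)):
        term = levi_civita(js)
        for col, j in enumerate(js):
            term *= M[j][col]
        total += term
    return total

M = [[2.0, 1.0, 0.0],
     [0.5, 3.0, 1.0],
     [1.0, 0.0, 4.0]]
n = len(M)
d = det(M)
for ks in product(range(n), repeat=n):   # all index tuples, not only permutations
    lhs = sum(
        levi_civita(js) * M[js[0]][ks[0]] * M[js[1]][ks[1]] * M[js[2]][ks[2]]
        for js in permutations(range(n))
    )
    assert abs(lhs - d * levi_civita(ks)) < 1e-9
print("eps-contraction identity verified; det M =", d)
```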
Analytic functions
For any $T \in \mathcal{T}_p$, define a degree-$p$ monomial on $\Lambda_1^n$ as $$f_T(\theta) := T_{j_1 \cdots j_p}\,\theta_{j_1} \cdots \theta_{j_p},$$ which is a degree-$p$ homogeneous function on $\Lambda_1^n$. Note that different tensors may correspond to the same monomial. In particular, for any $p > n$, a degree-$p$ monomial must be trivial (send any input to zero), because each product $\theta_{j_1} \cdots \theta_{j_p}$ then contains a repeated variable, and $\theta^2 = 0$. Also, if there is any pair of indices such that $T$ is symmetric in exchanging them, then the monomial must be trivial. Therefore, we only need to consider those completely antisymmetric tensors when studying monomials. Denote the set of all completely antisymmetric rank-$p$ tensors on $\Lambda_1^n$ as $\mathcal{A}_p$, and then the fact that we only need antisymmetric tensors to define monomials can be written as $$\{ f_T \mid T \in \mathcal{T}_p \} = \{ f_T \mid T \in \mathcal{A}_p \}.$$
An analytic function on $\Lambda_1^n$ is defined as a sum of monomials: $$f(\theta) = \sum_{p=0}^{n} T^{(p)}_{j_1 \cdots j_p}\,\theta_{j_1} \cdots \theta_{j_p},$$ where $T^{(p)} \in \mathcal{A}_p$, whose components may be referred to as expansion coefficients. We do not need to worry about the convergence because this is a finite sum ($0 \le p \le n$). Denote the set of all analytic functions on $\Lambda_1^n$ as $\mathcal{F}$.
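For example, with $n = 2$ (and coefficient names chosen just for this illustration), the most general analytic function is $$f(\theta) = c + a_1\theta_1 + a_2\theta_2 + b\,\theta_1\theta_2, \qquad c, a_j, b \in \Lambda_0,$$ where the degree-$2$ coefficient tensor is $T^{(2)}_{jk} = \frac{b}{2}\,\epsilon_{jk}$, so that $T^{(2)}_{jk}\,\theta_j\theta_k = b\,\theta_1\theta_2$.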
Two properties of analytic functions:
- If $f \in \mathcal{F}$, then for any $\eta \in \Lambda_1^n$, the translation $\theta \mapsto f(\theta + \eta)$ is also in $\mathcal{F}$ (a worked example follows this list).
- If $f \in \mathcal{F}$, then for any matrix $M \in \Lambda_0^{n \times n}$, the linear transformation in the argument $\theta \mapsto f(M\theta)$ is also in $\mathcal{F}$.
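For the $n = 2$ example above, the first property can be seen explicitly. Expanding and collecting powers of $\theta$ (keeping the $\eta$-dependent factors on the left), $$f(\theta + \eta) = (c + a_1\eta_1 + a_2\eta_2 + b\,\eta_1\eta_2) + (a_1 - b\,\eta_2)\,\theta_1 + (a_2 + b\,\eta_1)\,\theta_2 + b\,\theta_1\theta_2,$$ which is again a sum of monomials in $\theta$. Note that the degree-$2$ coefficient $b$ is untouched by the translation; only the lower-degree coefficients shift. This observation is the essence of the next section.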
Integrals
Now we define that a linear function from $\mathcal{F}$ to $\Lambda$, written $f \mapsto \int f(\theta)\,\mathrm{d}^n\theta$, is called an integral if it satisfies the following property: $$\int f(\theta + \eta)\,\mathrm{d}^n\theta = \int f(\theta)\,\mathrm{d}^n\theta \quad \text{for any } f \in \mathcal{F} \text{ and } \eta \in \Lambda_1^n,$$ which intuitively means that an integral is invariant under translation.
With this definition of an integral, we are now interested in the most general form of an integral.
Because the integral is linear, we can find its form on monomials, and then sum them up to get the form on all analytic functions. As a linear function on monomials, it must be of the form (by the second representation property) $$\int f_T(\theta)\,\mathrm{d}^n\theta = I^{(p)}_{j_1 \cdots j_p}\,T_{j_1 \cdots j_p}, \qquad T \in \mathcal{A}_p,$$ where $I^{(p)}$ does not depend on $T$ (and may be taken completely antisymmetric without loss of generality). Plug this form into the translational invariance of the integral, and we have $$\sum_{k=0}^{p} \binom{p}{k}\,T_{j_1 \cdots j_p}\,\eta_{j_1} \cdots \eta_{j_{p-k}}\,I^{(k)}_{j_{p-k+1} \cdots j_p} = T_{j_1 \cdots j_p}\,I^{(p)}_{j_1 \cdots j_p}$$ (here the binomial coefficient $\binom{p}{k}$ should be regarded as its image under the natural ring homomorphism from $\mathbb{Z}$ to $\Lambda_0$, which must be non-zero because $\Lambda_0$ has characteristic zero). Regarding $T$ as the independent variable, this equation is a homogeneous linear equation associated with the linear operator $L_\eta$ on $\mathcal{A}_p$ defined as $$L_\eta(T) := T_{j_1 \cdots j_p}\,I^{(p)}_{j_1 \cdots j_p} - \sum_{k=0}^{p} \binom{p}{k}\,T_{j_1 \cdots j_p}\,\eta_{j_1} \cdots \eta_{j_{p-k}}\,I^{(k)}_{j_{p-k+1} \cdots j_p}.$$ For the solution set of the linear equation $L_\eta(T) = 0$ to be the whole space $\mathcal{A}_p$, we need $L_\eta = 0$. Again by the second representation property, we need all the components to vanish (strictly speaking, we need the completely antisymmetric part to vanish, but they are already completely antisymmetric): $$I^{(p)}_{[j_1 \cdots j_p]} - \sum_{k=0}^{p} \binom{p}{k}\,\eta_{[j_1} \cdots \eta_{j_{p-k}}\,I^{(k)}_{j_{p-k+1} \cdots j_p]} = 0,$$ where the brackets denote complete antisymmetrization over $j_1, \ldots, j_p$. The first term cancels with the $k = p$ term in the sum, so this equation does not impose any requirement for $I^{(p)}$ but only imposes requirements for $I^{(k)}$ with $k < p$. Then, we can do induction on $p$: the equation for $p = 0$ does nothing; the equation for $p = 1$ requires $I^{(0)}$ to vanish; the equation for $p = 2$, given that $I^{(0)}$ vanishes, now requires $I^{(1)}$ to vanish; and so on. For each $p$, the equation additionally requires $I^{(p-1)}$ to vanish. Finally, when we reach $p = n$, which is the end of the induction, we require $I^{(k)}$ to vanish for all $k < n$, and there is no requirement for $I^{(n)}$. Therefore, the integral of any monomial is zero except for the degree-$n$ monomial, and thus we only need to consider the $n$th degree term when finding the integral of an analytic function.
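To see the first steps of this induction concretely: for $p = 0$, the monomial $f_T = T$ is constant, so translational invariance holds automatically; for $p = 1$, $$\int T_j\,(\theta_j + \eta_j)\,\mathrm{d}^n\theta = I^{(1)}_j\,T_j + I^{(0)}\,\eta_j\,T_j,$$ and translational invariance demands $I^{(0)}\,\eta_j\,T_j = 0$ for every $T$ and $\eta$, which (by the non-degeneracy of multiplication) forces $I^{(0)} = 0$, i.e. the integral of any constant vanishes. This recovers the familiar rule $\int 1\,\mathrm{d}^n\theta = 0$ of Grassmann integration.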
Note that $\mathcal{A}_n = \{ c\,\epsilon \mid c \in \Lambda_0 \}$ (in other words, the most general form of a completely antisymmetric rank-$n$ tensor on $\Lambda_1^n$ is a constant in $\Lambda_0$ times the Levi-Civita symbol). Therefore, $$\int f_T(\theta)\,\mathrm{d}^n\theta = I^{(n)}_{j_1 \cdots j_n}\,T_{j_1 \cdots j_n} = c\,I^{(n)}_{j_1 \cdots j_n}\,\epsilon_{j_1 \cdots j_n},$$ where $T = c\,\epsilon$ and $c \in \Lambda_0$. The definition of an integral does not impose any requirement for $I^{(n)}$, so the number $I^{(n)}_{j_1 \cdots j_n}\,\epsilon_{j_1 \cdots j_n}$ can be any element in $\Lambda$. For convenience, define $Z := I^{(n)}_{j_1 \cdots j_n}\,\epsilon_{j_1 \cdots j_n}$, and then we have $$\int f_T(\theta)\,\mathrm{d}^n\theta = c\,Z \quad \text{for all } T = c\,\epsilon \in \mathcal{A}_n$$ (here $c$ can be recovered from $T$ by $n!\,c = T_{j_1 \cdots j_n}\,\epsilon_{j_1 \cdots j_n}$, where $n!$ is the image of the integer $n!$ under the natural ring homomorphism from $\mathbb{Z}$ to $\Lambda_0$). The integral of any monomial with its degree different from $n$ is zero, so the integral of any analytic function is just that of its degree-$n$ term: $$\int f(\theta)\,\mathrm{d}^n\theta = \int f_{T^{(n)}}(\theta)\,\mathrm{d}^n\theta = c\,Z, \qquad T^{(n)} = c\,\epsilon.$$
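Continuing the $n = 2$ example from before: there, $T^{(2)} = \frac{b}{2}\,\epsilon$, so $$\int f(\theta)\,\mathrm{d}^2\theta = \frac{b}{2}\,Z,$$ and since a translation leaves $b$ untouched (as computed in the previous section), the translational invariance of this integral is manifest, whatever $Z$ is.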
Linear change of integrated variable
Now, for a linear endomorphism $M \in \Lambda_0^{n \times n}$ and an analytic function $f \in \mathcal{F}$, consider the integral $\int f(M\theta)\,\mathrm{d}^n\theta$. We only need to consider the degree-$n$ monomial term, which is $$f_{T^{(n)}}(M\theta) = c\,\epsilon_{j_1 \cdots j_n}\,M_{j_1 k_1} \cdots M_{j_n k_n}\,\theta_{k_1} \cdots \theta_{k_n},$$ where $T^{(n)} = c\,\epsilon$ is used. Notice that $\epsilon_{j_1 \cdots j_n}\,M_{j_1 k_1} \cdots M_{j_n k_n}$ is itself a rank-$n$ completely antisymmetric tensor on $\Lambda_1^n$ (in the free indices $k_1, \ldots, k_n$), so it can also be written as a constant times $\epsilon$. By letting $(k_1, \ldots, k_n)$ be $(1, \ldots, n)$ respectively, we see that the constant is just $\det M$. Therefore, $$\int f_{T^{(n)}}(M\theta)\,\mathrm{d}^n\theta = c\,\det M\,Z = \det M \int f_{T^{(n)}}(\theta)\,\mathrm{d}^n\theta.$$ By the linearity of the integral, we have $$\int f(M\theta)\,\mathrm{d}^n\theta = \det M \int f(\theta)\,\mathrm{d}^n\theta.$$
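The following Python sketch checks this change-of-variables formula end to end, in the special case of real expansion coefficients and with the integral normalized so that $\int \theta_1 \cdots \theta_n\,\mathrm{d}^n\theta = 1$ (i.e. a fixed choice of $Z$; the formula does not depend on this choice). The dictionary representation of analytic functions and all names are mine, invented for this illustration.

```python
from itertools import permutations
import random

n = 3

def merge(key1, key2):
    """Product of monomials theta_{key1} * theta_{key2}: returns (key, sign)."""
    if set(key1) & set(key2):
        return None, 0                  # repeated variable: the product is zero
    seq, sign = list(key1 + key2), 1
    for i in range(len(seq)):           # bubble sort; a sign flip per transposition
        for j in range(len(seq) - 1 - i):
            if seq[j] > seq[j + 1]:
                seq[j], seq[j + 1] = seq[j + 1], seq[j]
                sign = -sign
    return tuple(seq), sign

def substitute(f, M):
    """Compute f(M theta) for f given as {sorted tuple of variable indices: coefficient}."""
    out = {}
    for key, coeff in f.items():
        partial = {(): coeff}
        for j in key:                   # replace theta_j by sum_k M[j][k] theta_k
            nxt = {}
            for pkey, pc in partial.items():
                for k in range(n):
                    mkey, sign = merge(pkey, (k,))
                    if sign:
                        nxt[mkey] = nxt.get(mkey, 0) + sign * pc * M[j][k]
            partial = nxt
        for pkey, pc in partial.items():
            out[pkey] = out.get(pkey, 0) + pc
    return out

def integral(f):
    """Normalized so the integral of theta_1...theta_n is 1: take the top coefficient."""
    return f.get(tuple(range(n)), 0)

def det(M):
    total = 0
    for js in permutations(range(n)):
        sign = 1
        for a in range(n):              # sign of the permutation by counting inversions
            for b in range(a + 1, n):
                if js[a] > js[b]:
                    sign = -sign
        term = sign
        for col in range(n):
            term *= M[js[col]][col]
        total += term
    return total

random.seed(0)
M = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(n)]
f = {(): 0.7, (0,): 1.2, (1, 2): -0.4, (0, 1, 2): 2.5}   # a sample analytic function
lhs = integral(substitute(f, M))
rhs = det(M) * integral(f)
assert abs(lhs - rhs) < 1e-9
print("integral of f(M theta) equals det M times integral of f:", lhs)
```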
Afterwords
Actually, before I wrote my answer, I already knew about the exterior algebra. In this article, my definition of Grassmann numbers is more abstract and puts the commuting numbers and anticommuting numbers on a more equal footing. This definition is closer to what I intuitively think Grassmann numbers could be.
There are several potential problems in this article:
- Some axioms are given, but I did not prove that they are consistent.
- Some claims are made without proof. They may turn out to be wrong.
- I did not prove that the usual definition of Grassmann numbers (with exterior algebra) can be formulated as a special case of my definition.
- I am not educated in supersymmetry, which is where Grassmann numbers are applied most. I only made my definition comply with the properties of Grassmann numbers that I learned for doing the path integral of fermionic fields.