Introduction to Vertex Operator Algebras
Prof. Christoph A. Keller
ETH Zürich
HS 2017
UNDER CONSTRUCTION
(VERSION AS OF 13.12.17)
Contents

1 Formal Power Series
  1.1 One Variable
  1.2 Multiple variables
2 Fields and Locality
  2.1 Fields
  2.2 Locality
  2.3 Heisenberg algebra
3 Vertex Algebras
  3.1 Definition of a vertex algebra
  3.2 An Example: Heisenberg VA
  3.3 Jacobi identity
4 Conformal invariance
5 Vertex Operator Algebras
  5.1 Heisenberg VOA
  5.2 Virasoro VOA
  5.3 Tensor Products
6 Modules
7 Correlation functions
  7.1 Matrix Elements
  7.2 Cluster decomposition
  7.3 Möbius transformations
8 Lattice VOAs
  8.1 Lattices
  8.2 Ingredients
    8.2.1 Fock space
    8.2.2 State-field map
    8.2.3 Conformal vector
  8.3 Construction
9 Characters, Modular Forms and Zhu's Theorem
  9.1 Characters
  9.2 Rational VOAs and Zhu's Theorem
  9.3 Modules of lattice VOAs
10 Monstrous Moonshine
  10.1 Niemeier lattices
  10.2 Sporadic groups
  10.3 The Monster VOA V♮

1 Formal Power Series

1.1 One Variable

Let a_n ∈ U, U some vector space. Define the formal power series

a(z) = Σ_{n∈Z} a_n z^n .   (1.1)

Clearly these form a vector space, which we denote by U[[z, z^{-1}]]. We denote formal Laurent polynomials by U[z, z^{-1}]. We will also encounter U[[z]][z^{-1}]. Sometimes these are called formal Laurent series and are denoted by U((z)). (The logic of the notation is that [[z]] denotes infinitely many non-negative powers of z, and [z] only finitely many non-negative powers.) For λ ∈ C we define u(λz) := Σ_{n∈Z} λ^n u_n z^n.

Example: the 'multiplicative δ-distribution'

δ(z) := Σ_{n∈Z} z^n   (1.2)

Let us now take U to be an associative algebra with unit 1, so that multiplication is defined. In practice, we will work with two types of examples:

  • U = End(V) for some vector space V.
  • U = C

and residues, Res_z a(z) := a_{-1}.   (1.9)
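For readers who like to experiment, these definitions are easy to play with on a computer. The following is a minimal sketch (not part of the notes; the helper names deriv and res are ad hoc) for U = C, representing a formal power series by the dictionary of its nonzero coefficients n ↦ a_n.

    # A formal power series a(z) = Σ_n a_n z^n over U = C, stored as {n: a_n},
    # keeping only nonzero coefficients (arbitrarily negative powers are allowed).

    def deriv(a):
        """Formal derivative: ∂_z Σ_n a_n z^n = Σ_n n a_n z^{n-1}."""
        return {n - 1: n * c for n, c in a.items() if n != 0}

    def res(a):
        """Residue (1.9): Res_z a(z) = a_{-1}."""
        return a.get(-1, 0)

    # Example: a finite window of the multiplicative δ-distribution (1.2), δ(z) = Σ_n z^n.
    delta_window = {n: 1 for n in range(-5, 6)}
    assert res(delta_window) == 1
    assert res(deriv(delta_window)) == 0   # the residue of a formal derivative always vanishes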

We can solve differential equations:

Lemma 1.2. Let R(z) ∈ U[[z]] and fix an initial value f_0. Then

∂_z f(z) = R(z) f(z)

has a unique solution f(z) ∈ U[[z]] with initial data f_0, i.e. f(z) = f_0 + O(z).

Proof. Recursion (Lemma 4.1 in [1]).
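To make the recursion explicit (a sketch, writing R(z) = Σ_{k≥0} R_k z^k and f(z) = Σ_{n≥0} f_n z^n): comparing the coefficient of z^n on both sides of ∂_z f(z) = R(z) f(z) gives

(n + 1) f_{n+1} = Σ_{k=0}^{n} R_k f_{n−k} ,   n ≥ 0,

so f_{n+1} is uniquely determined by f_0, ..., f_n, and existence and uniqueness follow by induction.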

1.2 Multiple variables

Similar definitions hold for multiple variables, e.g. for two variables:

a(z, w) = Σ_{n,m∈Z} a_{n,m} z^n w^m ∈ U[[z, z^{-1}, w, w^{-1}]]   (1.10)

Multiplication of two different variables is fine: a(z)b(w) ∈ U[[z, z^{-1}, w, w^{-1}]]. We define the binomial expansion convention:

(z + w)^n = Σ_{k=0}^{∞} \binom{n}{k} z^{n−k} w^k ∈ U[[z, z^{-1}, w]]   (1.11)

Note that for n < 0, (z + w)^n ≠ (w + z)^n! Sometimes people use the notation ι_{z,w}(z + w)^n := (z + w)^n, ι_{w,z}(z + w)^n := (w + z)^n.

Exercise 1.3. Show

  • (z + w)^n (z + w)^{−n} = 1
  • (1 − z)^{−1} − (−z + 1)^{−1} = δ(z)
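The expansion convention (1.11) is easy to check by machine. Below is a minimal sketch (not part of the notes; gbinom and coeff_of_product are ad-hoc names) that computes generalized binomial coefficients and verifies the first identity of Exercise 1.3 coefficient by coefficient.

    from math import factorial

    def gbinom(n, k):
        """Generalized binomial coefficient 'n choose k' for integer n (possibly negative), k >= 0."""
        num = 1
        for i in range(k):
            num *= n - i
        return num // factorial(k)

    # Convention (1.11): (z + w)^n = Σ_{k>=0} gbinom(n, k) z^{n-k} w^k.
    # In the product (z + w)^n (z + w)^{-n} only the powers z^{-b} w^{b}, b >= 0, occur,
    # and each such coefficient is a *finite* sum over k = 0, ..., b:
    def coeff_of_product(n, b):
        return sum(gbinom(n, k) * gbinom(-n, b - k) for k in range(b + 1))

    for n in (-1, -2, 3):
        assert [coeff_of_product(n, b) for b in range(8)] == [1] + [0] * 7
    # i.e. (z + w)^n (z + w)^{-n} = 1, as claimed in Exercise 1.3.
    # (The second identity works similarly: under the convention, (1 - z)^{-1} = Σ_{k>=0} z^k
    # while (-z + 1)^{-1} = -Σ_{k>=0} z^{-1-k}, and the difference is δ(z).)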

We can use the expansion convention (1.11) to 'shift' the argument of power series to get:

a(z + w) ∈ U[[z, z^{-1}, w]]   (1.12)

(Proof: each z^n w^m has only one contribution, and only for m ≥ 0.) We can 'set w = 0', that is extract the term w^0, to get back a(z + 0) = a(z). We can now work with the δ function in two variables, δ(z/w). (Note that there is a clash of notation with [1]! We are using multiplicative notation, and also differ by a factor of w.) We can now multiply by any power series in z, not just Laurent polynomials:

f(z) δ(z/w) = f(w) δ(z/w)   ∀ f(z) ∈ U[[z, z^{-1}]] .   (1.13)

Some more useful properties of the delta distribution:

Proposition 1.3. δ satisfies the following properties:

  1. ∀ f(z) ∈ U[[z, z^{-1}]] : Res_z f(z) w^{-1} δ(w/z) = f(w)
  2. δ(z/w) = δ(w/z)
  3. (z − w) ∂_w^{j+1} δ(w/z) = (j + 1) ∂_w^j δ(w/z)
  4. (z − w)^{j+1} ∂_w^j δ(w/z) = 0

Proof. Exercise.
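Since the proof is left as an exercise, here is a sketch of the first property (using nothing beyond the definitions above): writing f(z) = Σ_m f_m z^m and δ(w/z) = Σ_n w^n z^{−n},

Res_z f(z) w^{-1} δ(w/z) = Res_z Σ_{m,n} f_m w^{n−1} z^{m−n} = Σ_m f_m w^m = f(w) ,

since the residue keeps exactly the terms with m − n = −1, i.e. n = m + 1.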

2 Fields and Locality

2.1 Fields

From now on we will fix U = End(V). In that case, note that for multiplications to exist, a weaker condition than (1.5) can be imposed: it is enough that for every v ∈ V, the set

I_n^v = { (n_1, ..., n_k) : n_1 + n_2 + ... + n_k = n, a^1_{n_1} a^2_{n_2} ··· a^k_{n_k} v ≠ 0 }

only has finitely many elements. a^1(z) a^2(z) ··· a^k(z) is then a well-defined element of End(V)[[z, z^{-1}]], since Σ a^1_{n_1} a^2_{n_2} ··· a^k_{n_k} v is a finite sum for all v. Just to annoy everybody, we now switch conventions.

Definition 2.1. A formal power series

a(z) = Σ_{n∈Z} a_n z^{−n−1} ∈ End(V)[[z, z^{-1}]]   (2.1)

is called a field if for all v ∈ V there is a K (sometimes called the order of truncation of a on v) such that

a_n v = 0   ∀ n ≥ K .   (2.2)

An equivalent (and sometimes more useful) definition is that ∀ v ∈ V, a(z)v ∈ V[[z]][z^{-1}]. That is, a(z) ∈ Hom(V, V[[z]][z^{-1}]). We will call the space of fields E(V). Note: ∂a(z) is again a field, so E(V) is closed under taking derivatives. We now want to define a multiplication on E(V). To do this, we first need to define the notion of normal ordering. For a(z) = Σ_{n∈Z} a_n z^{−n−1} define the 'annihilation' and 'creation' parts as

a(z)_− = Σ_{n≥0} a_n z^{−n−1} ,   a(z)_+ = Σ_{n<0} a_n z^{−n−1} .

Note: We choose this particular definition to ensure that (∂a(z))_± = ∂(a(z)_±). We can then define the normal ordered product:

: a(z)b(w) : := a(z)_+ b(w) + b(w) a(z)_−   (2.3)
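As a quick check of the claim about derivatives (a short verification, not spelled out in the notes): from ∂a(z) = Σ_{n∈Z} (−n − 1) a_n z^{−n−2} one reads off the modes (∂a)_m = −m a_{m−1}. Hence

(∂a(z))_− = Σ_{m≥0} (−m) a_{m−1} z^{−m−1} = Σ_{n≥0} (−n − 1) a_n z^{−n−2} = ∂(a(z)_−) ,

where the m = 0 term vanishes, and the analogous computation for the modes with m < 0 gives (∂a(z))_+ = ∂(a(z)_+).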

2.2 Locality

We say a(z) and b(z) are mutually local if ∃K > 0 such that

(z − w)^K [a(z), b(w)] = 0   (2.7)

K is sometimes called the order of locality. One might be tempted to multiply by (z − w)^{−K} and conclude that the commutator vanishes. This however is in general illegal. In fact, we know from Proposition 1.3 that δ distributions are counterexamples. All counterexamples are of this form:

Theorem 2.5. Let f(z, w) ∈ U[[z, z^{-1}, w, w^{-1}]] be such that (z − w)^K f(z, w) = 0. We can then write

f(z, w) = Σ_{j=0}^{K−1} (c^j(w) / j!) z^{-1} ∂_w^j δ(z/w) ,

where the series c^j(w) are given by

c^j(w) = Res_z f(z, w) (z − w)^j .

Proof. Define b(z, w) := f(z, w) − Σ_{j=0}^{K−1} c^j(w) z^{-1} (1/j!) ∂_w^j δ(z/w). We have Res_z (z − w)^n b(z, w) = 0 for all n ≥ 0: for n ≥ K, both terms in the definition of b give zero by property 4 in Proposition 1.3; for n < K we can evaluate Res_z, and both contributions cancel by using property 3 in Proposition 1.3. Next write b(z, w) =: Σ_{n∈Z} a_n(w) z^n. By taking Res_z with n = 0 we conclude that a_{−1}(w) = 0. From this and taking Res_z with n = 1 we conclude a_{−2}(w) = 0, etc. b(z, w) thus only contains non-negative powers of z. Since (z − w)^K b(z, w) = 0, it then follows that b(z, w) = 0, establishing the claim.

It follows that we can write

[a(z), b(w)] = Σ_{j=0}^{K−1} ( (a(w)_j b(w)) / j! ) z^{-1} ∂_w^j δ(z/w)   (2.8)

If a(z) and b(z) are mutually local, then so are ∂a(z) and b(z). (To see this, take the derivative of (2.7) and multiply by (z − w).) We want to establish the analogue of Lemma 2.3 for local fields, i.e. establish that local fields form an algebra under normal ordered products. The following lemma, due to Dong, does that:

Lemma 2.6 (Dong). If a(z), b(z) and c(z) are pairwise mutually local, then a(z)_n b(z) and c(z) are mutually local.

Proof. As in 5.5.15 in [2].

(In particular, : a(z)b(z) : and c(z) are mutually local.) Local fields and their derivatives thus form an algebra under the normal ordered product!

2.3 Heisenberg algebra

Let us now construct an explicit example of a local field. To do this, we need to take a brief detour to the theory of Lie algebras and their universal enveloping algebras.

Definition 2.2. Lie algebra: g a vector space. A bilinear map [·, ·]_g : g × g → g is a Lie bracket if

  1. [a, b]_g = −[b, a]_g (antisymmetry)
  2. [a, [b, c]_g]_g + [b, [c, a]_g]_g + [c, [a, b]_g]_g = 0 (Jacobi identity)

The pair (g, [·, ·]_g) is called a Lie algebra.

Typical example:

Exercise 2.1. Let g be some associative algebra. Check: [a, b]_g := [a, b] = ab − ba defines a Lie bracket.

Heisenberg algebra: h is spanned by the basis vectors α_n, n ∈ Z, and the central element k. The Lie bracket is defined as

[α_m, α_n]_h = m δ_{m+n,0} k ,   [α_m, k]_h = [k, α_m]_h = [k, k]_h = 0 .   (2.9)

Exercise 2.2. Show this is a Lie algebra.
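A sketch for the exercise (added here for convenience; it uses only (2.9) and bilinearity): antisymmetry holds because for m + n = 0 one has m δ_{m+n,0} k = −n δ_{m+n,0} k, while all brackets involving k vanish on both sides; and since every bracket [α_n, α_p]_h is a multiple of the central element k, all double brackets such as [α_m, [α_n, α_p]_h]_h vanish, so the Jacobi identity reduces to 0 = 0.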

We now want to turn a Lie algebra into an (associative) algebra.

Definition 2.3. Let g be a Lie algebra. Its tensor algebra is given by

T(g) := ⊕_{i≥0} g^{⊗i} = C ⊕ g ⊕ (g ⊗ g) ⊕ ...

The universal enveloping algebra U(g) of g is the associative algebra

U(g) := T(g)/I ,

where the ideal I is given by

I = ⟨ a ⊗ b − b ⊗ a − [a, b]_g : a, b ∈ g ⟩ .

This simply means that we are modding out by the relation a ⊗ b − b ⊗ a − [a, b]_g ∼ 0. Note: in the future we will simply write ab for a ⊗ b. U(h) will be the associative algebra for defining our fields. We now want to turn U(h) into some subalgebra of End(V) for some vector space V. To this end, we first construct the Fock space V. Decompose h = h_+ ⊕ h_0 ⊕ h_− with

h_+ = ⊕_{n>0} Cα_n ,   h_− = ⊕_{n<0} Cα_n ,   h_0 = Cα_0 ⊕ Ck .   (2.10)

such that the following axioms hold:

VA1 Y(a, z)|0⟩ = a + O(z) (creativity)

VA2 [T, Y(a, z)] = ∂Y(a, z) and T|0⟩ = 0 (translation covariance)

VA3 For all a and b, Y(a, z) and Y(b, z) are mutually local (locality)

Remark: From VA1 and VA2 it follows that the translation operator is given by T a = a_{−2}|0⟩. We will also see below that Y(|0⟩, z) = I_V, the identity on V. We want to turn our Heisenberg algebra into a vertex algebra. Clearly we want to define Y(α, z) = Y(α_{−1}|0⟩, z) := α(z). What about more general states in the Fock space V, such as Y(α_{−1}α, z)? It turns out that there is no choice: their fields are uniquely determined. To show this and find the correct expression, we need to show a few propositions first.

Proposition 3.1.

  1. Y(a, z)|0⟩ = e^{zT} a
  2. e^{zT} Y(a, w) e^{−zT} = Y(a, w + z)
  3. e^{zT} Y(a, w)_± e^{−zT} = Y(a, w + z)_±

Proof. Show that both sides satisfy the same differential equation, then apply Lemma 1.2.

Proposition 3.2 (Skewsymmetry). Y(a, z)b = e^{zT} Y(b, −z)a

Proof. Using locality with Proposition 3.1 gives

(z − w)^K Y(a, z) e^{wT} b = (z − w)^K Y(b, w) e^{zT} a .

Now apply Proposition 3.1 to get

(z − w)^K Y(a, z) e^{wT} b = (z − w)^K e^{zT} Y(b, w − z) a .

We want to obtain the result by 'setting w = 0' and then dividing by z^K. This we can do if we choose K large enough, since then everything is in (End V)[[z, z − w]][z^{-1}].

Corollary 3.3. Y(|0⟩, z) = I_V

Proof. For all b we have Y(|0⟩, z)b = e^{zT} Y(b, −z)|0⟩ = e^{zT} e^{−zT} b = b.

Theorem 3.4 (Uniqueness). Let B(z) be a field that is local with respect to all fields in {Y(a, z) : a ∈ V}. Suppose that for some b ∈ V

B(z)|0⟩ = e^{zT} b .

Then B(z) = Y(b, z).

Proof. Locality gives

(z − w)^K B(z) Y(a, w)|0⟩ = (z − w)^K Y(a, w) B(z)|0⟩ ,

which we can write as

(z − w)^K B(z) e^{wT} a = (z − w)^K Y(a, w) e^{zT} b = (z − w)^K Y(a, w) Y(b, z)|0⟩ .

Choosing K large enough, we can apply locality to the right-hand side to get

(z − w)^K B(z) e^{wT} a = (z − w)^K Y(b, z) e^{wT} a .

Since both sides only contain non-negative powers of w, we can set w = 0 and divide by z^K to get B(z)a = Y(b, z)a for all a, establishing the claim.

Using all this, we are now in a position to prove a central result, which allows us to evaluate fields recursively:

Proposition 3.5. Y(a_n b, z) = (Y(a, z)_n Y(b, z))   (3.1)

Proof. Define B(z) := Y(a, z)_n Y(b, z). The idea is of course to use Theorem 3.4 to show that B(z) = Y(a_n b, z). To do this, first note that by Dong's Lemma B(z) is indeed local. Next we want to show B(z)|0⟩ = e^{zT} a_n b by using differential equations. First note that from (2.6)

B(z)|0⟩ = a_n b_{−1}|0⟩ + O(z) = a_n b + O(z) .   (3.2)

Both sides of (3.1) thus satisfy the same initial condition. They also satisfy the same differential equation ∂_z f(z) = T f(z). For Y(a_n b, z) this is immediate. For Y(a, z)_n Y(b, z) this follows from Lemma 2. and the fact that commutators satisfy the Leibniz rule. By Lemma 1.2 they thus agree. It then follows by Theorem 3.4 that indeed Y(a_n b, z) = B(z).

Remembering definition (2.6) gives a very useful and explicit formula to evaluate fields coming from general states:

Y(a_n b, w) = Res_z ( Y(a, z) Y(b, w) (z − w)^n − Y(b, w) Y(a, z) (−w + z)^n )   (3.3)

As immediate corollaries we can write:

Corollary 3.6.

[Y(a, z), Y(b, w)] = Σ_{j=0}^{K−1} ( Y(a_j b, w) / j! ) z^{-1} ∂_w^j δ(z/w)   (3.4)

VA3, locality, is actually the easiest to check: Dong's Lemma ensures that all Y(a, z) are mutually local, since α(z) is already local with itself.

This establishes that V is indeed a vertex algebra. It is clear that we can generalize this construction:

Theorem 3.9 (Existence). Let V be a vector space, |0⟩ a vector, and T an endomorphism. Let {a^α(z)}_{α∈A} (A some index set) be a local set of fields satisfying

  1. [T, a^α(z)] = ∂a^α(z)
  2. T|0⟩ = 0
  3. a^α(z)|0⟩ ∈ V[[z]], and with a^α := a^α(z)|0⟩|_{z=0} the linear map Σ_α C a^α(z) → Σ_α C a^α defined by a^α(z) ↦ a^α is injective
  4. the vectors a^{α_1}_{j_1} a^{α_2}_{j_2} ··· a^{α_n}_{j_n} |0⟩ with j_s ∈ Z_−, α_s ∈ A span V.

Then the map

Y( a^{α_1}_{j_1} a^{α_2}_{j_2} ··· a^{α_n}_{j_n} |0⟩, z ) := a^{α_1}(z)_{j_1} ( a^{α_2}(z)_{j_2} ( ··· ( a^{α_n}(z)_{j_n} I_V ) ··· ) )   (3.10)

defines a vertex algebra (V, T, |0⟩, Y).

Proof. See Theorem 4.5 in [1]. Idea: choose a basis of the a^α. Condition 3 makes (3.10) well defined. Use Dong's Lemma for locality. Both ∂ and [T, ·] are derivations with respect to the normal ordered product.

3.3 Jacobi identity

Let us discuss some more structural aspects of vertex algebras. In particular, there are other equivalent definitions of VAs. First let us address the question: in what sense is a vertex algebra actually an algebra? Note that we can consider the formal power series Y(a, z)b as a generating function of an infinite list of product operations ∗_n : V × V → V, a ∗_n b := a_n b. Are these products commutative, i.e. is

a_n b = b_n a ?   (3.11)

A quick look at e.g. Proposition 3.2 shows that they are not. Are they associative, i.e. do we have something like

a_n (b_m c) = (a_n b)_m c ?   (3.12)

Again, they are not, as e.g. (3.3) shows. The situation is however not quite as bad as it seems. Locality implies that two fields almost commute, i.e. they commute once we multiply by a suitable factor. Locality is therefore sometimes also called weak commutativity. Similarly, one can show that fields of a VA satisfy weak associativity:

Proposition 3.10 (Weak associativity). For all a, c ∈ V there exists k (depending only on a and c, not on b!) such that for any b

(z + w)^k Y(a, z + w) Y(b, w) c = (z + w)^k Y(Y(a, z)b, w) c .

A slightly different point of view is given by identities of the form (3.3): they imply that non-commutativity and non-associativity are related somehow. That is, there is an infinite number of identities that the products ∗_n have to satisfy. In fact, using the language of formal power series as generating functions, we can write the totality of these identities in the form of the so-called Jacobi identity:

Proposition 3.11 (Jacobi Identity, VA4). For any three a, b, c in a vertex algebra V we have

z_0^{-1} δ( (z_1 − z_2)/z_0 ) Y(a, z_1) Y(b, z_2) − z_0^{-1} δ( (z_2 − z_1)/(−z_0) ) Y(b, z_2) Y(a, z_1)
    = z_2^{-1} δ( (z_1 − z_0)/z_2 ) Y(Y(a, z_0)b, z_2)   (3.13)

All terms of this expression are well-defined. One way to see this is to explicitly expand out (3.13) and read off the coefficient of, say, z_0^{−l−1} z_1^{−m−1} z_2^{−n−1}, giving

Σ_{i≥0} (−1)^i \binom{l}{i} ( a_{l+m−i} b_{n+i} − (−1)^l b_{l+n−i} a_{m+i} ) = Σ_{i≥0} \binom{m}{i} ( a_{l+i} b )_{m+n−i}   (3.14)

Since the Y's are fields, the sums over i are actually finite when applied to any state c. This is how we obtain the promised infinite list of identities. (3.14) is called the Borcherds identity, and from what we have said it is clearly equivalent to the Jacobi identity (3.13). Why do we call it the Jacobi identity? It is indeed a generalization of the Jacobi identity of Lie algebras. To see this, write the adjoint action of u on v as (ad u)v := [u, v]_g. The Jacobi identity is then

(ad u)(ad v) − (ad v)(ad u) = ad( (ad u)v ) ,   (3.15)

which is exactly of the form (3.13). (In fact, (3.15) is a special case of (3.13).)

Proof. (Sketch) Indeed one can show (see Theorem 4.8 of [1]) that translation and locality imply (3.14). The components of the proof are (3.4), written as

[a_m, Y(b, w)] = Σ_{j≥0} \binom{m}{j} Y(a_j b, w) w^{m−j} ,   (3.16)

and Corollary 3.7. These are two special cases of (3.14), and it turns out that combining them is enough to establish the general case of (3.14), which in turn is equivalent to (3.13).
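To see concretely how (3.16) sits inside (3.14) (a short check that the sketch takes for granted): set l = 0 in (3.14). On the left-hand side only the i = 0 term survives, since \binom{0}{i} = 0 for i > 0, leaving a_m b_n − b_n a_m = [a_m, b_n]. The identity therefore reads

[a_m, b_n] = Σ_{i≥0} \binom{m}{i} (a_i b)_{m+n−i} ,

and multiplying by w^{−n−1} and summing over n reproduces exactly the commutator formula (3.16).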

VOA3 T = L_{−1} (translation generator)

Let us list a few immediate consequences of these axioms:

  • We can replace the requirement T = L_{−1} by the weaker requirement Y(L_{−1}a, w) = ∂Y(a, w). To see this, use (3.16) to get

    Y(L_{−1}a, w) = Y(ω_0 a, w) = [L_{−1}, a(w)] .   (5.3)

  • From VA1 it follows that

    L_n|0⟩ = 0   ∀ n ≥ −1 .   (5.4)

  • In particular, from L_0|0⟩ = 0 it follows that |0⟩ ∈ V_(0).
  • We also have

    L_0 ω = L_0 L_{−2}|0⟩ = [L_0, L_{−2}]|0⟩ = 2ω ,   (5.5)

    so that ω ∈ V_(2).

Lemma 5.1. Let a ∈ V_(wt a), i.e. a homogeneous of weight wt a. Then the mode a_n (defined from Y(a, z) = Σ_n a_n z^{−n−1}) maps

a_n : V_(m) → V_(m + wt a − n − 1) ,   (5.6)

i.e. a_n is a homogeneous operator of weight wt a_n = wt a − n − 1.

Proof. Use (3.16) to obtain

[L_0, a(w)] = Y(L_0 a, w) + Y(L_{−1}a, w) w = (wt a) a(w) + w ∂_w a(w) .   (5.7)

Multiplying by w^n and extracting the residue gives [L_0, a_n] = (wt a − n − 1) a_n, which together with VOA2 gives the result.
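As an illustration (an added example, consistent with the Heisenberg grading (5.10) below): the state α = α_{−1}|0⟩ has wt α = 1, so wt α_n = 1 − n − 1 = −n; in particular each α_{−n} raises the L_0-eigenvalue by n, which is exactly what the explicit check of VOA2 below uses.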

Physicists prefer a different convention for the modes: for a homogeneous state a they write

Y(a, z) = Σ_n a_(n) z^{−n−wt a} ,   (5.8)

such that a_(n) = a_{n+wt a−1}. The advantage of that convention is that wt a_(n) = −n. The disadvantage is of course that it only works for VOAs, and only for homogeneous states. We will continue to use the mathematicians' convention.

Remark: Let a and b be homogeneous. Then almost all terms in Y(a, z)b have positive weight. More precisely, if k > 0 is such that a_n b = 0 for all n > k, then

Y(a, z)b ∈ ( ⊕_{n ≥ wt b + wt a − k − 1} V_(n) ) [[z]][z^{-1}] .   (5.9)
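A familiar special case of the physicists' convention (spelled out here for orientation; it matches (5.27) and (5.16) below): for the conformal vector ω, with wt ω = 2, one has ω_(n) = ω_{n+1}, so Y(ω, z) = Σ_n L_n z^{−n−2} with L_n = ω_(n) = ω_{n+1}, and indeed wt L_n = −n.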

5.1 Heisenberg VOA

We have already established that the Heisenberg algebra is a vertex algebra. Let us now show that it is actually also a vertex operator algebra. The grading of V was introduced in the proof of Proposition 2.7, i.e.

wt( α_{−n_1} ··· α_{−n_k} |0⟩ ) =: Σ_i n_i ,   (5.10)

so that clearly V_(n) = 0 for n < 0. Next we define the conformal vector

ω := (1/2) α_{−1} α .   (5.11)

From Corollary 3.7 we immediately get

L_n = Res_z z^{n+1} Y(ω, z) = (1/2) Res_z z^{n+1} : Y(α, z) Y(α, z) : = (1/2) Σ_{m∈Z} : α_m α_{n−m} : .   (5.12)

Note in particular that L_{−1} = T, which establishes VOA3. Next we want to show VOA2. For this we use the following lemma:

Lemma 5.2.

[L_m, α_n] = −n α_{m+n}   (5.13)

Proof. Start with the commutator formula in the form (3.16) and read off the coefficient of w^{−m−2}, giving

[α_n, L_m] = Σ_{j≥0} \binom{n}{j} (α_j ω)_{m+n−j+1} = n α_{m+n} ,   (5.14)

where we have used the fact that the terms α_j ω in the sum vanish unless j = 1.

To check VOA2, first note that we have L_0|0⟩ = 0. Next, specializing (5.13) to m = 0 shows that each α_{−n} increases the eigenvalue by n, since L_0 α_{−n} b = [L_0, α_{−n}] b + α_{−n} L_0 b = α_{−n} (L_0 + n) b. By induction it follows that indeed L_0 a = wt(a) a. Finally we need to show VOA1 and establish the value of the central charge c_V:

Lemma 5.3.

[L_m, L_n] = (m − n) L_{m+n} + (1/12) m (m^2 − 1) δ_{m+n,0}   (5.15)

Proof. Again read off from (3.16) (beware of the shift in moding!)

[L_m, L_n] = [ω_{m+1}, L_n] = Σ_{j≥0} \binom{m+1}{j} (ω_j ω)_{m+n−j+2} = Σ_{j≥0} \binom{m+1}{j} (L_{j−1} ω)_{m+n−j+2} .   (5.16)
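The bracket (5.15) can be sanity-checked by machine. The following minimal sketch (not from the notes; bracket, add and the basis labels are ad-hoc names) treats the central term as a multiple of a central basis element C and verifies antisymmetry and the Jacobi identity for a range of modes; it is a numerical check, not a proof.

    from fractions import Fraction
    from itertools import product

    C = "C"                                   # central basis element
    def L(m): return ("L", m)                 # basis vector L_m

    def bracket(x, y):
        """Bracket (5.15), with central element C, extended bilinearly to
        elements written as dicts {basis_vector: coefficient}."""
        out = {}
        for (a, ca), (b, cb) in product(x.items(), y.items()):
            if a == C or b == C:
                continue                      # C is central
            m, n = a[1], b[1]
            out[L(m + n)] = out.get(L(m + n), 0) + ca * cb * (m - n)
            if m + n == 0:
                out[C] = out.get(C, 0) + ca * cb * Fraction(m**3 - m, 12)
        return {k: v for k, v in out.items() if v != 0}

    def add(*terms):
        out = {}
        for t in terms:
            for k, v in t.items():
                out[k] = out.get(k, 0) + v
        return {k: v for k, v in out.items() if v != 0}

    modes = range(-4, 5)
    for m, n in product(modes, modes):
        x, y = {L(m): 1}, {L(n): 1}
        assert add(bracket(x, y), bracket(y, x)) == {}           # antisymmetry
    for m, n, p in product(modes, modes, modes):
        x, y, z = {L(m): 1}, {L(n): 1}, {L(p): 1}
        assert add(bracket(x, bracket(y, z)),
                   bracket(y, bracket(z, x)),
                   bracket(z, bracket(x, y))) == {}              # Jacobi identity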

By the Poincaré-Birkhoff-Witt theorem, as a vector space we have

V_Vir(ℓ, 0) = U(L_(≥2)) ≃ S(L_(≥2)) .   (5.25)

That is, V_Vir(ℓ, 0) is spanned by vectors of the form

L_{−n_1} ··· L_{−n_k} |0⟩_ℓ ,   n_1 ≥ ··· ≥ n_k ≥ 2 .   (5.26)

The weight of this vector is given by Σ_{i=1}^{k} n_i. Note that from the commutation relations (5.19) the grading is exactly given by the eigenvalues of L_0. We now want to construct a VA and VOA structure on V := V_Vir(ℓ, 0) by applying Theorem 3.9. We define T := L_{−1} and |0⟩ := |0⟩_ℓ. We define

ω(z) := Σ_{n∈Z} L_n z^{−n−2} .   (5.27)

This is clearly a field, as any vector in V has finite weight, and there are no vectors of negative weight. Next we want to show that it is self-local:

Exercise 5.1. Show: (z − w)^4 [ω(z), ω(w)] = 0.

Because of

[L_{−1}, ω(z)] = Σ_{n∈Z} (−n − 1) L_{n−1} z^{−n−2} = ∂ω(z) ,   (5.28)

condition 1 in Theorem 3.9 is satisfied. Conditions 2-4 are satisfied by construction, so that V_Vir(ℓ, 0) is indeed a vertex algebra. To see that it is also a VOA, note that the grading conditions are automatically satisfied. We define ω := L_{−2}|0⟩, which automatically satisfies ω ∈ V_(2), since

L_0 ω = L_0 L_{−2}|0⟩ = [L_0, L_{−2}]|0⟩ = 2ω .   (5.29)

By construction the modes of ω(z) satisfy the Virasoro algebra with central charge c_V = ℓ. The translation operator and the grading requirements are also satisfied. In conclusion we have that V_Vir(ℓ, 0) is a VOA with central charge ℓ.

5.3 Tensor Products

Let V^1, ..., V^r be VAs. The tensor product VA is constructed from the tensor product

V = V^1 ⊗ ··· ⊗ V^r .   (5.30)

Its state-field map is given by

Y(a^{(1)} ⊗ ··· ⊗ a^{(r)}, z) = Y(a^{(1)}, z) ⊗ ··· ⊗ Y(a^{(r)}, z) ,   (5.31)

the vacuum state is

|0⟩ = |0⟩ ⊗ ··· ⊗ |0⟩ ,   (5.32)

and the translation operator is

T = Σ_{i=1}^{r} I_{V^1} ⊗ ··· ⊗ T ⊗ ··· ⊗ I_{V^r} ,   (5.33)

with T acting in the i-th factor. VA1 follows immediately. VA2 can be shown using the Leibniz rule for the derivative. To show VA3, we take the order of locality K := max(K_1, ..., K_r), and then use

(z − w)^K Y(a^{(i)}, z) Y(b^{(i)}, w) = (z − w)^K Y(b^{(i)}, w) Y(a^{(i)}, z)   (5.34)

repeatedly to establish locality. We therefore have

Proposition 5.4. Let V^1, ..., V^r be vertex algebras. Then the tensor product vertex algebra (V, Y, |0⟩, T) as defined above is indeed a vertex algebra.

Now suppose that each V^i is also a VOA with central charge c_i. Then the tensor product is also a VOA. Its grading is given by

V_(n) = ⊕_{n_1+...+n_r = n} V^1_(n_1) ⊗ ··· ⊗ V^r_(n_r) ,   (5.35)

and the conformal vector is

ω = Σ_{i=1}^{r} |0⟩ ⊗ ··· ⊗ ω ⊗ ··· ⊗ |0⟩ ,   (5.36)

with ω in the i-th factor, so that the Virasoro modes are given by

L_n = Σ_{i=1}^{r} I_{V^1} ⊗ ··· ⊗ L_n ⊗ ··· ⊗ I_{V^r} ,   (5.37)

with L_n acting in the i-th factor. A straightforward computation shows that the L_n satisfy the Virasoro algebra with central charge c = Σ_i c_i. It is also straightforward to check VOA3 and VOA2. We therefore have:

Proposition 5.5. The tensor product of finitely many VOAs is a VOA whose central charge is the sum of the central charges.
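To illustrate the grading (5.35) concretely (a small sketch, not from the notes; tensor_graded_dims is an ad-hoc name): on the level of graded dimensions, tensoring VOAs is just a convolution. For the example we use the fact that the Heisenberg VOA has dim V_(n) = p(n), the number of partitions of n, since V is spanned by the states α_{−n_1}···α_{−n_k}|0⟩ of weight Σ n_i.

    def tensor_graded_dims(dims_a, dims_b):
        """Graded dimensions of A ⊗ B per (5.35):
        dim (A⊗B)_(n) = Σ_{n1+n2=n} dim A_(n1) * dim B_(n2).
        dims_a[k] is dim A_(k), and similarly for dims_b."""
        out = [0] * (len(dims_a) + len(dims_b) - 1)
        for n1, d1 in enumerate(dims_a):
            for n2, d2 in enumerate(dims_b):
                out[n1 + n2] += d1 * d2
        return out

    heis = [1, 1, 2, 3, 5]                       # partition numbers p(0), ..., p(4)
    print(tensor_graded_dims(heis, heis)[:5])    # [1, 2, 5, 10, 20]: rank-2 Heisenberg, weights 0..4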

6 Modules

Operational definition: A module of a VA V is a space W with a map V → E(W) 'such that all VA axioms that make sense hold'. To make this more precise, it is better to use the Jacobi axiom for a VA, that is, we will use VA1 and VA4. What structure do we want to maintain? We do not want to require the existence of a vacuum in the module W, and it therefore makes no sense to require creativity. We still want