Transcript for gr-sup1

Well, folks. This is the first of, I think, two supervisions, but there's also a tutorial. It's never been clear to me what the difference is between these things, because it's not as if we do anything particularly different in each of them. I have at various times tried to think whether there's some exciting thing we could do in this context, and I've never really thought of one that convinced me. Because the point of all those exciting things to do in a tutorial is to find some way of encouraging a conversation.

Because the basic idea is: there are things you don't understand, it's good to ask, and I can try to sort out those problems, which might be from the last lecture, or might be from lecture 1, because there might just have been a block at the very beginning of the whole thing, or maybe there's a little bit of tuning at the end that you just smoothed over. So this is the latest version of doing the useful thing here.

Which is: I put up questions in a bit of the padlet, and thank you to those who did that. No one liked any of them, which is a pity, because in this context the like button is actually quite useful: it says, this is an idea, I have this question too. So maybe when we do this for the next supervision or the tutorial I'll encourage you again to do that, but I think they are a useful way of proceeding, because it's always easier to ask a supplementary question than it is to ask the initial question with your hand up. So I think the way of proceeding is to talk about the questions here, which I think illustrate quite small but important, you know, road bumps, and I hope we'll use those as a way of digging into other things that are a bit obscure. And if you have any, feel free to add things to the padlet even while we're talking, or just put your hand up and ask a question the old-fashioned way: "I have this question."

OK, let's go through those vector and one-form things. Yes: thinking about this, the way that I introduce vectors, one-forms and tensors does seem a bit self-referential. It seems as if it's building on itself before it has an opportunity to get started. And I think that's partly because the definition of a tensor, as something which takes N one-forms and M vectors as arguments, blah blah blah, is presented to you before I've said what vectors and one-forms are. So it feels as if there's an inconsistency here, something missing in this definition, and I think it's just a matter of terminology. So let's go through that in an order which doesn't confuse things. (Can we see the screen? Nope. Oh, God.)

So, let's see. I think the first thing is to say that there exist sets of things which we'll call tensors. And there are multiple sets of these tensors, which have the structure, first of all, that they form vector spaces. So each of those sets, call it T(N, M), is a vector space. And if you remember, from the beginning of Part 2, simply by saying that, I've said quite a lot about what these things called tensors are. OK: I've said you can add them together to get another tensor, you can multiply by a scalar, there exists a unit... no, there's an inverse, and there exists a zero. So I've said a lot about these things just by writing that down, OK? And I'm saying that there are multiple sets of these: for each (N, M), and I haven't said what N and M are yet, this thing called T(N, M) is a vector space.
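Just to spell out what that vector-space statement buys us, in symbols (a sketch; T(N, M) is an ad-hoc label here for the space of (N, M) tensors, not necessarily the notes' notation):

\[ S, T \in T(N, M),\; a \in \mathbb{R} \;\Longrightarrow\; S + T \in T(N, M), \quad a\,T \in T(N, M), \]
\[ \exists\, 0 \ \text{with}\ T + 0 = T, \qquad \exists\, (-T) \ \text{with}\ T + (-T) = 0 . \]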

And then, for reasons which will become, you know, which will emerge as important later on in the definition, I'm going to give special names to two of these spaces. So: (1, 0) tensors are called vectors, and (0, 1) tensors are called one-forms. I'm doing nothing other than giving a conventional name to those two vector spaces. Oh, and (0, 0) tensors: the members of the (0, 0) vector space we'll call scalars. We've done no math there; we've just labelled things.

And it's at this point that we can, in a sense, legitimately give the definition that we sort of started off with in the notes: a tensor is a function. OK. So at that point I'm already giving quite a lot of information: it's a thing which takes things and turns them into other things. That sounds like a baby definition of a function, and usually that's trivial and silly, but it's a very general thing. We're used to thinking of functions as we learn them in school: something like x squared, where you take a number and out comes another number. So it's easy to fall into the rut of thinking of a function as something which manipulates numbers. But a function, in mathematical terms, is something which maps things to other things. It's a very general notion. So we're saying a tensor is a function, obviously.

[Student question about components.] So, "how to write the components of a tensor in terms of the metric": that's the question written up here. We've got, like, half a chapter to go forward before we get to that, so we haven't got nearly as far as talking about components yet. But we know that there will be components. Yes: in Part 2 we go on to talk about components quite quickly. The fact that we are saying this is a vector space means that we're importing all that technology, so we know that we will at some point be talking about components and bases. There will be a basis which spans each of these spaces. So yes, we will be talking about components eventually, because we've talked about vector spaces, but that's still a little while away, because we haven't yet really got a notion of what a tensor is. It's important to be able to do that, but not yet.

So it's a function, which is something that turns things into other things. So the question is: what does it turn into what? And the domain... sorry, the range... of all tensors is the real line. So from all the things that a tensor could be, we're now down to things which map to the real line. We map to R, not R to the n, just R. So a tensor is a thing which takes things and turns them into numbers. What's the domain of tensors?

The domain is, uh... well, I'll write it this way, so that I can link to that question: it's a product of spaces, copies of the space of one-forms crossed with copies of the space of vectors, one for each argument.

Now, I live in perpetual terror of getting these the wrong way around, OK? I may correct myself in a moment. So we're saying that that's a formal way of saying that the tensor not only has the real line as its range, but the things that go into it, the things it turns into a real number, are these things we've decided to call one-forms and vectors.
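Putting the range and domain statements together, the board line presumably amounts to this (a sketch; Λ for the space of one-forms and V for the space of vectors are labels of convenience, not necessarily the notes' symbols):

\[ T:\; \underbrace{\Lambda \times \cdots \times \Lambda}_{N\ \text{copies}} \times \underbrace{V \times \cdots \times V}_{M\ \text{copies}} \;\longrightarrow\; \mathbb{R} . \]

That is, an (N, M) tensor eats N one-forms and M vectors and returns a single real number.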

[Student:] You said that vectors and one-forms are just functions that take something into a real. [Reply:] Yes? [Student:] But then it's very weird that you say the domain is one-forms, which are, theoretically, real numbers. But you can't put a scalar into it. [Reply:] If I'm following you: this isn't the result of the application of the tensor, it's the function itself. This is why I'm being quite careful. This is a function which takes functions as arguments. That's why I was being very general about "things" and "things". They're not scalars until you apply them to something.

So what this means: a (1, 0) tensor is called a vector. By this definition, that means that a (1, 0) tensor is something which maps the set (rather informal notation) of (0, 1) tensors, the set of one-forms, to the real line. We call that a vector: it maps the set of one-forms to the real line. So a tensor is something which maps; it's not the result of applying the function, but the function itself. So that sounds very strange: a function which takes functions as arguments.
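As a one-line sketch of that special case (using A for the vector and the ad-hoc label Λ from above):

\[ A:\; \Lambda \longrightarrow \mathbb{R}, \qquad A(\tilde p) \in \mathbb{R} \quad \text{for each one-form } \tilde p . \]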

But that, in a sense, is why we introduced... it occurs to me, just now talking about it, that perhaps this is why we introduced this terminology of vectors and one-forms quite promptly: because we want to immediately start thinking of those vectors and one-forms as pointy things. That's why we jump from this rather formal way of thinking about it to the terminology of vectors and one-forms: at this point we want to immediately start thinking of those as geometrical objects. So in that case, a vector is something which takes a... vector, no, one-form, sorry... into a number.

Um. So. Does that do it? You're still not looking totally convinced. [Student:] Because you said that it takes a function. A scalar is a function, but you can't put a scalar in as an argument. [Reply:] A (0, 0) tensor is a function, yes, but you can't put it in as an argument to a tensor, no. So, sorry, yes: this definition of tensor says that we're going to restrict ourselves; tensors don't take arbitrary tensors as arguments. We've said not only that the range of the tensor is the real numbers, but that the arguments of tensors are going to be not just any tensors, but specifically one-forms and vectors. So, of all the things: you don't get a (2, 0) tensor as an argument to a tensor. We say so; we declare it.

It could be otherwise: you could imagine a structure, and quite probably there are, in the heads of mathematicians, structures where you have what look like tensors which can take other tensors as arguments. We don't do that. So this is quite a narrow definition: we've narrowed the general definition a lot, first by restricting the range to the real line, secondly by restricting the arguments to only vectors and one-forms. And we have divided up the tensors into things which take N of one and M of the other. So the (2, 0) and (1, 1) and (0, 2) tensors are all separate vector spaces. In principle, they have nothing to do with each other.

At this point, sorry, I think I'm worrying that I'm starting to talk around in circles enough that I'm making it more confusing than it is. The thing I want to get over is: yes, the definition as presented in the notes does appear to be uncomfortably self-referential, and it seems to start talking about vectors before we say what vectors are. And I think what this might illustrate is that if you try to avoid that, by avoiding the word "vector" until a little bit later, you end up with an explanation which is possibly, you know, more formally correct, but a bit more confusing. Does that feel better, in a sense? I'm not seeing any nods or shakes of heads, or tears, so...

I think, because I mentioned this situation here with the outer products, it's fairly natural to go on and talk about the tensor product. Because again, that's another thing that I think looks more exotic than it really is. It's a way of building up a tensor from other tensors. And, you know, the outer product, or the tensor product, appears in multiple contexts in mathematics, and it basically means just putting two things together. In most of the cases where you form an outer product, you're just jamming things together; you're not interleaving them, you're not doing anything complicated. You just take the two, put them next to each other, and deal with the pair together. That's the intuition that is often behind the notion of the tensor product, or outer product, in mathematics.

So in this case: suppose we have the function f(x) = x², and the function g(x) = 2x + 1, for example. Each of those is a function which maps the real line to the real line. (I'm making this up as I go along, so I may tie myself in knots here.) Nice simple functions, nothing exotic about them. But what about the function f ⊗ g, f outer-product g, as a function? What does that mean?

We're going to say that the definition of that outer product, in this context, for functions (and it's similar when you use it on other things), is that it's a function which has two arguments. And the definition of what this function does to those two arguments is straightforward: it is defined as the first function applied to the first argument, multiplied by the second function applied to the other argument. Which in this case is x² times (2y + 1). And there's nothing more to it than that.
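Written out, the worked example on the board is presumably just:

\[ (f \otimes g)(x, y) \;\equiv\; f(x)\, g(y) \;=\; x^2 \,(2y + 1) . \]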

So the outer product in this context, when you apply it to functions, is just a way of combining, because there are multiple ways you can compose functions. You know you can talk about f ∘ g, and that's (f ∘ g)(x), which is f applied to g(x), and so on. So there are multiple ways you can combine functions into other functions. Have you seen that ∘ notation? Yeah. So this is just one of the ways you can combine functions, and it means nothing more than that. So: you make a two-argument function from two one-argument functions. Specifically, by applying the first function to the first argument and the second function to the second argument, and multiplying the numbers together.

And similarly, that means that if you have a vector V (which is a function taking a one-form as argument) and you form the outer product V ⊗ p̃, for example, where p̃ is a one-form, which takes a vector as argument, then that is a function which can take a one-form and a vector as arguments. Let's call those q̃ and A. And it is such that, by definition, we apply the first function to the first argument and the second function to the second argument, and multiply; these are both numbers, multiplied together.
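By definition, then (reconstructing the board line; the argument names q̃ and A are my reading of the audio):

\[ (V \otimes \tilde p)(\tilde q, A) \;=\; V(\tilde q)\;\tilde p(A), \]

a product of two real numbers.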

So we're not doing anything there beyond what we did above, except that we're doing it with specific functions which are vectors and one-forms. And you can do this (we don't here, but you can) with higher-order, higher-rank tensors as well. So you could have V ⊗ T, and if T is a two-argument tensor, then V ⊗ T would be a three-argument tensor. So this is the way of combining rank-one objects into a rank-two object. And what's happening here with the outer product is also that idea of, you know, just jamming multiple spaces together, which then become the domain of the (N, M) tensor. So it's the same notion: the sort of straightforward jamming-together.

So if we look at that particular question there: what we're doing here is just what we did above, but for the special case where the vectors we're talking about are... why is that blue, I wonder?... oh, the special case where the vectors in question are the basis vectors. So there's this T. Now, there's a couple of things going on here, one of which is the question of components. This object, e_l ⊗ e_m ⊗ e_n: that's a function.

Which takes what as arguments? Three one-forms, yes. So let's call them p̃¹, p̃², p̃³. And the value of that function... it is therefore a (3, 0) tensor, purely because it takes three one-forms as arguments; that's the definition. And the value of that function applied to those arguments is e_l applied to p̃¹, times e_m applied to p̃², times e_n applied to p̃³.
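In symbols, presumably (the superscripts just label the three one-form arguments):

\[ (e_l \otimes e_m \otimes e_n)(\tilde p^{1}, \tilde p^{2}, \tilde p^{3}) \;=\; e_l(\tilde p^{1})\; e_m(\tilde p^{2})\; e_n(\tilde p^{3}) . \]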

But we're doing a little bit more here, because the tensor we actually want... so: we've defined a tensor there, a (3, 0) tensor, for every possible value of l, m and n. The tensor that we actually want in this case is a linear combination of those. So it's e₁ ⊗ e₁ ⊗ e₁, plus some multiple of e₁ ⊗ e₁ ⊗ e₂, plus some multiple of e₁ ⊗ e₁ ⊗ e₃, and so on. So it's a large number of terms.

Uh, oh, sorry, the ω there is different. That's OK. OK, switching to what's in the first place: T^{lm}_n of e_l ⊗ e_m ⊗ ω̃^n. And that's a linear combination of n times n times n terms. [Student asks about the extent of the sum.] Yeah, so there are n times n times n terms in that sum.
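So the expression being built is presumably this (a sketch; the index placement, with two basis-vector slots and one basis one-form slot, is my reading of the audio and of the notes' equation referred to later):

\[ T \;=\; T^{lm}{}_{n}\; e_l \otimes e_m \otimes \tilde\omega^{\,n}, \]

with the summation convention implying the triple sum over l, m and n.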

And that means that when you apply T to things, we just drop the one-form and vector arguments into the argument positions of each of these things, and it's straightforward real-number multiplication of the results, added up. So there are two things happening here. First of all, in the top line, we're defining this tensor as the sum of n by n by n outer products, with a different coefficient, a different multiple, of each of those, which we can choose.

[Student asks about the notation.] Just because... I mean the same thing in both cases; it's the same thing, yes. So, rather than writing... and to slightly avoid confusion: I keep changing my mind about how to write the cross, because it can look a bit like an x.

So there's two things happening here. First: that multiplication there is just simple multiplication of real numbers. There's nothing exciting there; that's just 2 × 3. And then there: that's our sum of multiple terms, with coefficients given by this matrix here. Essentially, we'll call it a matrix for now. Its elements are T^{lm}_n, with the summation convention.

[Student asks whether it is a matrix.] Well, right there we're thinking of it as a matrix; what it really is, we'll discover when we... OK, so temporarily we're going to think of it as just a matrix, OK? All it is there is a matrix of numbers, n by n by n numbers. And then we ask: what is the value of that tensor? What is the number to which this evaluates?

You know, we just drop in one-forms; if we specifically drop in basis one-forms and basis vectors, then, by the definition of the outer product, we get this. Well, these are simple numerical applications. The next step here is to recall that our choice of basis one-forms is always going to be dual to our basis vectors, meaning that we choose that the basis one-forms, in this situation, will always evaluate to the Kronecker delta: that would be 0 unless l is equal to i, when it will be 1.
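That duality condition, in symbols (sketch):

\[ \tilde\omega^{\,i}(e_l) \;=\; \delta^{i}{}_{l} \;=\; \begin{cases} 1 & i = l, \\ 0 & i \neq l . \end{cases} \]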

So we can replace each of these terms by deltas: δ^i_l, δ^j_m and δ^n_k. At which point we can do the three sums. So if we now do the sum over, say, l (there is a summation sign in front of that which sums l, m and n from 1 to n): when we do that for l, that term there will be 0 except when l is equal to i. So the only term that will survive from that sum is when l is equal to i. We do the sum over m in exactly the same way: the only term that survives is when m is equal to j. And we do the sum over n: the only term that survives is the one where n is equal to k.
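Chaining that together, the evaluation on the board presumably reads (a sketch, assuming the index placement used above):

\[ T(\tilde\omega^{\,i}, \tilde\omega^{\,j}, e_k) \;=\; T^{lm}{}_{n}\; e_l(\tilde\omega^{\,i})\, e_m(\tilde\omega^{\,j})\, \tilde\omega^{\,n}(e_k) \;=\; T^{lm}{}_{n}\; \delta^{i}{}_{l}\, \delta^{j}{}_{m}\, \delta^{n}{}_{k} \;=\; T^{ij}{}_{k} . \]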

So we discover, by virtue of the definition of the outer product, that when we do specifically this, and give not just arbitrary one-forms and vectors as arguments to the tensor, but specifically basis vectors and basis one-forms, then, turn the handle, what we get out is this: the matrix we started off with,

up there. And it's at this point we see: oh, that's not just a matrix. We're going to say that's the components of the tensor. And in that context we write this matrix carefully, with the indexes staggered. So although we started off saying, well, it's just a set of numbers, we discovered that this set of numbers can be thought of in a slightly enriched way, as the components, because that's what they are, of the tensor.

So all tensors will have components. For any tensor, you could do the same thing and get that answer there. So this doesn't depend on the tensor being built as an outer product; that's true for any tensor. What's special about this is that it's a particular way of building up a higher-rank tensor, which we can see to be consistent, because when we do this to that tensor, what we get out is what we started with.

Now, let's see if I can say this the right way around: not all tensors are the outer product of lower-rank tensors, but all tensors can be decomposed into a sum of outer products like this. [Student asks about the sum below equation 2.9.] Below the 2.9 equation... that is a sum of n by n by n terms, because these are just indexes which run from 1 to n. So, if you recall:

so when we write T^{lm}_n, and T = T^{lm}_n e_l ⊗ e_m ⊗ ω̃^n, what we mean is the sum from l = 1 to n (and the same for the other indexes) of T^{lm}_n e_l ⊗ e_m ⊗ ω̃^n. OK, that's confusing, right? Remember that the summation convention is taking the summation for granted.

So there's n... ah, I see, yes. So the problem is that I've used n in two different ways. Right, I didn't even think of that. So yes: this n here is a dummy index, as opposed to this n here, which is obviously the dimensionality of the vector space. Sorry, I'll make a note to myself to rewrite that.
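Written with the summation explicit, and using a different letter for the dimension (say d) to avoid the clash just mentioned, the intended statement is presumably:

\[ T \;=\; \sum_{l=1}^{d} \sum_{m=1}^{d} \sum_{n=1}^{d} T^{lm}{}_{n}\; e_l \otimes e_m \otimes \tilde\omega^{\,n} . \]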

Questions? [Student asks, roughly: could the arguments come from a vector space with a different number of dimensions?]

Yes, that's another restriction here on what we... our initial definition was of tensors as functions that take things to other things. We restricted it in various ways on the previous page, but another restriction is that the vectors that you put in as arguments here have to have the same dimensionality: they have to be in an n-dimensional vector space, with the same dimensionality as the tensor's. So all of these one-forms and vectors are all in a four-dimensional vector space, the same four-dimensional vector space.

No, they are all vector spaces, and each of the (N, M) sets of tensors corresponds to a vector space, and they must all be built on the same number of dimensions. Otherwise this doesn't work. But that's a good point, which hadn't really occurred to me until now.

In our case: we're doing GR here, not math. So we are primarily interested in the case where the vector space in question is Minkowski space, or the four-dimensional space-time we're interested in. So I think in some parts of this explanation I may have implicitly assumed that n is equal to four all the way through. But that's an important point: it's yet another way in which, from the broad range of possibilities for tensors, we've narrowed things down. We've chopped off possibilities here and there, and that's a further constraint.

[Student:] So if you had a tensor, yes, in three dimensions: how would you work it out once you've chosen bases? If you have, like, a choice of three basis vectors... [Reply:] Oh, I see, right. Yeah, right.

Well, I think, if I've heard you properly: what you're doing here is for arbitrary i, j, k. So pick an i, j, k. When you apply those three (ω̃¹, ω̃², e₄, whatever), then what you get is the corresponding component, T^{12}_4 there. So if you were doing this by hand, as it were (you'd never do this by hand), you would work your way through all of the n dimensions here, and get the n by n by n numbers here, and that would allow you to reconstruct that tensor.

But if you change your mind about what the basis vectors and basis one-forms are, you have to do the calculation all over again, and you get different numbers here, numbers which would be right for the different set of basis vectors and one-forms.

I think this might be touching on your question: this is illustrating, in a way, or touching on, the way in which the components are basis-dependent. You change your mind about the basis, so you change the arguments, so you end up getting different numbers here. But the tensor is the same in both cases. And that, you know, in a way, is an answer to the other question.

Geometrical objects. [Student:] So when you say something has an underlying geometrical object associated with it... [Reply:] I think there's a couple of things I mean there. One is: when I say that, I mean this is not a basis-dependent thing. There's a thing here which isn't basis-dependent. Clearly that discussion there would be highly basis-dependent; the point being that if you change your mind about the basis one-forms, you change the components you get out. But what this ends up supporting is a notion of the underlying tensor, the T on the left-hand side there, which isn't basis-dependent. And so one sense of "geometrical" is just, in a sort of negative sense, that by geometrical I mean: not basis-dependent. It has a sort of continuity, irrespective of the basis you choose.

But the other sense in which I'm talking about things being geometrical is that, for all these things, the way this machinery is built up, there is a fairly natural interpretation in terms of pointy things and planes that we can use to think with. So the idea of a vector as a direction with a size is a geometrical notion. The idea of a one-form as a foliation of the space, which has a direction, basically, and a size, is a geometrical notion.

And that matters because where we want to end up is talking about general relativity in an approach where we're focusing on the shape of space-time. And our route to that is one which doesn't involve getting bogged down in components from the start.

The old-fashioned way of introducing general relativity and differential geometry is, you know, to start with components at the beginning. It says: a vector is something which transforms like a vector. And I never liked that definition, because that really seems self-referential, in the sense that the idea there is that... let's not go there, because I don't want to start, you know, teaching you what I think is a slightly mad way of approaching this. But that was the old-fashioned way of doing it.

And then in the 70s, with Misner, Thorne and Wheeler in the vanguard of this change to the way differential geometry and general relativity were taught: they are the ones who emphasized the geometry-first approach, talking about things like tangent vectors and tensors defined in this rather abstract way, privileging them, introducing them first, and then from that discovering components. And so the behaviour of components under a transformation, the "transforms like a vector" business, comes out in the wash. If you like, it's a consequence.

And there's another way you can talk about these things, called geometric algebra, which is... I don't want to say fashionable, but it's a bit of a hobby in some circles just now, and that's an approach which really does see geometry as everything, and finds a way of talking about how you can define an algebra for directions: how do you combine two directions to give another direction, and so on. It's very lovely, and possibly very powerful, and there are some folk who'll bang on the table and say this is how GR should be taught. But, you know, there's quite a lot of mathematical knowledge which would have to be rewritten and rethought and so on for that to work, and this approach is OK. So if you are feeling underemployed, if you're feeling bored, then Google "geometric algebra" and you'll find some lecture notes and things about that. It's very nice. Do not distract yourself with it, OK? But I always mention it, because that is very much going all the way along the line which says geometry first. And how do you calculate with geometry, with geometrical objects? Components are one way; the other is the way folk do it in that context.

So, yeah, I suppose in that sense I do mean two things by "geometrical". There's this general approach, which is shapes first, directions first. And when I say this is a geometrical object, I mean this is a component-independent object, a basis-independent object. And yes: does that answer your question?

I think... oh, I see. [Student:] Can we go over, like, change of basis? Because I understand, like... [Reply:] Alright, OK. Is there a particular part of the notes that you're thinking of there?

Ah, OK. It's relating to... OK, so: is it the point before or after that statement that we're talking about here? [Student replies.] Oh yeah, I think so. Right, so Section 2.7, let's see. Let's go back and see what I say there. Changing bases. OK, so.

Ah, the famous words: "it is easy to show". Right. I'm going to try and come up with that "easy to show" live; I'm not convinced I can. Well, how do we do this? Let's have a go and see what happens.

So: \( A^{\bar\imath} = \Lambda^{\bar\imath}{}_{i}\, A^{i} \), and \( B^{\bar\jmath} = \Lambda^{\bar\jmath}{}_{j}\, B^{j} \). If we then say... let's say we write T as a (0, 2) tensor, so it takes two vectors. So \( T_{ij} \) will be equal to \( T(e_i, e_j) \). Right.

OK, this might work now. If we rewrite that as \( T(\Lambda^{\bar\imath}{}_{i}\, e_{\bar\imath},\; \Lambda^{\bar\jmath}{}_{j}\, e_{\bar\jmath}) \): what I have done there is simply to expand the basis vectors \( e_i \) and \( e_j \), so \( e_i \) is \( e_i \) expressed in the other, barred basis. We have an expression like this. But we said, oh yes, another restriction on the functions that tensors are is that they're linear in each argument; we said that's crucial. And what that means is we can take these numbers out. So we can write that as \( \Lambda^{\bar\imath}{}_{i}\, \Lambda^{\bar\jmath}{}_{j}\; T(e_{\bar\imath}, e_{\bar\jmath}) \), which is \( \Lambda^{\bar\imath}{}_{i}\, \Lambda^{\bar\jmath}{}_{j}\; T_{\bar\imath\bar\jmath} \). Which I think is what was required.
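Collected into one line, the derivation just done presumably reads (a sketch; the bar placement follows the summation-convention reading described below):

\[ T_{ij} \;=\; T(e_i, e_j) \;=\; T\!\left(\Lambda^{\bar\imath}{}_{i}\, e_{\bar\imath},\; \Lambda^{\bar\jmath}{}_{j}\, e_{\bar\jmath}\right) \;=\; \Lambda^{\bar\imath}{}_{i}\, \Lambda^{\bar\jmath}{}_{j}\; T(e_{\bar\imath}, e_{\bar\jmath}) \;=\; \Lambda^{\bar\imath}{}_{i}\, \Lambda^{\bar\jmath}{}_{j}\; T_{\bar\imath\bar\jmath}\, . \]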

So all we've done there is use this general property of expanding one vector in terms of another basis, using the transformation matrix there, and use the linearity to take these numbers out, leaving an expression there which we recognize as the \( \bar\imath\bar\jmath \) components of T in the barred basis. And I didn't draw on my memory for any of these things. In each of these cases, I simply put in the i's and i-bars in the only places they could go that were consistent with the summation convention.

[Student asks where the linearity gets used.] OK, finally; I'd better be quick, I'm sorry. Right: because part of the definition of tensors was that they were functions which are linear in each argument. And what "linear" means, just to recap: f is linear means that \( f(a x) = a\, f(x) \).

So the Λ's here: each of the components of that matrix (and that's an n by n matrix) is a number. So in this thing here, each Λ is a number, and therefore it can come out.
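And for a tensor, that linearity holds in each argument separately (sketch):

\[ T(a A,\; B) \;=\; a\, T(A, B), \qquad T(A,\; a B) \;=\; a\, T(A, B) . \]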

OK, I'll see you... I think I see you next Wednesday, and I think we're in a different room.