Excellent. Lecture 4. Before we get going, the plan for this lecture is to finish off Part 2, and that shouldn't be too difficult. The first half is going to be a bit more of a march through notation, I'm afraid; the second half should be more examples and a little more conceptual. So bear with me for the first half, but it should be over soon. Before we get going:
I mentioned a couple of things on the Moodle. One of which is... oh, is the microphone on? There's a button called mute... mute... OK. One thing is that on the Moodle page there is a link to... oh, I can't get to that page from here anyway.
A couple of things have changed, as I mentioned in the message I sent via Moodle. There is now a podcast of the audio recordings of the lectures, which should be findable at the end of the link that was in that Moodle message. That is a bit of an experiment, both pedagogically and technically, so any comments or thoughts about it, I am very keen to hear. Other things that are there... well, the audio podcast, but we can't get to it from this computer.
There are also the notes as HTML: a bundle of the entire collection of notes formatted as HTML. It's sort of readable and might be useful in some circumstances. It's a lot less pretty than the PDF version, but it might be of some interest.
The padlet I mentioned a moment ago... now, the Stream channel, which I think I pointed you to before, has a couple of things on it. The brief five-minute overviews of the various parts will appear there: Part 2 is there, Part 3 will appear, and Part 4 will appear also. And, as of ten minutes ago, it has copies of the first couple of my lectures from autumn 2020, the recordings of the Zoom lectures. They were the first ones I did, so they're a little bit rough, but...
I've been in two minds about putting those up. Since I was in two minds, I'll put them up. The deduction to draw is not that these lectures are unimportant, and I'll put them up with a bit of a delay in any case, but they are an extra resource to use as you think appropriate. You'll all have developed your own learning strategies over the years, and I'm sure you'll be sensible, or imaginative, about these, so I will put those two-year-old lectures up there
in time. And I will point you again to the lecture notes folder on Moodle, which has the notes, the screen versions, and the slides of the parts that have been completed. So after today I'll put up the Part 2 slides PDF. As you're aware, there's nothing in the slides that isn't in the notes, apart from the answers to the quick questions, but they will appear there in good time. I think, yes, I think I'm up to date there.
Regarding the padlet, I see there are some good questions there, a couple of which I think I've answered. As I say, if you think I haven't answered one, then change the colour back to white and I'll see it again. There was also a question about the solutions. I keep a compendium of all the solutions, together with notes on the solutions, and I'll release that either at the end of the semester or the beginning of the next one, so it's there in good time for revision but appears after the lectures and the slides. I've answered these two questions, so change the colour back to white if you disagree. I haven't had a look at these others yet, so let's take a quick look at this one.
Someone was asking, and I'll post a note on this: is there any significance to this order being swapped? No, there isn't, because the metric is a symmetric tensor. In general these two things would be importantly different, but because the metric is by hypothesis a symmetric tensor, it doesn't matter. The question was: why is the basis vector being dropped into the first slot here and the second slot here? There's no significance to that; I just changed my mind between the two cases, and it doesn't matter because $g$ is symmetric. I'll make a note of that later, and I'll get around to this other point in a moment. Right, we shall proceed.
Are there any questions, either about what I've said there, or any burning questions that haven't gone on the Moodle about what we covered last time? No. Good. OK, so the plan now is for me to get rid of this and get these up again. I think we got to... yes, we got to about there. Right, we got to the end of section 2.26.
What we discussed last time, then, was your first look at the fiddly technology of components, and all the algebra you can now do with the components of tensors and one-forms. Now we're going to go on to an important question. Last time we talked repeatedly about basis vectors and the basis one-forms that are dual to them, but there's nothing special about any basis that we pick, and that goes back to the principle of covariance that I mentioned in the first lecture.
That principle is the statement that there is nothing special about any basis that you pick: nothing special about this inertial frame, or that one, or any other. What that means in turn is that if you change your mind about what is a good set of basis vectors, or the corresponding good set of basis one-forms, then you have to be able to go from one basis set to another, and that's what we're going to talk about in, I hope, the first half of this lecture. So that means a bit more notation. What's the best way of writing this? I'll leave that up there and move to the document camera for this point.
To that point.
Four which for you?
OK. So we have a basis.
Our vector E could you
is is the late annoying.
Can you read that? OK, OK,
which we're gonna write as a IDE.
Aye with acid 4 the implied
some of the dummy index I.
And the components there are
just the. Set of of numbers.
And would you get when you?
Apply the vector A regarded as
a function. To the one form,
the one of the basis one forms.
OK. Now let's change our mind about what the basis vectors are, and rather than the basis vectors $e_i$, let's instead have the basis vectors $e_{\bar\imath}$. Now that looks a completely demented notation, not least because it's slightly fiddly to write: what I've done is put the bar over the index. There are other notational alternatives here, and different books use different things: some use a prime, some use a hat. But the general consensus is that the indicator goes over the index, and not, as you might expect, over the basis vector itself. So we don't write $e'_i$, for example, although that might be what you'd first guess. The reason for this, I hope, becomes slightly clearer shortly.
That means that just as we can write $A$ in terms of the basis vectors $e_i$, we can also write it in terms of the barred basis vectors: $A = A^{\bar\imath} e_{\bar\imath}$, where again there's an implied sum over the dummy index $\bar\imath$. That's just as good a basis as before, and just as before, the components $A^{\bar\imath}$ are the vector $A$ applied to $\omega^{\bar\imath}$, where the $\omega^{\bar\imath}$ are the basis one-forms dual to the basis vectors $e_{\bar\imath}$. All I've done here is notation: I'm just saying that this is what changing your mind looks like in this notation. So $A^{\bar\imath}$ is the component of the vector in the alternative basis. But we can expand this further.
We can, for example, write $A$ as $A^i e_i$ and then apply $\omega^{\bar\imath}$ to it, which gives $A^{\bar\imath} = A^i\,\omega^{\bar\imath}(e_i)$. I think that light is annoying, and we'll see if it might be possible... I don't think it's possible to make that more visible. What I could do is put it on the other camera; that's probably smarter. Do both. It should be legible somewhere. Ah, right, that's the problem: there's a light. OK, you'll just have to deal with it. So: we don't know what $\omega^{\bar\imath}(e_i)$ is. We can't just decide what that basis vector applied to that basis one-form is.
Instead, it is some set of numbers $\Lambda^{\bar\imath}_i$. That $\Lambda$, which is a matrix, is a collection of $n \times n$ numbers. The matrix $\Lambda$ is what characterises the relationship between one basis and the other, and it is just the basis vector $e_i$ applied to the basis one-form $\omega^{\bar\imath}$, that is, $\Lambda^{\bar\imath}_i = \omega^{\bar\imath}(e_i)$. So that's your transformation matrix, the one that takes you from one basis to the other. This is a long way round to something you've possibly thought of before in another context.
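Here is a minimal numerical sketch of that idea. The particular two-dimensional bases are invented purely for illustration (any invertible choice would do): storing the basis vectors as the columns of a matrix, the dual one-forms are the rows of its inverse, and $\Lambda^{\bar\imath}_i = \omega^{\bar\imath}(e_i)$ comes out as a matrix product.

```python
import numpy as np

# Old basis e_i and a new (barred) basis e_ibar, stored as columns.
# These particular vectors are invented purely for illustration.
E     = np.array([[1.0, 0.0],
                  [0.0, 1.0]])          # e_1, e_2 (Cartesian)
E_bar = np.array([[1.0, 1.0],
                  [0.0, 2.0]])          # e_1bar, e_2bar (some other basis)

# Dual basis one-forms are the rows of the inverse matrix,
# so that omega^i(e_j) = delta^i_j by construction.
W     = np.linalg.inv(E)
W_bar = np.linalg.inv(E_bar)
assert np.allclose(W @ E, np.eye(2))

# Lambda^{ibar}_i = omega^{ibar}(e_i): new dual one-forms applied to old basis vectors.
Lam = W_bar @ E
print(Lam)

# A vector's components transform with this matrix: A^{ibar} = Lambda^{ibar}_i A^i.
A = np.array([3.0, 4.0])                 # components in the old basis
A_bar = Lam @ A
# Check: both component sets describe the same geometrical vector.
assert np.allclose(E @ A, E_bar @ A_bar)
```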
In the same way, the components of a one-form $p$ will be $p_{\bar\imath} = p(e_{\bar\imath})$. Again we can write $p = p_i \omega^i$, apply it to $e_{\bar\imath}$, and we get $p_{\bar\imath} = \Lambda^i_{\bar\imath}\, p_i$. You can see this all works in terms of the summation convention, because the pattern of raised and lowered indexes hangs together: we end up with, as here, one lowered $\bar\imath$ on the left and a matching pair of raised and lowered $i$s on the right-hand side.
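As a small check in the same spirit, using the same invented $\Lambda$ as in the sketch above: one-form components pick up the inverse matrix, which is exactly what keeps the contraction $p_i A^i$ frame independent.

```python
import numpy as np

# The toy change-of-basis matrix from the previous sketch (values invented).
Lam = np.array([[1.0, -0.5],
                [0.0,  0.5]])             # Lambda^{ibar}_i
Lam_inv = np.linalg.inv(Lam)              # Lambda^{i}_{ibar}

A = np.array([3.0, 4.0])                  # vector components A^i
p = np.array([5.0, -2.0])                 # one-form components p_i

A_bar = Lam @ A                           # A^{ibar} = Lambda^{ibar}_i A^i
p_bar = p @ Lam_inv                       # p_{ibar} = Lambda^{i}_{ibar} p_i

# The contraction p_i A^i is a frame-independent number.
assert np.isclose(p @ A, p_bar @ A_bar)
```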
So I sort of want you to have a look through these yourself... I'm in two minds... yes, let's go through this. I'm slightly nervous here, because it's always very easy to get these indexes wrong when doing this live in front of people, but the next step is... yes, I'll do it this way. I don't want to make this too complicated, but I don't want to make it trivial either, so let's write down $\omega^i(e_j)$.
This is the basis one-form and basis vector in just one frame, and we'll write that down as $\omega^i(e_j) = \delta^i_j$. I was in two minds about going through this step by step, but I think I will, because it's useful as an illustration of the handle-turning of the relevant algebra. So we have an expression like that, but we can also expand each of those in the barred basis. What's written down here is just the thing we decided last time: that the basis one-forms are dual to the basis vectors in this very specific sense, that $\omega^1$ applied to $e_1$ is 1, and $\omega^1$ applied to any other basis vector is 0. That's all we're saying there; that's the definition of duality.
Now write down those two things in terms of the expressions we have here: $\delta^i_j = \omega^i(e_j) = \bigl(\Lambda^i_{\bar\imath}\,\omega^{\bar\imath}\bigr)\bigl(\Lambda^{\bar\jmath}_j\, e_{\bar\jmath}\bigr)$. I obviously can't write $\bar\imath$ in the second factor, because I've already used up $\bar\imath$ in the first sum, so I've got to pick another dummy index. Tensor application is linear in its argument, so we can take the $\Lambda$s out and get $\Lambda^i_{\bar\imath}\,\Lambda^{\bar\jmath}_j\;\omega^{\bar\imath}(e_{\bar\jmath})$. But we are also going to presume that in the changed basis the basis one-forms $\omega^{\bar\imath}$ are dual to the basis vectors of that basis, so this becomes $\Lambda^i_{\bar\imath}\,\Lambda^{\bar\jmath}_j\;\delta^{\bar\imath}_{\bar\jmath}$, and when we do that sum we end up with $\delta^i_j = \Lambda^i_{\bar\imath}\,\Lambda^{\bar\imath}_j$, which is the identity matrix. Remember, $\delta$ is the components of the matrix with ones on the diagonal. So this is just telling us that $\Lambda^i_{\bar\imath}$ and $\Lambda^{\bar\imath}_i$ are matrix inverses.
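If you want to see that handle-turning in code rather than on the board, here is a minimal check with the same made-up $\Lambda$ as before: contracting the two transformation matrices over the barred dummy index reproduces the Kronecker delta, which is the statement that they are matrix inverses.

```python
import numpy as np

# The toy transformation matrix from the earlier sketch (values invented).
Lam = np.array([[1.0, -0.5],
                [0.0,  0.5]])            # Lambda^{ibar}_i
Lam_inv = np.linalg.inv(Lam)             # Lambda^{i}_{ibar}

# delta^i_j = Lambda^i_{ibar} Lambda^{ibar}_j : sum over the barred dummy index.
delta = np.einsum('ia,aj->ij', Lam_inv, Lam)
assert np.allclose(delta, np.eye(2))
```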
So this is an example of something I think I said earlier: the components of a tensor are always representable as a matrix, just because you've got an array of $n \times n \times \dots$ numbers. But the converse is not true, and this is an example: $\Lambda$ is an array of numbers, a matrix, which does not have a corresponding tensor, because there isn't a geometrical object which those $\Lambda$s are the components of. OK, that's a key thing.
The column of components of a vector or a one-form, or the matrix of components of a tensor, is not just a random array of numbers: it is linked to a single geometrical object which is frame independent. But $\Lambda$ is not a tensor, and that's why, just parenthetically, we write its indexes one directly above the other. If you remember, when we write the matrix of components of a tensor we carefully stagger the indexes, because they refer to different arguments of the tensor. In this case this isn't a tensor, so there's no need to stagger the indexes, and we write one above the other. And in something like this you can start to see why the bar goes over the index rather than over the vector: it lets us keep track of which way round, or which way up if you like, the matrix $\Lambda$ is. You couldn't do that if the decoration were elsewhere. There was a question, sorry?
[Student asks whether the indexes in the last line should be $\bar\imath$ and $\bar\jmath$.] This one? That's $\Lambda^i_{\bar\imath}\,\Lambda^{\bar\imath}_j$, because we have summed over the $\bar\jmath$. If we do the sum over $\bar\jmath$, the term $\delta^{\bar\imath}_{\bar\jmath}$ will be 0 except where $\bar\jmath$ is equal to $\bar\imath$, so the only term that survives out of that sum is the one where $\bar\jmath$ is equal to $\bar\imath$. That's how we get that. Thank you, yes, it's important; part of the point of me writing this out longhand is to go through exactly that, step by step, quite slowly. So yes, we are doing the sum over $\bar\jmath$, and that's what changes the $\bar\jmath$ into an $\bar\imath$, leaving a sum over $\bar\imath$. The dummy indexes match up, and we're left with a raised $i$ and a lowered $j$, just as there is a raised $i$ and a lowered $j$ on the other side. So everything matches.
If the indexes you've got on one side don't match the indexes you've got on the other side, you have done it wrong. I don't know what you've done wrong, but you've done something wrong. So that's a check you can make at every step: through this calculation we could check, does this line have one raised $i$, one lowered $j$, a duplicated $\bar\imath$, a duplicated $\bar\jmath$? Good, that line is fine, and so on. So you can check each line against that sanity check.
Right. I'm not going to go back and forth through the notes, but by going through a similar sort of calculation you can discover that this $\Lambda$ matrix, the transformation matrix, doesn't just transform components: it also ends up being how you turn one basis into another, for example $\omega^{\bar\imath} = \Lambda^{\bar\imath}_i\, \omega^i$. Now you may think, oh my god, that's an awful lot of things to memorise. You really don't need to memorise anything, because once you've got the idea that there is, in this case, an $\omega$, a $\Lambda$ and an $\omega$, there's only one way the indexes can go. You just have to know the general idea, and the indexes fit together in only one way. So there's no memorisation here.
There is a useful exercise, I think it's exercises 2.14 to 2.17, which invites you to form a table of all of these things. It's not terribly exciting, but it helps you to drill this a little bit more, so it is useful. This sort of thing is also why I think it's useful to use bars rather than hats or primes: it's just slightly easier to write them neatly, I find, rather than having dashes or circumflexes all over the place, and later on, when we start introducing more punctuation to this notation, things would get a bit hairy otherwise. You'll end up with very neat handwriting by the end of this course; well, very neat handwriting for a few letters. Your q's might look terrible, but your i's, j's and k's will be perfect.
And these are both generalisable, it turns out, so that something like $T^{\bar\imath\bar\jmath}{}_{\bar k} = \Lambda^{\bar\imath}_i\, \Lambda^{\bar\jmath}_j\, \Lambda^k_{\bar k}\, T^{ij}{}_k$. And again, I didn't memorise anything there; I just remembered the pattern. There's one $\Lambda$ per index, and there's only one way the indexes fit in.
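Here is a quick sketch of that "one $\Lambda$ per index" pattern, again with the invented two-dimensional $\Lambda$ from before: np.einsum lets you write the transformation of a (2,1) tensor exactly as it appears on the board, with a $\Lambda$ for each raised index and an inverse $\Lambda$ for each lowered one.

```python
import numpy as np

Lam = np.array([[1.0, -0.5],
                [0.0,  0.5]])            # Lambda^{ibar}_i, the toy matrix from before
Lam_inv = np.linalg.inv(Lam)             # Lambda^{i}_{ibar}

# T^{ij}_k : components of some (2,1) tensor in the old basis (numbers invented).
T = np.arange(8.0).reshape(2, 2, 2)

# One Lambda per index; the indexes only fit together one way:
# T^{ibar jbar}_{kbar} = Lambda^{ibar}_i Lambda^{jbar}_j Lambda^k_{kbar} T^{ij}_k
T_bar = np.einsum('ai,bj,kc,ijk->abc', Lam, Lam, Lam_inv, T)

# Transforming back with the inverse pattern recovers the original components.
T_back = np.einsum('ia,jb,ck,abc->ijk', Lam_inv, Lam_inv, Lam, T_bar)
assert np.allclose(T_back, T)
```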
There are a few other remarks at the end of that section which I don't think it's useful to belabour. In the last half hour I'm going to move on a bit, but first: what questions do you have about that?
[Student question about the order of the factors.] Yes, I think that's a very good point that's worth highlighting. Because these are just matrices of numbers, these are all numbers, all things on the real line, so you can swap them over all you like within the sum. One of the advantages of this component notation is that it means you can reorder things arbitrarily, because they're just numbers, and numbers commute. When we later come on to talk about differentiating things, differential operators don't commute so freely, but numbers do. Is that what you meant? Yes. So yes, you could, if you wanted, write those in another order. Weird, but you're allowed. OK, your question there?
OK, your question there.
Swap them around without having
to to change the index.
Indeed, absolutely so I could if I wanted
to if I was really confused myself, right?
T like Lambda I bar.
Key Lambda G bar M.
Sorry. If we bar Lambda K.
Bar and Lambda. And.
G Yeah, G. Cheap bar equals T.
KM. Gee or something?
Now that would be perverse,
but but there's nothing stopping
me doing that because all that I've
done there is I've changed is.
I've changed my mind about the dummy indexes.
And to a stupid thing.
But I'm an allowed allowable thing, OK?
So, so I've just to to check one dummy.
Dummy, yes.
So I've I've still got a IIRG bar.
So I've I've I've given
myself too much leeway there,
so sorry, that should be G bar.
And that's that has to be a key bar.
So there's a a matching.
I bought gbar keybar.
But
KM&G in this case I don't mean
indexes, so so so disappear.
Please.
Yeah.
[Student: and if it were in the second position, would we then have to put $\bar k$ there?] No, because the sum would still be... OK, let's write that down; it's easiest just to show it. So if we wrote $\Lambda^{\bar\jmath}_j\, \Lambda^{\bar\imath}_i\, \Lambda^k_{\bar k}\, T^{ij}{}_k$, is that what you mean? Yes. Again, it's still just a sum over $i$, $j$ and $k$, and because they're just numbers, all that will have happened in that quite long sum, an $n \times n \times n$ sum, is that the real numbers in the product in each of those terms will be in a different order. So yes, these $\Lambda$s are just numbers. [Student: how do you decide the order of the indexes on $T$, which are up and which are down?] Right.
In both these cases, this is a (2,1) tensor; I just picked a rank to illustrate this. So it's a (2,1) tensor, and there will be two raised and one lowered index. And it's the same (2,1) tensor here, so it'll have the same pattern of indexes. Once we've got that pattern, the pattern of $\Lambda$s inside the transformation follows.
[Student: in the last line, the line below, it's a (2,1) tensor, two up and one down; do you have to swap all three, or can you just swap $\bar\imath$ to $i$ and leave $\bar\jmath$ and $\bar k$?] Yes, you would have to swap all three, because what this set of numbers is, is the components of that tensor in the transformed basis. So $T^{\bar\imath\bar\jmath}{}_{\bar k} = T(\omega^{\bar\imath}, \omega^{\bar\jmath}, e_{\bar k})$, and it wouldn't really make sense to drop in one-forms and vectors from two different bases; you would get a number, but it wouldn't mean anything. These are just different forms of the same tensor written down: this is its form in one basis, and the line below is its form in another, so the tensor here is a (2,1) tensor, yes, the same tensor, just written in a different basis. OK, I'd better move on, but one last question there. Yep.
[Student asks whether each factor equals the components of a tensor.] Yes, each of these equals the components of a tensor. Each of these is a matrix: that one is the matrix of components of a tensor, and similarly that one is the components of a tensor, so this one tensor is equal to a sum of $n \times n \times n$ terms, each of which is a number times a component of a tensor. Is that what you were asking? And the next point, where you've just got the three factors... ah, that's not supposed to be... or is it? Yes, you're right, that shouldn't be an equals sign there. Well spotted; someone is paying close attention. OK. That is all of the components gymnastics that we were introducing.
The last bit is slightly less heavy: just a few more examples of bases and transformations and spaces. The first example of a space in which we can talk about these things is flat Euclidean space with a Cartesian basis. Is it flat in the sense that... do I define that here? I think we'll come back to "flat" in a moment. You're familiar with flat Euclidean space: Euclidean space is the space we're familiar with, the one where Pythagoras's theorem works. The Cartesian basis is the $x$ and $y$ basis, so the basis vectors are orthogonal to each other and of unit length. Although, right now, up to the point where we define a metric on that space, we can't talk about lengths or about angles. But you have a metric in your head:
Pythagoras's theorem is essentially the definition of a metric; it's how you turn directions into distances. So in flat Euclidean space we have $e_x = e_1$, $e_y = e_2$, and this is a good point to say that I will sometimes swap between numbering the basis vectors and giving them more mnemonic labels. With the numbered ones we can sum over them; we can't sum over the mnemonic ones, but when we're referring to specific things it's useful to write them like that. A vector in flat Euclidean space, or in anything, is $A = A^1 e_1 + A^2 e_2$, which we can also write as $(A^1, A^2)$, just like the informal notation. That's something you learned about in secondary school, so there's nothing exotic there. This is a very long way round to come back to something you learned at school, but there's nothing extra at this point.
So what are the one-forms in this space? Well, as we said earlier, there's no constraint on what the basis one-forms are, but we can choose them so that they contract with the basis vectors to form the Kronecker delta: basis one-form number one contracted with basis vector number one is 1, and it gives zero contracted with the other ones, $\omega^i(e_j) = \delta^i_j$. And what do their components look like? They look exactly the same as the vectors'. So in flat Euclidean space the one-forms, when you turn the handle and work them out, look exactly like the vectors. You can't tell them apart, and that is why you never had to learn about them before: in a sense you've always been dealing with one-forms in flat Euclidean space, but you didn't know it, because they looked indistinguishable from vectors. If you want to think of it that way, you could say that the row vectors in the example we used earlier are the one-forms of flat Euclidean space. So you've been using them all the time, but you never had to care.
Similarly, if you did continuum mechanics in previous years and learned about the inertia tensor or the strain tensor, you never had to worry about raised and lowered indexes, because it didn't matter: there's no difference between the raised and lowered indexes, if you like, in Euclidean space. The components are the same, and the transformation between them is just the Dirac delta rather than something more complicated... not the Dirac delta, the Kronecker delta. OK, I mustn't get bogged down.
Our metric in this space is just $g$ with components $g_{ij} = \delta_{ij}$; note there are no implied summations there. So the metric for this space is just $\mathrm{diag}(1, 1)$, if you like, which means that when we apply it to a vector, twice, we get $g(A, A) = g_{ij} A^i A^j$, picking up what we did last time. And if we do those sums, $i = 1$, $j = 1$, and so on, we get $A^1 A^1 + A^2 A^2$, which is equal to $(A^1)^2 + (A^2)^2$, which is Pythagoras's theorem. So defining the metric this way is equivalent to saying that the Pythagorean theorem works: the length squared of the vector $A$ is just its $x$ component squared plus its $y$ component squared.
Oh yes, and it's this fact that means that when you raise or lower the index of a vector in this space, what you get is the same numbers. In other words, vectors and one-forms have equal components in this space.
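Here is a minimal numerical illustration of that point (nothing here beyond the identity metric): with $g_{ij} = \delta_{ij}$, lowering an index leaves the components unchanged, and $g(A, A)$ is just Pythagoras.

```python
import numpy as np

g = np.eye(2)                 # Euclidean metric in a Cartesian basis: g_ij = delta_ij
A = np.array([3.0, 4.0])      # vector components A^i (values invented)

A_lower = g @ A               # A_i = g_ij A^j : the corresponding one-form components
assert np.allclose(A_lower, A)    # identical components in this basis

length_sq = A @ g @ A         # g(A, A) = (A^1)^2 + (A^2)^2
print(length_sq)              # 25.0, i.e. Pythagoras
```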
I'll go through the polar coordinates fairly quickly, really just to draw your attention to that section. It is the same set of ideas, but with another case that you're familiar with, polar coordinates, which is slightly less trivial than Euclidean space with Cartesian coordinates. You can again work out the components of the transformation matrix; we might even have that here. OK, I'll go through this very quickly.
The basis vectors of polar coordinates are just a transformation away from the basis vectors of Cartesian coordinates. Say $e_1$ and $e_2$ are the $x$ and $y$ basis vectors you're familiar with; the radial basis vector in polar coordinates is the very obvious transformation away from those, and the tangential basis vector is a similar one. That $r$ there is not what you've seen before: usually when you've seen this transformation written down, the vectors are implicitly scaled so that the $r$ is not there, and that is what makes the ones you're more familiar with, the usual basis vectors for polar coordinates, unit vectors. These are not unit vectors; they are natural in a different sense, so that's not a typo, it's the natural thing in this context. But the point is that this is a concrete example of a change of basis, and, boom, boom, these are the components of the $\Lambda$ matrix, the transformation matrix $\Lambda$.
So that's just the transformation matrix $\Lambda$ applied to the pair of basis vectors $e_1$ and $e_2$. I'll leave you to go through that section slightly more slowly yourself; you can look at it after I put the slides up in a moment. And it's worth pointing out that the metric in polar coordinates is $g = \mathrm{diag}(1, r^2)$, which I mention just to show that it's not the diagonal unit matrix that the metric is in Cartesian coordinates. That's all in the notes.
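As a check you can do yourself, here is a sketch assuming the coordinate-basis convention described above, in which the tangential basis vector carries the factor of $r$: writing the polar basis vectors in Cartesian components and taking dot products reproduces the metric $\mathrm{diag}(1, r^2)$.

```python
import numpy as np

r, theta = 2.0, 0.7           # an arbitrary point (values invented for the check)

# Coordinate-basis vectors of plane polar coordinates, in Cartesian components.
# Note e_theta is NOT a unit vector: it carries the factor of r.
e_r     = np.array([np.cos(theta),      np.sin(theta)])
e_theta = np.array([-r * np.sin(theta), r * np.cos(theta)])

# Metric components g_ij = e_i . e_j (Euclidean dot product in the Cartesian frame).
basis = np.stack([e_r, e_theta])
g = basis @ basis.T
print(g)                      # [[1, 0], [0, r**2]]
```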
I'm going to skip over section 2.33, because although it's not false, it's potentially a little confusing, and go to another very important special example: Minkowski space. Here we do the traditional thing of writing the components in Minkowski space with Greek letters, and the metric is also traditionally written $\eta$ rather than $g$. It is the diagonal matrix $\eta = \mathrm{diag}(-1, 1, 1, 1)$.
And $\eta$ here is a matrix with those particular constant components. The vectors in this space are $A = A^\mu e_\mu$, and the metric applied to two vectors is $g(A, B) = \eta_{\mu\nu} A^\mu B^\nu = A^\mu B_\mu$, which will be equal to $-A^0 B^0 + A^1 B^1 + A^2 B^2 + A^3 B^3$ (minus, thank you), where I am sticking with the convention that in Minkowski space the basis vectors are numbered 0, 1, 2, 3.
The indexes run over 0, 1, 2, 3, and it's a four-dimensional space. So what we have got there is the inner product in Minkowski space, the inner product of special relativity that you may recall from the last time you studied special relativity.
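A small sketch of that inner product with the $(-,+,+,+)$ convention used here; the four-vector components are made-up numbers, purely to show the index gymnastics.

```python
import numpy as np

eta = np.diag([-1.0, 1.0, 1.0, 1.0])    # Minkowski metric, signature (-,+,+,+)

A = np.array([2.0, 1.0, 0.0, 3.0])      # components A^mu (invented for illustration)
B = np.array([1.0, 4.0, 2.0, 0.0])      # components B^mu

# g(A, B) = eta_{mu nu} A^mu B^nu = -A^0 B^0 + A^1 B^1 + A^2 B^2 + A^3 B^3
inner = A @ eta @ B
print(inner)                            # -2 + 4 + 0 + 0 = 2.0
```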
So this is a prompt, a hint, to perhaps drift back to those notes from two years ago and remind yourself a little of it. What was the question? [Student asks why we are using this signature rather than the one used in second year.]
Because it's arbitrary: the signature. In second year we used the metric which was plus, minus, minus, minus. If you take that, $+1, -1, -1, -1$, and add up the diagonal you get $-2$; the signature will always be either $-2$ or $+2$, depending on your convention, and a number of other equations change sign in turn. I prefer, when talking about special relativity, to use the signature $-2$, because then the interval is the same as the proper time. It is more conventional in GR to use the signature the opposite way around, so that the spatial sector has the plus, plus, plus. That's really just a matter of taste, to some extent taste and tradition; it would be perfectly reasonable to introduce special relativity with this metric, but it is arbitrary, and in what we're doing nothing changes. So, with 30 seconds left, another question? Yes.
So with in 30 seconds I've questioned yes.
The maintenance.
Or diagonal, so that that's the
expression diagonal mobile.
So it's a matrix which which is is 0
except along the along that diagonal.
OK. And? Um.
And the transformation matrix which takes you from one set of basis vectors in Minkowski space to another set of basis vectors in Minkowski space is this transformation matrix here, which you may be familiar with as the Lorentz transformation. So all the Lorentz transformation is, is how you get from one basis set, attached to the station platform, to another basis set, attached to a moving object, a moving train. That's what a Lorentz transformation is: a change of basis. And the point of all of this relativity is that the physics doesn't change when you do that.
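To connect this back to the transformation matrices from the first half, here is a hedged sketch (the boost speed is an arbitrary choice): a Lorentz boost along $x$ is just a particular $\Lambda^{\bar\mu}_\mu$, and the statement that the physics doesn't change is, in component language, that it leaves the components of $\eta$ unchanged.

```python
import numpy as np

v = 0.6                                   # boost speed in units of c (arbitrary choice)
gamma = 1.0 / np.sqrt(1.0 - v**2)

# Lorentz boost along x as a change-of-basis matrix Lambda^{mubar}_{mu}.
Lam = np.array([[ gamma,     -gamma * v, 0.0, 0.0],
                [-gamma * v,  gamma,     0.0, 0.0],
                [ 0.0,        0.0,       1.0, 0.0],
                [ 0.0,        0.0,       0.0, 1.0]])

eta = np.diag([-1.0, 1.0, 1.0, 1.0])

# eta_{mubar nubar} = Lambda^mu_{mubar} Lambda^nu_{nubar} eta_{mu nu}
# comes out as the same matrix: the interval is frame independent.
assert np.allclose(Lam.T @ eta @ Lam, eta)
```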
And that, in fairly decent time, is us done. There are a few extra remarks in section 2.4, where we talk about coordinates, bases, and just clear up bits of terminology. It would be good to have a look at that section, just to get your head straight about those. But that's me done with Part 2, so we'll go on to Part 3 next time, which is next Wednesday.