Transcript of gr-l04 ========== _0:07_: Excellent. Lecture 4, and before we get _0:12_: going, in this lecture we'll finish off _0:15_: Part 2, that is the plan, and that _0:17_: shouldn't be too difficult. _0:19_: The first half of this is going to be a bit _0:22_: more of a ramble through notation, I'm afraid. _0:24_: The second half should be more examples _0:26_: and should be a little bit more conceptual, _0:28_: so bear with me for the first half; _0:30_: it should be over soon. _0:32_: Before we get going, _0:33_: I mentioned a couple of things on the Moodle. _0:36_: One of which is that, _0:39_: oh, is the microphone on? _0:46_: I'm pressing a button called mute, _0:47_: mute, mute, OK. One thing is that _0:53_: on the Moodle page there is a link to, _0:60_: oh, I can't get to that _1:01_: page from here anyway. _1:05_: A couple of changes of things, _1:07_: as I mentioned in the message _1:09_: that I sent via Moodle: _1:10_: there is a podcast of the audio _1:12_: recordings of the lectures, which _1:13_: should be findable at the end of the _1:16_: link that was in that Moodle message. _1:19_: So that is a bit of an experiment, _1:22_: both pedagogically and technically. _1:24_: So any comments or thoughts about that, _1:26_: I am very keen to hear them. _1:30_: Other things that are here _1:33_: are the audio podcast, _1:36_: but we can't get there from this computer. _1:40_: The other thing there is the notes as HTML. _1:42_: That's a bundle of the entire _1:44_: collection of notes formatted as HTML. _1:46_: It's sort of readable and might _1:48_: be useful in some circumstances. _1:50_: It's a lot less pretty than the PDF version, _1:52_: but it might be of some interest. _1:54_: The padlet I mentioned a moment ago. _1:56_: Now, the Stream channel, which, _1:59_: well, let's not worry about that, _2:01_: which I think I pointed you to before: _2:04_: it has a couple of things.
_2:06_: It will have the brief 5-minute overviews _2:10_: of the various parts: _2:13_: Part 2 is there, _2:15_: Part 3 will appear, _2:16_: Part 4 will appear also. _2:18_: As of 10 minutes ago, _2:20_: it has copies of the first couple of my _2:26_: lectures from autumn 2020, so they're _2:29_: the recordings of the Zoom lectures, _2:31_: the first ones I did, so _2:33_: they're a little bit rough, but _2:39_: I've been in two minds about putting _2:41_: those up, and because _2:45_: I was in two minds, I'll put them up. _2:50_: The deduction to make is not _2:51_: that these lectures are not _2:53_: important, and I'll put them up _2:55_: with a bit of a delay in any case, _2:57_: but they are an extra resource _2:59_: to use as you think appropriate. _3:02_: You'll all have developed your own learning _3:04_: strategies over the years, and _3:06_: I'm sure you'll be sensible, _3:07_: or imaginative, about these, _3:09_: so I will put them up there, _3:11_: put these two-year-old lectures up, _3:16_: in time. And _3:21_: I will point you again to the lecture _3:23_: notes folder in the Moodle, which has the _3:26_: notes and the screen versions and the _3:28_: slides of the parts that are completed. _3:31_: So after today I'll put up _3:34_: the Part 2 slides PDF. _3:36_: There's nothing in the slides, as _3:38_: you're aware, that isn't _3:39_: in the notes, apart from the sort of _3:42_: answers to the quick questions, _3:43_: but they will appear there in good time. _3:47_: I think, yes, _3:48_: I think I'm up to date there. _3:50_: Regarding the padlet, _3:51_: I see there are some good questions there, _3:55_: a couple of which I think I've answered. _3:56_: As I say, if you think I haven't answered, _3:58_: if I haven't answered them, then change the _4:01_: colour back to white and I'll see it again. _4:04_: And I've just answered a couple of those.
_4:07_: I've got a compendium version of all _4:08_: the solutions, and a compendium _4:09_: of notes on the solutions. _4:11_: I'll release that either at the end of _4:12_: the semester or the beginning of the next one, _4:14_: so it's there in good time for revision, but it _4:17_: appears after the lectures, _4:18_: like some of the slides. _4:20_: And I answered these two questions, so _4:23_: I changed the colour back to white; _4:25_: change it back if you disagree. _4:27_: And I haven't had a look at these _4:29_: other two, but a quick look at this one. _4:34_: Someone asked, and I'll post a note on this: _4:37_: is there any significance _4:39_: to this order being swapped? _4:41_: No, there isn't, _4:42_: because the metric is a symmetric tensor. _4:46_: So in general these two things _4:49_: would be importantly different; _4:51_: because the metric is by _4:53_: hypothesis a symmetric tensor, _4:54_: it doesn't matter. OK, so _4:58_: the question was: why is the _5:03_: basis vector being dropped into the _5:05_: first slot here and the second slot here? _5:06_: There's no significance to that, _5:08_: OK, it's just that _5:11_: I changed my mind between the two cases, and _5:13_: it doesn't matter because g is symmetric. _5:14_: So I'll make a note of that later, _5:17_: and I'll get around to this _5:19_: point in a moment. Right, _5:23_: we shall proceed. _5:24_: Are there any questions about either _5:26_: what I've said there, or burning _5:29_: questions which haven't been in the _5:31_: Moodle, about what we covered last time? _5:33_: No. _5:35_: Good. _5:35_: OK, so the plan now is for me to get rid of _5:42_: this and _5:48_: get these up again. _6:00_: I think we've got, _6:06_: there. Didn't we? _6:11_: Yes, we've got it there. _6:24_: Right, we got to the end of section 2.26. _6:27_: So what we discussed last time, then, _6:30_: was your first look at the, _6:34_: well, the fiddly technology of
_6:37_: components, and all the _6:40_: algebra you can now do with _6:42_: the components of tensors and one-forms. _6:44_: And now we're going to go on _6:48_: to the important question of, _6:51_: well, last time _6:54_: we talked repeatedly about basis vectors and _6:57_: the basis one-forms that are dual to them, _7:01_: but there's nothing special about _7:02_: any basis that we pick, and that _7:05_: goes back to the _7:07_: principle of covariance that I _7:09_: mentioned in the first lecture. _7:11_: What that principle is, _7:12_: is the statement that there is nothing _7:15_: special about any basis that you pick: _7:17_: there's nothing special about this _7:18_: inertial frame, or that one, _7:20_: or any other. _7:22_: What that means in turn is that _7:24_: if you change your mind about _7:26_: what is a good set of basis _7:28_: vectors, or correspondingly _7:30_: a good set of basis one-forms, _7:32_: then you have to be able to go from one _7:35_: basis set to another, and that's what we're _7:37_: going to talk about in the first half, _7:39_: I hope, of this lecture. _7:42_: So what that means is a _7:46_: bit more notation. _7:49_: What's the best way of writing this? _7:52_: I'll leave that up there. _7:55_: New projector, document camera. _8:04_: Which works for you? _8:08_: OK. So we have a basis, _8:14_: and a vector A, _8:16_: that light is annoying, _8:18_: can you read that? OK, _8:20_: which we're going to write as a^i e_i, _8:24_: with, as usual, the implied _8:26_: sum over the dummy index i. _8:33_: And the components there are _8:35_: just the set of numbers _8:37_: that you get when you _8:42_: apply the vector A, regarded as _8:46_: a function, to one of _8:48_: the basis one-forms. _8:54_: OK. Now let's change our mind about _8:59_: what the basis vectors are, and _9:02_: rather than the basis vectors e_i,
_9:05_: we'll instead have the basis vectors e_ī. _9:09_: Now that looks a completely _9:11_: demented notation, _9:12_: not least because it's slightly _9:15_: fiddly to write: what I've done is _9:19_: I've put a bar over the index here. _9:25_: Now there are other ways of writing that. _9:28_: There are other notations, _9:29_: notational alternatives here, and different _9:31_: books sometimes do different things: _9:33_: some use a prime, _9:35_: some use a hat, _9:37_: but the general consensus is that _9:41_: the indicator goes over _9:43_: the index, and not, as you might _9:45_: expect, over the basis vector itself. _9:47_: So we don't write _9:50_: e'_i, for example. _9:54_: We don't write that, although that _9:56_: might be what you would first guess. _9:59_: And the reason for this, _10:00_: well, I hope it becomes slightly clearer. _10:05_: That means that just as we can _10:07_: write A in terms of the _10:10_: vectors e_i, we can also write _10:13_: it in terms of the new basis vectors: _10:17_: a^ī e_ī, where there's again an _10:21_: implied sum over the dummy index ī. _10:24_: OK. So that's just as good _10:31_: a basis as before, and just as before, _10:35_: the components a^ī are the _10:38_: vector A applied to ω^ī, _10:41_: where the ω^ī are _10:44_: the basis one-forms dual to the _10:48_: basis vectors e_ī. OK. So _10:52_: all I've done here is notation. _10:56_: I'm just showing what _10:57_: changing your mind _10:58_: looks like in this notation. _11:11_: So this is the component of _11:15_: the vector in the alternative _11:21_: basis. But we can expand A _11:25_: however we like. We can, _11:26_: for example, write A as a^i _11:30_: e_i, and apply ω^ī to that. _11:36_: I think that that light is annoying, _11:39_: and we'll see if it might not be possible
_11:42_: to move it. However, _11:47_: I don't think it's possible _11:49_: to make that more visible. _11:51_: What I could do is put it on that _11:54_: other one; that's probably smarter. _11:56_: Do both. Now look at the camera. _12:04_: It should be legible somewhere. _12:05_: Ah, right, that's the problem: _12:07_: there's a light, OK, _12:09_: you'll just have to deal with it. So: _12:16_: we don't know what that is. _12:17_: We can't just decide what that _12:20_: basis vector _12:22_: applied to that basis one-form is. _12:24_: So we'll say instead it is _12:27_: some set of numbers, Λ^ī_i. _12:35_: And that Λ, which is a matrix, _12:37_: is a collection of numbers, _12:40_: n by n numbers. _12:42_: That matrix Λ is what characterizes the _12:46_: relationship between one basis and the other. _12:50_: And it is just this basis vector e_i _12:53_: contracted with the basis one-form ω^ī. _12:56_: OK. So that's your transformation _12:59_: matrix that takes you from one _13:01_: basis to the other. _13:06_: This is probably _13:08_: a long way round to something _13:10_: you've possibly thought of _13:13_: before, in another context. _13:19_: Yeah, we've written the matrix out. And _13:26_: in the same way, _13:34_: the components of a _13:38_: one-form p will be p _13:46_: applied to e_ī. And _13:53_: again, we can write p(e_ī) as p_i ω^i _13:59_: applied to e_ī, which is equal to Λ^i_ī _14:05_: p_i, and you can see this all _14:09_: sort of works in terms of the _14:13_: summation convention, because the _14:15_: pattern of raised and lowered indexes _14:17_: hangs together, and we end up with, _14:20_: as here, one ī at the bottom, and _14:24_: a pair of raised and lowered i's _14:27_: on the right-hand side. So. _14:42_: So I was going to say, I sort of want you to
_14:48_: have a look through these yourselves. _14:54_: I'm in indecision. _14:60_: Yeah, let's go through this. _15:04_: I'm slightly nervous here, because _15:06_: it's always very easy to get these _15:08_: indexes wrong when doing this, as _15:10_: it were, live in front of people, but: _15:14_: the next step is to, _15:24_: yeah, I'll do it that way. _15:30_: We want to look at the, _15:37_: I don't want to make this too complicated, _15:38_: but I don't want to make it trivial either; _15:41_: let's write down this: ω^i(e_j), _15:49_: which is, so this is the _15:51_: basis one-form and basis vector _15:53_: in just one frame, and _15:56_: we'll write that down as δ^i_j. _15:59_: I'm in two minds about _16:01_: going through this step by step. _16:03_: I think I will, because it's useful _16:06_: as an illustration of the handle- _16:09_: turning of the relevant algebra. _16:13_: So we have an expression like that, _16:16_: but we can also write each of those out. _16:20_: And what's _16:22_: written down here is just the _16:24_: thing we decided on last time: that _16:26_: the basis one-forms are _16:27_: going to be dual to the basis _16:30_: vectors in this very specific sense, _16:33_: that ω^1 applied to e_1 is 1, _16:35_: and ω^1 applied to any other _16:37_: basis vector is 0. _16:39_: That's all that we're _16:41_: saying there; that's the _16:43_: definition of duality. _16:44_: Now let's write down these two things _16:46_: in terms of the expressions we have here. _16:48_: That's Λ^i_ī _16:50_: ω^ī _16:55_: applied to Λ^j̄_j e_j̄. _17:03_: I obviously can't write _17:06_: i here, because I've already _17:08_: used up i in the sum, _17:10_: so I've got to pick another dummy index. _17:13_: Tensor application is linear in its arguments, _17:16_: so we can take that Λ out. _17:19_: We get Λ^i_ī
Λ^j̄_j _17:25_: ω^ī(e_j̄). _17:30_: But we are also going to presume that, in _17:36_: the changed basis, the basis one-forms _17:41_: ω^ī are also dual to the basis vectors _17:45_: in that basis, so that will be _17:49_: Λ^i_ī Λ^j̄_j δ^ī_j̄, _17:55_: and when we do that sum, _17:59_: you end up with Λ^i_ī _18:08_: Λ^ī_j = δ^i_j, _18:13_: which is the identity matrix. _18:18_: Remember, δ gives the components _18:20_: of the matrix with _18:22_: ones on the diagonal. _18:25_: That's just telling you that Λ^i_ī _18:27_: and Λ^ī_i _18:31_: are matrix inverses. _18:34_: So this is an example: _18:37_: I think I said earlier that if you _18:39_: look at the components of a tensor, _18:42_: then they will always be _18:44_: representable as a matrix, _18:46_: just because you've got an array of _18:49_: n by n by whatever numbers. _18:52_: But the converse is not true, _18:54_: and this is an example: _18:56_: Λ is an example of _18:59_: an array of numbers, _19:01_: a matrix, which does not have _19:04_: a corresponding tensor, _19:05_: because there isn't a geometrical object _19:08_: which those lambdas are the components of. _19:12_: OK, that's a key thing. _19:15_: So the matrix of components of, _19:19_: or the column of components _19:23_: of a vector or one-form, or the _19:26_: matrix of components of a tensor, _19:28_: are not just a random matrix: _19:30_: they are linked to a single geometrical _19:34_: object which is frame-independent. _19:35_: But Λ is not a tensor, and that's _19:39_: why, just parenthetically, we write _19:41_: the indexes just one above the other. _19:44_: If you remember, when we write _19:47_: the matrix of components of _19:50_: a rank-two tensor,
_19:51_: we carefully staggered them, _19:53_: because they're referring to _19:55_: different arguments of the tensor. _19:57_: In this case, this isn't a tensor, _19:59_: so there's no need to stagger the indexes, _20:00_: and we write the one above the other. _20:03_: And in something like this, _20:05_: you can sort of start to see _20:08_: why the bar goes over the _20:10_: index rather than over the vector: _20:16_: because it lets us keep track _20:18_: of which way round, _20:20_: or which way up, if you like, _20:24_: the matrix Λ is. _20:27_: And you couldn't do that if the _20:29_: decoration were elsewhere. A question, sorry? _20:33_: The second one: _20:36_: should this be ī, then j̄, then j? _20:37_: Yeah, this one: _20:40_: that's Λ^i_ī Λ^ī_j, because we have _20:45_: summed over the j̄. _20:49_: That is, if we do the sum over j̄, _20:54_: this term here will be 0 _20:58_: except where j̄ is equal to ī. _21:01_: So the only term that survives out of _21:03_: that sum is the term where _21:05_: j̄ is equal to ī. _21:08_: So that's how we get that. _21:09_: Thank you. Yes, it's important; _21:11_: part of the point of me writing _21:14_: this out longhand is to go through exactly _21:16_: those steps, step by step, quite slowly. _21:19_: So yes, we are doing that sum over j̄, _21:22_: and that's what _21:24_: changes that j̄ into an ī. _21:27_: So, after the sums over the _21:31_: dummy indexes, we're left with _21:33_: a raised i and a lowered j, _21:35_: just as there's a raised i and a lowered j _21:38_: on the other side. So everything matches. OK. _21:41_: If the indexes you've got on one side _21:45_: don't match the indexes on the other side, _21:48_: you have done it wrong. _21:50_: OK, I can't tell you what you've done wrong, _21:52_: but you've done something wrong. _21:53_: So that's always a check
_21:55_: that can be applied at every step. _21:57_: Through this calculation, we could check: _22:01_: does this have one _22:03_: raised i, one lowered j, a duplicated _22:05_: ī, a duplicated j̄? _22:06_: Good, _22:07_: that line is good, and so on. _22:09_: So you can check each line against _22:11_: that sanity check. _22:25_: Right. And, I'm not going to go _22:29_: back and forth through the notes, _22:31_: but by going through a similar sort of _22:34_: calculation you can discover that this _22:37_: Λ matrix, this transformation matrix, _22:39_: doesn't just transform components; _22:42_: it also ends up being _22:47_: how you turn _22:54_: one basis into another, with _23:02_: ω^ī equal to Λ^ī_i ω^i. _23:08_: Now you may think, oh my God, _23:10_: that's an awful lot of things to memorize. _23:12_: You really don't need to memorize _23:14_: anything, because once you've _23:16_: got the idea that there's, _23:17_: in this case, an ω, a Λ and an ω, _23:21_: there's only one way the indexes can go. _23:25_: So you don't have to memorize this; _23:27_: you just have to know _23:30_: the general idea, and the _23:32_: indexes fit in it only one way. _23:33_: So there's no memorization here. _23:36_: But there's a useful exercise, _23:41_: it might be _23:44_: exercises 2.14 to 2.17, _23:47_: which invites you to form _23:49_: a table of all of these things. _23:52_: It's not terribly exciting, _23:53_: but it helps you to _23:55_: drill that a little bit more. _23:57_: So that is useful.
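As an aside, the claim that Λ^ī_i and Λ^i_ī are matrix inverses can be checked numerically. This is a minimal sketch of my own, not from the lecture; the matrix entries and component values are arbitrary illustrations.

```python
import numpy as np

# An arbitrary invertible change-of-basis matrix, playing the role of
# Lambda^{i-bar}_i: it takes components in the old basis to the new one.
L = np.array([[2.0, 1.0],
              [1.0, 1.0]])

# The matrix going the other way, Lambda^i_{i-bar}, is its matrix inverse.
L_inv = np.linalg.inv(L)

# Lambda^i_{i-bar} Lambda^{i-bar}_j = delta^i_j, the identity matrix.
assert np.allclose(L_inv @ L, np.eye(2))

# A vector's components transform, but the vector itself is unchanged:
a = np.array([3.0, 4.0])   # components a^i in the old basis
a_bar = L @ a              # components a^{i-bar} in the new basis
assert np.allclose(L_inv @ a_bar, a)   # transforming back recovers a^i
```

The point of the check is exactly the lecture's one: Λ is just an array of numbers relating two bases, not the components of any tensor.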
_23:59_: And this sort of thing also is why _24:01_: I think it's useful to use bars _24:03_: rather than hats or primes: _24:06_: it's just slightly easier _24:08_: to write them neatly, I find, _24:11_: rather than having dashes or _24:14_: circumflexes all over the place, _24:16_: and when later on we start introducing _24:18_: more punctuation into this notation, _24:19_: things would get a bit hairy, _24:22_: unless you end up with very neat _24:24_: handwriting at the end of this course. _24:26_: Well, very neat handwriting _24:28_: for a few letters: your q's _24:30_: might look terrible, but your i's, _24:32_: j's and k's will be perfect. _24:37_: And these are _24:42_: both generalizable, it turns out, _24:44_: so that something like T^īj̄_k̄ _24:47_: is equal to Λ^ī_i _24:52_: Λ^j̄_j Λ^k_k̄ T^ij_k. _24:59_: OK. And again, _25:03_: I didn't have to remember, _25:04_: I didn't memorize anything there; _25:05_: I just remembered the pattern. _25:07_: There's one Λ per index, _25:08_: and there's only one way _25:10_: that the indexes fit in. _25:20_: Now, there are a few other remarks _25:22_: at the end of that section which _25:25_: I don't think it's useful to belabor, _25:27_: and in the last half hour I'm _25:30_: going to move on a bit, but first: _25:33_: what questions have you about that? _25:38_: A question over here, yes? _25:56_: Yes, I think that's a very good _26:01_: point that's useful to highlight: _26:03_: because they're matrices, and _26:05_: because that's just a matrix of numbers, _26:08_: these are all numbers, they're all _26:10_: things on the real line, _26:12_: so you can swap them over all you _26:15_: like, because of the sum.
So one of the advantages of this _26:21_: component notation is that it means you _26:23_: can swap things around arbitrarily, because _26:24_: they're just numbers, and numbers commute. _26:26_: When we later come on to talk _26:29_: about differentiating things: _26:30_: differential operators don't commute _26:32_: with numbers, but numbers do. _26:34_: So is that what you meant? _26:37_: Yeah. _26:38_: So yes, you could, if you wanted, _26:40_: write those in another order. _26:41_: Weird, but you're allowed. _26:43_: OK, your question there: _26:45_: can you swap them around without having _26:47_: to change the indexes? _26:49_: Indeed, absolutely. So I could, if I _26:51_: really wanted to confuse myself, write _26:55_: T^īj̄_k̄ equals Λ^ī_k _26:60_: Λ^j̄_m _27:07_: Λ^g_k̄ _27:19_: T^km_g, _27:27_: or something. _27:30_: Now that would be perverse, _27:32_: but there's nothing stopping _27:36_: me doing that, because all that I've _27:38_: done there is, _27:40_: I've changed my mind about the dummy indexes. _27:43_: A stupid thing to do, _27:45_: but an allowed, allowable thing, OK? _27:50_: So, just to check, one dummy, _27:54_: dummy, yes. _27:54_: So I've still got an ī here, and, _27:59_: ah, I've given _27:60_: myself too much leeway there, _28:01_: so sorry, that should be a j̄, _28:04_: and that has to be a k̄, _28:07_: so that there's a matching _28:10_: ī, j̄, k̄. _28:13_: But _28:16_: k, m and g in this case don't mean _28:18_: anything particular; they're dummy _28:22_: indexes, so they disappear. _28:29_: And in the second position, then, would we _28:31_: have to put k̄? _28:37_: No, because _28:43_: the sum would still be, _28:46_: OK, let's write that down; _28:47_: it's easiest just to show it.
_28:51_: So if we wrote Λ^j̄_j _28:55_: Λ^ī_i Λ^k_k̄ _29:01_: T^ij_k, is that what you mean? _29:05_: Yeah. And again, _29:08_: it's still just a sum over _29:12_: i, j and k, and because they're just numbers, _29:17_: each element in that sum, _29:20_: and it's quite a long sum, _29:22_: an n by n by n sum, _29:24_: all that will have happened is that the _29:26_: real numbers in the product of each of _29:29_: those terms will be in a different order. _29:31_: So yes, _29:33_: these lambdas are just numbers. _29:36_: How do we decide the order of the _29:40_: indexes on the T, _29:43_: up or down? Right: _29:46_: in both these cases, _29:50_: this is a (2,1) tensor. _29:53_: It's a tensor _29:54_: which takes, well, I just _29:55_: picked a rank to illustrate this. _29:58_: So it's a (2,1) tensor, _29:60_: so there will be two raised and one lowered _30:04_: index. _30:05_: And it's _30:08_: a (2,1) tensor here, and it's _30:11_: the same (2,1) tensor here, _30:13_: so it'll have the same pattern of indexes. _30:16_: And once we've got that pattern, _30:18_: then the pattern of lambdas _30:21_: in this transformation follows. _30:27_: The last line? That's right, _30:29_: it is the last line below. _30:32_: So it is a (2,1) tensor, _30:35_: yes, _30:37_: two up and one down. _30:40_: Yeah. Do you have to swap all three, _30:43_: or can you just swap ī to i _30:46_: and leave j̄ as j̄? _30:48_: You would have to swap all of them, because _30:53_: what this set of numbers is, _30:56_: is the components of that tensor _30:58_: in that transformed basis. _31:00_: So, it is: _31:09_: T^īj̄_k̄ is equal to T(ω^ī, _31:13_: ω^j̄,
_31:19_: e_k̄), and it wouldn't make any sense, _31:21_: really, _31:25_: to drop in one-forms _31:31_: and vectors from two different bases: _31:34_: you'd get a number, but it wouldn't mean anything. _31:37_: These are just the different forms _31:40_: of the tensor written down: _31:43_: the one above is one form of it, _31:47_: the one below is another form, _31:51_: so this tensor _31:53_: here is a (2,1) tensor, yes. _31:55_: So it's the same, _31:59_: the same, yes, _32:01_: it's the same tensor, _32:04_: just written in a different basis, _32:08_: different forms of the same tensor, _32:09_: OK. And I'd better move on, _32:12_: but one last question there. _32:17_: Yep. _32:20_: Does it equal the components of tensors? _32:22_: Yes, it equals the components of a tensor. _32:26_: So each of these is a matrix. _32:31_: That's a matrix which is the _32:33_: matrix of components of a tensor, _32:35_: and similarly that's a matrix which _32:37_: is the components of a tensor, and _32:40_: so this _32:42_: (2,1) tensor is equal to n by n by n _32:47_: terms, which are a number times _32:49_: the components of a tensor. _32:52_: Is that what you were asking? _32:55_: And the next part, with just the three lambdas? _32:59_: Ah, so that's not, _33:01_: that's not supposed to be, _33:03_: or is it? Yes: _33:06_: that's not an equals sign there. _33:08_: Well spotted; someone _33:10_: is paying close attention. _33:12_: OK. That is all of the _33:20_: components gymnastics _33:22_: that we were introducing. _33:24_: So the last bit is slightly less, _33:28_: well, that was heavy going, and what's left is a _33:30_: few more examples of bases, _33:32_: transformations and spaces. _33:35_: So the first example of a space _33:37_: in which we can talk about these _33:40_: things is flat Cartesian space.
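Before moving on to the examples, the (2,1) transformation law discussed above, one Λ per index, can be turned into a quick numerical check. This is a sketch of my own, not from the lecture; the random components and variable names are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
T = rng.normal(size=(n, n, n))   # components T^{ij}_k of a (2,1) tensor
L = rng.normal(size=(n, n))      # Lambda^{i-bar}_i (assumed invertible)
L_inv = np.linalg.inv(L)         # Lambda^i_{i-bar}

# One Lambda per index: raised indices use L, the lowered index uses L_inv.
# T^{i-bar j-bar}_{k-bar} = L^{i-bar}_i L^{j-bar}_j (L_inv)^k_{k-bar} T^{ij}_k
T_bar = np.einsum('ai,bj,kc,ijk->abc', L, L, L_inv, T)

# The lambdas are plain numbers, so the order of the factors is irrelevant:
T_bar2 = np.einsum('bj,ai,kc,ijk->abc', L, L, L_inv, T)
assert np.allclose(T_bar, T_bar2)

# Transforming back with the inverse matrices recovers the original T^{ij}_k:
T_back = np.einsum('ia,jb,ck,abc->ijk', L_inv, L_inv, L, T_bar)
assert np.allclose(T_back, T)
```

The einsum subscripts enforce exactly the index bookkeeping described in the lecture: each free index on the left appears once on the right, and every dummy index appears exactly twice.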
_33:42_: Now, it's flat in the sense that, _33:44_: do I define that here? _33:47_: I think, _33:51_: well, we'll come back to _33:52_: "flat" in a moment. And you're _33:53_: familiar with flat Cartesian space: _33:56_: flat Euclidean space, or flat _33:59_: Euclidean space with a Cartesian basis. _34:02_: So Euclidean space is the space _34:04_: we're familiar with, where _34:07_: Pythagoras's theorem works. _34:08_: The Cartesian basis is the x and y basis, _34:12_: so things like, well, _34:15_: the basis vectors are _34:19_: orthogonal to each other _34:21_: and of unit length, although right now, _34:25_: up to the point where _34:28_: we define a metric on that space, _34:30_: we can't talk about lengths or about angles. _34:33_: But you have _34:34_: a metric in your head: _34:35_: Pythagoras's theorem is the _34:36_: definition of a metric. _34:38_: It's how you turn _34:41_: directions into distances. _34:45_: So in flat _34:48_: Euclidean space, we have _34:58_: e_x = e_1, e_y = e_2, and this is _35:05_: a good point to say that I _35:08_: will sometimes swap between _35:10_: numbering the basis vectors _35:12_: and giving them, you know, _35:14_: more mnemonic labels. _35:16_: With numbers we can sum over these things; _35:19_: we can't with the labels, _35:20_: but if we're referring to specific things, _35:22_: it's useful to write things like that. _35:26_: A vector in flat Euclidean space, _35:29_: or in anything, is A = a^1 e_1 + a^2 e_2, _35:34_: which we can also write as a column, _35:39_: just like in the familiar notation. _35:43_: And that's something you learned _35:44_: about in secondary school. _35:47_: So there's nothing exotic there. _35:50_: This is a very, _35:51_: a very long way to come back to
_35:53_: something you learned about at school, _35:55_: but there's nothing, _35:56_: nothing extra at this point. _35:58_: So what are the _36:03_: one-forms in this space? _36:06_: Well, _36:07_: as we said earlier, there's _36:10_: no constraint on what the one-forms are, _36:13_: but we can choose them so that _36:16_: they contract with the basis vectors _36:19_: to form the Kronecker delta: _36:22_: the basis one-form number 1 contracted _36:26_: with basis vector number 1 is 1, _36:29_: and zero contracted with the other ones. _36:31_: And what do the components look like? _36:33_: They look exactly the same as the vectors'. _36:36_: So in _36:37_: flat Euclidean space, the one-forms, _36:40_: when you turn the handle and find them _36:42_: by this route, look exactly like the vectors. _36:44_: You can't tell them apart, and that is why _36:46_: you never had to learn about them before: _36:49_: because in a sense, _36:51_: you've always been dealing with one-forms _36:52_: in flat Euclidean space, _36:54_: but you didn't know it, because they _36:56_: look indistinguishable from vectors. _36:58_: So if you want to think of it that way, _37:01_: you could say that row vectors, _37:03_: in the example you used earlier, _37:05_: are the one-forms of flat _37:07_: Euclidean space. _37:08_: You've been using them all the time, _37:09_: but you never had to care. _37:13_: Similarly, if you _37:17_: did continuum mechanics _37:18_: in previous years, _37:19_: you learned about the stress tensor _37:21_: or the strain tensor, and _37:23_: you never had to worry about raised and lowered _37:27_: indexes, because it didn't matter. _37:29_: There's no difference between raised _37:31_: and lowered indexes, if you like, in Euclidean space: _37:33_: the components are the same; _37:35_: the transformation between them is just the _37:37_: Dirac delta, rather than the more complicated,
_37:39_: not the Dirac delta, _37:41_: the Kronecker delta. _37:45_: OK, _37:45_: I mustn't get bogged down. _37:50_: And our metric in this space is _37:53_: just: g has components g_ij = δ_ij. _38:02_: Note that _38:04_: there are no sums here, _38:06_: no implied summations. _38:08_: The metric for _38:11_: the space is just _38:17_: the matrix (1 0; 0 1), if you like, which means _38:22_: that when we apply it to a vector, _38:30_: we get g_ij _38:36_: a^i a^j, picking up what we did last time. _38:40_: And if we do those sums, _38:45_: i = 1, j = 1 and so on, _38:47_: we get a^1 a^1 + a^2 a^2, which _38:53_: is equal to (a^1)^2 plus _38:57_: (a^2)^2, which is Pythagoras's theorem. _39:01_: So deciding that this is, _39:04_: well, defining the metric this way _39:07_: is equivalent to saying the Pythagorean _39:10_: theorem works: _39:12_: the length squared of this vector _39:14_: A is just its x-component squared _39:17_: plus its y-component squared. _39:25_: Oh yes, and it's _39:29_: this fact that means that _39:31_: when you raise the index or lower the _39:34_: index of, when we raise the _39:36_: indexes of a vector in this space, _39:38_: what you get is the same numbers. _39:41_: In other words, vectors and _39:44_: one-forms have equal components _39:46_: in Euclidean space. And the other point is, _39:54_: I'll go through the polar coordinates, _39:56_: but fairly quickly, just _39:57_: because I want to, _39:59_: well, I'll draw your attention to that section. _40:01_: It is the same set of ideas, _40:04_: but with another case that you're _40:07_: familiar with, polar coordinates, _40:09_: which is slightly less _40:11_: trivial than Euclidean _40:13_: space with Cartesian coordinates. _40:15_: And you can again discover _40:20_: the components of this transformation matrix.
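Going back a step, the Cartesian computation above, g_ij a^i a^j giving Pythagoras's theorem, fits in a few lines of code. This is my own illustration with made-up component values, not part of the lecture.

```python
import numpy as np

g = np.eye(2)               # Cartesian metric: g_ij = delta_ij
a = np.array([3.0, 4.0])    # made-up components a^i of a vector A

# g_ij a^i a^j: the implied double sum is just a matrix sandwich.
length_sq = a @ g @ a
assert np.isclose(length_sq, 3.0**2 + 4.0**2)   # Pythagoras: 25

# Lowering the index with this metric changes nothing, a_i = g_ij a^j,
# which is why vectors and one-forms are indistinguishable here.
a_lower = g @ a
assert np.allclose(a_lower, a)
```

The second assertion is the lecture's point that in Euclidean space with a Cartesian basis, raised and lowered components are equal.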
_40:24_: We might even have that in our notes. _40:32_: OK, I'll go through this very quickly. _40:34_: So the basis, _40:37_: the basis vectors of polar coordinates _40:41_: are just a transformation away from the _40:43_: basis vectors of Cartesian coordinates. _40:45_: So say e_1 and e_2 are the _40:48_: x and y basis vectors _40:50_: that you're familiar with. The radial _40:53_: basis vector in polar coordinates _40:55_: is that fairly obvious transformation _40:58_: away from those, and the _40:59_: tangential basis vector _41:01_: is a similar one. _41:04_: That factor of r there is not _41:05_: what you've seen before. _41:07_: Usually when you've seen this _41:09_: transformation written down before, _41:10_: the vectors are implicitly scaled _41:12_: so that the r is not there. _41:16_: And that is what makes the ones you're _41:19_: more familiar with, the basis vectors, _41:22_: the basis vectors for polar coordinates: _41:24_: it makes them unit vectors. _41:27_: So these are not unit vectors. _41:29_: These are natural in a _41:31_: different sense, so that's not a typo. _41:33_: That is the natural thing _41:35_: in this context. _41:37_: But the point is that here, _41:39_: this is a concrete example of _41:41_: a change of basis. _41:45_: The components of the Lambda matrix, _41:48_: the transformation matrix Λ: _41:50_: that's, _41:51_: that's just _41:53_: the transformation matrix Λ applied _41:56_: to the pair of basis vectors e_1 and e_2. _42:01_: And I'll leave you to go through that section _42:05_: slightly more slowly in a moment. _42:08_: And it's worth pointing out that, _42:18_: well, I'll let you look at that after I _42:20_: put the slides up in a moment, that _42:22_: the metric in polar coordinates _42:32_: is g = diag(1, r²).
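A numerical check of this (my own sketch, assuming the standard coordinate basis e_r = (cos θ, sin θ), e_θ = (-r sin θ, r cos θ) with the factor of r the lecture emphasises): building the metric components as dot products of these non-unit basis vectors recovers diag(1, r²).

```python
import numpy as np

def polar_basis(r, theta):
    """Coordinate (not unit) basis vectors of polar coordinates,
    written out in Cartesian components. Note the factor r in e_theta."""
    e_r = np.array([np.cos(theta), np.sin(theta)])
    e_theta = np.array([-r * np.sin(theta), r * np.cos(theta)])
    return e_r, e_theta

r, theta = 2.0, 0.7
e_r, e_theta = polar_basis(r, theta)

# Metric components g_ij are the dot products of the basis vectors
g = np.array([[e_r @ e_r,     e_r @ e_theta],
              [e_theta @ e_r, e_theta @ e_theta]])
print(np.round(g, 10))  # diag(1, r^2), here [[1, 0], [0, 4]]
```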
_42:38_: Which I mention just to show that it's _42:41_: not the diagonal unit matrix of _42:44_: the metric in Cartesian coordinates. _42:47_: Again, that's all in the notes. _42:51_: And. I'm going to skip over, _42:58_: I'm going to skip over section 2.3.3, _43:00_: because although it's not false, _43:01_: it's potentially a little _43:04_: confusing, and go to another very _43:07_: important special example: _43:16_: Minkowski space. Here _43:20_: we do the traditional thing of _43:23_: writing the components in Minkowski _43:25_: space with Greek letters. And _43:29_: the metric is also traditionally _43:31_: referred to with η (eta) rather _43:33_: than g; it is the diagonal matrix _43:35_: diag(-1, 1, 1, 1). _43:42_: Umm. _43:45_: And this η here is a matrix with _43:49_: particular constant components. _43:50_: It's not, _43:51_: they're not the components of a tensor. _43:56_: The vectors in this space are _44:02_: A = _44:06_: a^μ e_μ. And we can write _44:21_: the metric applied to _44:25_: two vectors, g(A, B) = η_μν _44:30_: a^μ b^ν, which is equal to _44:39_: minus (oh, thank you) _44:44_: a^0 b^0 + a^1 b^1 + a^2 b^2 + a^3 b^3, where _44:52_: I am sticking with the convention _44:55_: that in Minkowski space the basis _44:58_: vectors are numbered 0, 1, 2, 3. The indexes, _45:04_: the indexes run over 0 to 3, so _45:07_: it's a four-dimensional space. _45:09_: And so what we have got there _45:13_: is the inner product in Minkowski space, _45:16_: the inner product of special relativity _45:18_: that you may recall from the last time _45:21_: you studied special relativity. _45:23_: So this is a prompt, _45:24_: a hint, to perhaps drift back to _45:27_: those notes from two years ago _45:30_: and remind yourself a little bit _45:32_: of what the story was there.
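The Minkowski inner product written out above can be checked directly (my own sketch, with arbitrary sample four-vector components, using the signature (-,+,+,+) the lecture just wrote down):

```python
import numpy as np

# Minkowski metric with signature (-, +, +, +); indexes run 0..3
eta = np.diag([-1.0, 1.0, 1.0, 1.0])

a = np.array([2.0, 1.0, 0.0, 0.0])   # components a^mu
b = np.array([3.0, 1.0, 0.0, 0.0])   # components b^mu

# eta_{mu nu} a^mu b^nu = -a^0 b^0 + a^1 b^1 + a^2 b^2 + a^3 b^3
inner = np.einsum('mn,m,n->', eta, a, b)
print(inner)  # -2*3 + 1*1 = -5.0
```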
_45:37_: Why is this different than in _45:43_: second year? Because it's arbitrary, the signature. Why? _45:51_: In second year we used the metric _45:53_: which was plus, minus, minus, minus. _45:57_: And if you make that the metric, _45:60_: then the signature is _46:01_: +1 - 1 - 1 - 1 = -2. _46:03_: It's called the signature because it will _46:06_: always be either -2 or +2, _46:09_: depending on your convention, _46:11_: and a number of other equations _46:12_: change in turn. _46:16_: I prefer, when you're doing special _46:18_: relativity, to use the signature -2, _46:21_: because then the interval _46:23_: is the same as proper time. _46:26_: It is more conventional in GR _46:31_: to use the signature the _46:34_: opposite way around, so that the _46:39_: spatial sector has the plus, _46:42_: plus, plus. _46:42_: And that's really just a matter _46:44_: of taste, to some extent, taste _46:46_: and tradition; it _46:48_: would be perfectly reasonable to _46:50_: introduce special relativity with _46:51_: that metric, but it is arbitrary: _46:54_: nothing physical changes. _46:57_: So, with 30 seconds to go, a question? Yes. _47:02_: [Question about the meaning of 'diagonal'.] _47:03_: Diagonal: that's the _47:06_: expression, a diagonal matrix. _47:07_: It's a matrix which is 0 _47:10_: except along the diagonal. _47:13_: OK. And? Um. _47:17_: And the transformation matrix _47:21_: which takes you from one set of basis _47:24_: vectors in _47:26_: Minkowski space _47:28_: to another set of basis _47:31_: vectors in Minkowski space _47:34_: is this transformation matrix here, _47:36_: which you may be familiar with _47:38_: as the Lorentz transformation.
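To make the convention point concrete (my own sketch, with a made-up displacement): the two signatures give the same interval up to an overall sign, so nothing physical depends on the choice.

```python
import numpy as np

eta_minus2 = np.diag([1.0, -1.0, -1.0, -1.0])  # signature -2, common in SR courses
eta_plus2 = np.diag([-1.0, 1.0, 1.0, 1.0])     # signature +2, common in GR

dx = np.array([5.0, 3.0, 0.0, 0.0])  # a sample displacement (dt, dx, dy, dz), c = 1

s2_minus = dx @ eta_minus2 @ dx   # 25 - 9 = 16: equals (proper time)^2 directly
s2_plus = dx @ eta_plus2 @ dx     # -16: same content, opposite overall sign
print(s2_minus, s2_plus)
```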
_47:39_: So all the Lorentz transformation is _47:42_: is just how you get from one basis _47:44_: set attached to the station _47:46_: platform to another basis set _47:48_: attached to a moving, a moving object, _47:50_: a moving train. _47:52_: That's what a Lorentz transformation is. _47:55_: It's a basis, _47:55_: a change of basis, and the _47:57_: point of all of special relativity _47:59_: is that the physics _48:01_: doesn't change when you do that. _48:05_: And that, in fairly decent time, is us. _48:10_: There are a few extra remarks in _48:12_: section 2.4, where we talk about coordinates, _48:14_: bases, and just clear up bits of terminology. _48:17_: It would be good to have a look _48:18_: at that section just to get _48:20_: your head straight around those. _48:21_: But with that I've done with Part 2, _48:23_: so we'll go into Part 3 next time, _48:25_: which is next Wednesday.
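The claim that "the physics doesn't change" under this change of basis can be checked numerically (my own sketch; `boost_x` is a hypothetical helper implementing the standard x-direction boost with c = 1): a Lorentz transformation Λ satisfies Λᵀ η Λ = η, so all inner products are invariant.

```python
import numpy as np

def boost_x(v):
    """Lorentz boost along x with speed v, in units where c = 1."""
    gamma = 1.0 / np.sqrt(1.0 - v**2)
    L = np.eye(4)
    L[0, 0] = L[1, 1] = gamma
    L[0, 1] = L[1, 0] = -gamma * v
    return L

eta = np.diag([-1.0, 1.0, 1.0, 1.0])
L = boost_x(0.6)

# A change of basis between inertial frames leaves the metric unchanged:
print(np.allclose(L.T @ eta @ L, eta))  # True

# ...and therefore inner products (the physics) are invariant:
a = np.array([2.0, 1.0, 0.5, 0.0])
print(np.isclose((L @ a) @ eta @ (L @ a), a @ eta @ a))  # True
```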