Transcript of gr-l03
==========

_0:10_: OK, this is lecture 3. _0:12_: We managed to make excellent progress last time, and got about the right amount of the way through Part 2. _0:22_: Let's move forward to where we got to last time — the plan. _0:31_: We got to about there, I think. _0:34_: The idea is that one way of mentally visualizing a one-form is to think of planes, or lines, _0:45_: and that's a good visualization because it makes the idea of contracting a one-form with a vector very natural. _0:54_: The contractions of these three vectors with that one-form field are all the same: _1:01_: the same in each case, because each arrow pierces the same number of surfaces. _1:05_: Although the one-form field varies over the course of the field — it varies in direction, and in size or magnitude — _1:16_: it's just as if you're visualizing contour lines: the gradient of the landscape is larger where the lines are close together. _1:27_: So that's why it's a nice visualization. _1:30_: Not too much hangs on that; it's just so you can have a picture in your head. _1:35_: In this next section we're going to go on to talk about — if I can find the right page — _1:43_: we'll start off in a second with section 2.2.5, talking about components. _1:47_: Now, this is where things start to get fiddly. _1:51_: There's quite a lot of notation going to be thrown at you in this lecture. It's not deep: _2:02_: you don't have to go to a mountain top and meditate on the ideas here. _2:06_: But you do have to go back over it, concentrate, work at it, and get everything straight in your head.
_2:11_: So don't expect to have any wonderful physical epiphanies here; but there's a certain amount of apparatus and technology that we're going to get through, _2:20_: so I am going to go through it fairly briskly. _2:23_: Do ask if there are questions, but I don't think there's a lot of profundity coming up in this lecture — that's what I'm saying. _2:36_: Before I get going, are there any administrative things that I have to say? _2:40_: I don't believe so. _2:42_: Are there any questions of that sort that anyone has? _2:46_: No? Good. Let's get going. _2:56_: I said in the last lecture that the set of vectors, and the set of one-forms, each form a vector space. _3:05_: That means that there is a basis. From the axioms of a vector space you can deduce — we're not going to do it; talk to the maths department for that — that you can get a basis for the set. _3:18_: What I mean by a basis is this. _3:21_: For any vector A, _3:25_: there's a set of basis vectors, which we're going to call e_i, which span the space, _3:34_: so that in an n-dimensional space there are n basis vectors. _3:37_: They're all linearly independent, and because they are a basis, _3:42_: any vector A in that space _3:47_: can be expressed as a unique sum of multiples of those basis vectors. _3:52_: That's just the same as saying that any vector in the plane is so-many i's plus so-many j's — you're familiar with that notion, I trust. _4:05_: And the components are unique. We can do the same thing
_4:11_: for the vector space of one-forms: we have a set of basis one-forms, which we'll provisionally call ω, with a tilde above them to show they're one-forms, _4:20_: and any one-form in the space can be expressed as a unique sum of multiples of those basis one-forms. _4:28_: Is anyone surprised at that? _4:33_: A question: is this a completely different space at this point? _4:36_: Yes, they are different spaces. _4:37_: We'll find out shortly that we're going to make a very important link between these two spaces, but right now they're just different spaces. _4:51_: The other important thing is that if you change the basis set — _4:57_: say I don't like those e_i, I'm going to have different ones — then you change the components, but you don't change the vector. _5:05_: The same vector A can be expressed as a sum of different components of these different basis vectors. _5:20_: Fairly obviously, if you change your choice of basis vectors, then the components that you have to add up to get the initial vector are different — but the vector is the same. _5:31_: That seems hardly worth saying at this point. It seems so obvious, because you're starting off with A and discovering what the components A^i are. _5:41_: But it's crucial that you remember that the vector A is what we're talking about here. _5:48_: That's the important thing.
_5:49_: The way it happens to break down in terms of this basis set or that basis set is a detail. _5:56_: What we'll learn is that changing the basis is changing the reference frame, or changing the natural coordinate system. _6:04_: So you change frame, you change the coordinate system, you change the components. That's all I'm saying there. _6:11_: It's something you're familiar with, I think, put in a complicated way — and that complication will be useful later. _6:18_: Now note the conventional layout of the indexes here: _6:23_: the basis vectors all have lowered indexes and their components have raised ones; the basis one-forms have raised indexes and their components have lowered ones. _6:33_: If we stick by that convention, _6:35_: then there is a hugely useful additional convention we can take advantage of, called the Einstein summation convention. _6:46_: (Unfortunately this is going to be annoyingly slow, swapping back and forth, but there's a meaning eventually.) _6:52_: It is this: if we have a sum, Σ_i A^i e_i, _6:58_: then we can write that as just A^i e_i. If there is a repeated index, one high, one low, _7:11_: in any expression, then we assume a sum over it. _7:15_: OK. Now, that requires a repeated index. _7:25_: In A^i e_j there's no repeated index, so that doesn't sum — _7:31_: and if you've written that down, you've probably made a mistake. _7:35_: If you write down something like A^i e^i, you've made two mistakes. _7:45_: First, you probably didn't mean to write the i on the e as a superscript.
_7:49_: Secondly, those are two raised indexes, so there's no sum — and, again, you've probably made a mistake. _7:56_: So the convention is quite useful as notation, because it surfaces the notation mistakes that you make quite promptly. _8:05_: And if you've got an index repeated three times, you've made a different mistake. _8:12_: All of these are mistakes you will make in this course. _8:24_: Now, the little secret of advanced courses is that the harder courses have easier exams, _8:32_: because harder courses tend to be harder to write exams for. _8:37_: That means that there is a limited range of things one can examine in a hard course — but one of the things you can examine is: can you turn the handle on the algebra? _8:47_: It's not very exciting. Remember the distinction I made between aims and objectives? _8:51_: I said the aims were the point; the objectives were the party tricks. _8:54_: Well, algebra is a party trick. _8:57_: It's not a deep thing; it's not the thing you'll remember in years to come. _9:01_: But it is nice and easy to examine. _9:03_: So, big hint: the exercises where you exercise your fluency at using this notation are good ones. _9:12_: Have a look at them. _9:14_: I'm not making any promises, but if in June you end up with an exam paper which is clearly algebra-heavy — _9:26_: and I'll try not to do that, because it's too easy to do — _9:29_: it tends to be binary, that one: those who have done the exercises
_9:35_: find it's just a matter of not getting lost; those who haven't done the exercises draw a literal blank. _9:41_: It's not nice to see. Hint over. _9:46_: So: getting good at this algebra, this index-wrangling, is important, and the only way of doing that is doing exercises. _9:56_: The other important thing is that the indexes here are arbitrary: _10:02_: A^i e_i and A^j e_j are the same thing. _10:09_: You can always swap the indexes over, as long as the same pattern is there, _10:16_: because in both cases it's a sum over the arbitrary index. _10:26_: So what happens if we do things like applying one of these to the other? _10:51_: (This is really annoying, but I do want to swap back and forth between these.) _10:56_: So: the vector A is the sum A^1 e_1 + A^2 e_2, or more complicated than that. _11:05_: And I'm clearly not making any assumptions about the relationship between the basis vectors. _11:12_: They're not the same length; the angle between them isn't 90 degrees — because we don't have those concepts yet. _11:20_: The idea of the length of a vector, and of the angle between two vectors, doesn't exist as far as we've got yet. _11:25_: We're just talking about two things which are not the same: _11:28_: that e_1 is not a multiple of e_2 is the only thing that's important here, because if it were, they wouldn't span the space.
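The summation convention just described is easy to experiment with numerically: NumPy's `einsum` uses essentially the same repeated-index notation. A minimal sketch (the particular components and basis vectors here are made up for illustration, and are deliberately neither unit length nor orthogonal):

```python
import numpy as np

# Components A^i of a vector in some basis (made-up numbers).
A = np.array([2.0, -1.0, 3.0])

# Basis vectors e_i as the rows of a matrix.  They need not be unit
# length or orthogonal -- only linearly independent, so they span the space.
e = np.array([[1.0, 0.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])

# The vector A = A^i e_i: the repeated index i, one up and one down,
# is summed over, exactly as in the Einstein summation convention.
vec = np.einsum('i,ij->j', A, e)

# Renaming the dummy index changes nothing: A^i e_i == A^j e_j,
# as long as the letter is changed in the same way everywhere.
vec2 = np.einsum('j,jk->k', A, e)

print(vec)                      # the vector, as an array of numbers
print(np.allclose(vec, vec2))   # True
```

Note that `einsum('i,ij->j', ...)` would reject a malformed subscript string in the same spirit that the convention surfaces notation mistakes on paper: the index bookkeeping has to balance.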
_11:38_: Now, the summation convention. _11:40_: A quick pause: which of the following is a valid expression in the context of the Einstein summation convention? _11:48_: I'll go through them quickly before your brief chat. _11:50_: One — who said that was valid? _11:54_: Two — who said that was valid? _11:56_: Three — was that valid? _12:00_: Four — was that valid? That's A^i e_j. _12:03_: OK, have a brief chat. _12:09_: You're mostly right about number one. But what about the second one, p_j e^j? _12:46_: So, with that reflection in mind: _13:04_: who thought that number one was valid? _13:08_: Yes — it's two indexes the same, one up, one down. _13:11_: Would we say the second one is valid? _13:14_: Yes: on reflection it is, because although I've made a point of writing one-forms with lower-case letters, which therefore have subscripted components, that's not an absolute rule. _13:26_: The fact that p_j e^j uses a letter we've been using for one-forms doesn't change anything. _13:33_: p_i p_i is not valid, purely because there are two repeated indexes and they're both lowered — so that's probably an algebraic mistake at some point. _13:45_: And A^i e_j — one up, one down — is that invalid? _13:52_: It's not invalid; it just doesn't mean very much. There's no Einstein summation going on. _13:59_: So the last two aren't really invalid. _14:06_: They probably indicate mistakes of some type, but they're simply not expressions to which the summation convention applies. _14:14_: Number four is probably valid as it stands — but probably a mistake has happened somewhere leading up to it being written down.
_14:23_: So "invalid" isn't quite the right word there — it's probably a mistake, but the convention simply doesn't apply. _14:34_: OK, I'll come back to that in a moment. Let's go back here. _14:38_: So: what happens if we apply an arbitrary one-form p̃ to one of these basis vectors, call it e_j? _14:53_: Well, as we know — as of like ten minutes ago — p̃ can be broken down into its components, _15:00_: so we can write p̃(e_j) = p_i ω̃^i(e_j). _15:11_: And there we would stop, because we don't know anything more: we don't know what that basis one-form applied to that basis vector does. _15:21_: However, we can decide. _15:24_: We will define the basis one-forms ω̃^i in terms of the basis vectors e_i such that _15:32_: ω̃^i(e_j) = δ^i_j. _15:39_: In other words, one if i is equal to j, and zero if i is not equal to j. _15:50_: Now, we're not required to do that, but it would be foolish not to. _15:55_: We're defining the basis one-forms to be dual to the basis vectors. _15:59_: With that in mind, _16:03_: we can now write p̃(e_j) = p_i ω̃^i(e_j) = p_i δ^i_j. _16:16_: And we have a summation convention, so we can add that up. What answer do we get? _16:24_: In the sum over the i's, the δ^i_j is 0 except when i is equal to j. _16:32_: Therefore that is equal to p_j. _16:44_: In other words, if we apply the one-form to one of the basis vectors, _16:49_: what pops out is the corresponding component — the j-th component of the one-form p̃.
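The duality condition ω̃^i(e_j) = δ^i_j pins the basis one-forms down completely once the basis vectors are chosen. Numerically — a sketch, not anything from the lecture notes — if the basis vectors are the rows of a matrix E, the components of the dual basis one-forms are the rows of inv(Eᵀ), and p̃(e_j) then really does pop out p_j:

```python
import numpy as np

# Basis vectors e_j as rows: neither unit length nor orthogonal.
E = np.array([[1.0, 0.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])

# Components of the dual basis one-forms w^i (as rows of W), fixed by
# demanding  w^i(e_j) = delta^i_j,  i.e.  W @ E.T = identity.
W = np.linalg.inv(E.T)

# Check the duality condition.
print(np.allclose(W @ E.T, np.eye(3)))   # True

# An arbitrary one-form p~ with components p_i: applying it to the
# basis vector e_j gives  p~(e_j) = p_i w^i(e_j) = p_i delta^i_j = p_j.
p = np.array([5.0, -2.0, 0.5])
p_of_ej = np.einsum('i,ik,jk->j', p, W, E)   # p~(e_j), for each j
print(np.allclose(p_of_ej, p))               # True
</antml>```

The "we can decide" step in the lecture corresponds to the single line `W = np.linalg.inv(E.T)`: any other choice of W would satisfy the vector-space axioms but would make the bookkeeping above much less convenient.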
_16:59_: So that's how we extract components from an arbitrary one-form. _17:08_: And we can do exactly the same by applying our vector to one of the basis one-forms: _17:17_: that's A^i e_i applied to ω̃^j, which — with e_i(ω̃^j) similarly defined — is A^i δ_i^j, which is A^j. _17:38_: So we turn the handle, and out pops the component. And you can see the sort of algebraic mistakes that can happen to you: _17:47_: if I accidentally write an i instead of a j, then I'm going to end up with two i's in the top, or three i's, or something, and I go "nope", and step back a bit and work out where I mis-wrote. _18:00_: So there are plenty of opportunities to improve your handwriting and write very neat i's and j's. _18:08_: So that's that. _18:10_: But, mechanics: can we do the same thing for tensors? Of course we can. _18:14_: Just as we could decompose our vector into its components, _18:25_: we can decompose a tensor into its components: T = T^{lm}_n e_l ⊗ e_m ⊗ ω̃^n. _18:47_: I think I said last week that I was going to introduce outer products but not actually use them for a little while; _18:53_: I had clearly forgotten that I was going to go through this. _18:57_: So in the version at the top we're expanding the tensor in terms of its components _19:05_: times an outer product of basis vectors and one-forms. _19:11_: How do we, in that case, extract the components of the tensor from that? _19:20_: Well, we apply — remember, T —
_19:24_: we're taking T as a (2,1) tensor. _19:29_: That means it takes two one-forms and one vector as arguments. _19:32_: So if we drop in ω̃^i, ω̃^j and e_k, _19:43_: then we discover that this is T^{lm}_n — remembering the definition of the outer product — _19:55_: times e_l(ω̃^i), times (ordinary multiplication) e_m(ω̃^j), times ω̃^n(e_k). _20:11_: OK? So remember: applying the definition of the outer product, _20:21_: when that outer product is applied to — boom, boom, boom — three arguments, _20:27_: the three arguments are divided amongst the three things in the product. _20:32_: That's in the definition of the outer product mentioned last time; if you go back to that, you'll be reminded of what we talked about. _20:42_: And at this point it becomes mechanical, because we just replace each of these by deltas: _20:47_: T^{lm}_n δ_l^i δ_m^j δ^n_k. _21:02_: OK. We've got three repeated indexes — _21:08_: three implicit summations here: _21:10_: over l, over m, and over n. _21:19_: The summation over n will be 0 except when n is equal to k, the summation over m will be 0 except when m is equal to j, and the summation over l will be 0 except when l is equal to i. _21:32_: So this will be T^{ij}_k. _21:39_: So, just as with vectors, _21:43_: we can extract the components of an arbitrary tensor by dropping in the right number of basis vectors and basis one-forms, _21:55_: turning the handle, and getting a number out.
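Mechanical as it is, the delta-chasing above is exactly what a few lines of `einsum` do. A sketch with a hypothetical (2,1) tensor whose components are random made-up numbers: expand it in an arbitrary basis, probe it with the dual-basis one-forms and basis vectors, and its own components come back out.

```python
import numpy as np

rng = np.random.default_rng(0)

# Basis vectors (rows of E) and dual basis one-forms (rows of W),
# with w^i(e_j) = delta^i_j enforced by W = inv(E.T).
E = np.array([[1.0, 0.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])
W = np.linalg.inv(E.T)

# Made-up components T^{lm}_n of a (2,1) tensor in this basis.
T = rng.normal(size=(3, 3, 3))

# Apply the tensor to (w^i, w^j, e_k): each factor of the outer product
# e_l (x) e_m (x) w^n eats one argument, giving
#   T^{lm}_n e_l(w^i) e_m(w^j) w^n(e_k) = T^{lm}_n d_l^i d_m^j d^n_k.
result = np.einsum('lmn,la,ia,mb,jb,nc,kc->ijk', T, E, W, E, W, W, E)

print(np.allclose(result, T))   # True: out pop the components T^{ij}_k
```

Each pairing like `la,ia` in the subscript string is one of the e_l(ω̃^i) contractions, and the duality of E and W is what turns each one into a Kronecker delta.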
_22:00_: I worked through this in order to show that this whole process works for tensors as well as for vectors and one-forms, _22:05_: and also to show the sort of — admittedly very fiddly — algebra that is involved here. _22:12_: This isn't fun algebra; it just requires concentration, _22:20_: and most people who do research in this area _22:27_: tend to be very fond of computer algebra packages, because all the handle-turning can be done very reliably. _22:40_: OK. Are there any questions at that point? _22:41_: You see what I said about this: there's quite a lot of detail here, nothing very profound, but it does require concentration. _22:49_: Any questions? OK. _23:00_: So let me go back over a couple of bits of notation, to emphasize them. _23:06_: This, T^{ij}_k, is a number. _23:09_: OK — each one is a number, and the whole set is a set of n × n × n numbers. _23:13_: In an n-dimensional space _23:18_: there are T^{11}_1, T^{11}_2, T^{11}_3, and so on; _23:22_: so the whole thing is n × n × n numbers. _23:25_: But this thing here, T^{ij}_k, is just a number — a real number on the real line. _23:32_: There's nothing exotic about that. _23:35_: But we will, somewhat slangily, _23:39_: either talk about T by writing down T, or talk about the tensor by writing down T^{ij}_k and saying "that's the tensor". _23:46_: It's not the tensor; it's just a component of the tensor. But we'll equivocate between the two in whatever way is natural in context. _23:56_: Just remember that when we say "the tensor T^{ij}_k" here, _24:02_: we're being bad:
_24:05_: what we're really talking about is the components of the tensor. _24:08_: OK, I make a big fuss about that. Again, it seems obvious now, but it can trip you up later. _24:16_: It's also useful to write it down that way because it makes clear what kind of tensor it is: _24:20_: just writing "T", you have to remember what the rank of this tensor was, but here it's obvious it's a (2,1) tensor — _24:28_: there are two indexes upstairs and one downstairs. _24:33_: Another thing to note is this. _24:45_: If I write T^i_j and T_i^j, _24:54_: they are not the same thing, _24:57_: because T^i_j is equal to some tensor T _25:09_: with the two arguments ω̃^i and e_j filled in, and the tensor T_i^j is T(e_i, ω̃^j). _25:20_: In other words, these two tensors _25:22_: could be different. _25:24_: They're both (1,1) tensors, but in the first case _25:26_: the first argument is a one-form and the second argument is a vector; in the second case the first argument is a vector and the second is a one-form — completely different things. _25:34_: Now, in most cases, if we use the same symbol for these two, there's some implication that they are related to each other — just because there are a limited number of letters in the alphabet, and so on. _25:44_: But this is why the indexes are staggered: _25:48_: we haven't written T with the i directly above the j. _25:51_: We've written T^i_j in the first case, and T_i^j in the second. _25:56_: So it matters where these things are positioned, vertically and horizontally. _25:59_: This is a very dense notation: there's a lot of information in every last bit of how the notation is laid out. _26:12_: And what this also means is that we can
_26:19_: take a basis vector — e_1, say — and turn that into a set of components: (1, 0, 0, …), and so on. _26:28_: So at this point we can write down a set of components of the basis vectors themselves. _26:33_: Make sure you see — and this is a bit more "making sure you see" — why it's obvious _26:38_: that e_1 is (1, 0, 0, 0), e_2 would be (0, 1, 0, 0), e_3 would be (0, 0, 1, 0), and so on. _26:44_: Just make sure you see why that's obvious, in the mathematician's sense of "obvious", meaning: go and think about it for a while, and you'll see that it couldn't be otherwise. _26:52_: But do ask about that. _26:55_: Another question — about the first, second and third cases from before? _27:00_: Well, yes: in this case the two T's there are different tensors. _27:05_: Yes — but in the second case you have j on the left and i on the right — yes, so: _27:20_: T_j^i is _27:23_: T(e_j, ω̃^i). _27:28_: Isn't that the same as the other one? _27:30_: No, no, it's not, because in the second line that is T(e_i, ω̃^j). _27:42_: Aren't those the same? _27:43_: Well, those two expressions are interchangeable, because all we've done _27:50_: is swap i and j over — nothing more. Alright. _27:59_: [Another question.] Exactly, yes. _28:02_: But in that case the two tensors written above — _28:06_: let's call them _28:09_: T1 and T2 — T1 is a tensor with a different pattern: _28:16_: it's a machine with a different set of holes in the top — a one-form-shaped hole and then a vector-shaped hole, as opposed to a vector-shaped and then a one-form-shaped hole — _28:23_: and so it's carelessness on my part to have written T for both of them. _28:32_: Does that make sense?
_28:36_: Which place the holes are _28:37_: which should change function, _28:39_: but the second one you changed _28:42_: the E has a is an I got the J. _28:46_: OK, I I think it's possible. _28:48_: Drug cross purposes, but no in this case. _28:53_: The the letters I pick. _28:55_: Will be arbitrary, so I could write T. _29:00_: KKL. Will be tea. _29:04_: The key. Oh my God. _29:07_: And the same thing is being said so the _29:10_: the the the the so so so pick a key, _29:14_: pick an L and the the keys 1L3 _29:19_: then the the 1/3 component is _29:22_: what you get when you E1 omega-3 _29:25_: in there and it doesn't matter _29:28_: what in what letter you choose. _29:31_: For, for denoting those. _29:37_: Yes, yes, the the letter doesn't matter. _29:40_: So sorry that, that that that that that. _29:42_: I think which letter you pick doesn't _29:44_: matter that the the fact that the pattern _29:46_: on this side and the pattern on this side _29:49_: is the same is is the key thing. So yeah, _29:52_: so there's a famous arbitrariness here. _29:55_: And and you could always change the. _29:59_: You can always change the the the _30:00_: the letters you you you pictures and _30:02_: note the the arbitrary indexes as _30:03_: long as it was anything both sides. _30:05_: So that's that's an algebraic rule. _30:06_: If you like, that's that's that's new. _30:10_: OK, keeping going. _30:13_: We do other things we could do _30:16_: are what if we apply? _30:18_: One form, actually, one form, _30:20_: an operator vector. _30:22_: Well, before that's going to be Pi Omega. _30:31_: IAGE. Gee, the temptation there _30:33_: was to me to write AIP IEI. _30:37_: But that wouldn't be right, because then _30:39_: would have at A4 indexes I, so I've got. _30:41_: I've got to pick a random other letter, _30:43_: so we're putting in the expanding _30:46_: the A inside the brackets. _30:49_: Pick some other. 
other index. _30:53_: But remember that ω̃^i, being a tensor, is linear in its arguments, _31:03_: and remember that the components are just numbers. _31:06_: So because of that we can take the component out of the argument and write p_i A^j ω̃^i(e_j), _31:20_: which is equal to p_i A^j δ^i_j. _31:26_: The sum — over the i's, or the j's — is in both cases 0 unless i equals j, so this is equal to p_i A^i. _31:36_: In other words, the contraction of p̃ and A — _31:41_: p̃ applied to A — is p_1 A^1 + p_2 A^2 + p_3 A^3 + p_4 A^4, or however many dimensions there are, _31:49_: which you will recognize from the definition of the inner product between vector and vector that you learned about at school. _31:58_: So this is the vector inner product we know from school; it's the same thing happening, _32:04_: OK? _32:07_: And this is just a number. _32:13_: [Question.] Yes — so that's a tensor, that's a tensor, that's a tensor, and because of our definition of the basis one-forms as being dual to the basis vectors, _32:32_: we decide that the _32:35_: value of ω̃^i(e_j) is just the Kronecker delta. _32:40_: At which point we can sort of turn the handle — make our way homewards — _32:44_: sum over j, or i, and get this, which is a sum of products of numbers (p_i and A^i are numbers), _32:55_: and end up with a number. _32:56_: Which is what we should get, because remember: p̃ is a (0,1) tensor, _33:16_: which means it takes a single vector as argument and turns it into a number.
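The contraction p̃(A) = p_i A^i is then just a single sum of products of ordinary numbers — the school dot product. A one-line check, with components invented for the example:

```python
import numpy as np

p = np.array([1.0, 2.0, 3.0, 4.0])   # one-form components p_i
A = np.array([4.0, 3.0, 2.0, 1.0])   # vector components A^i

# p~(A) = p_i A^i : a repeated index, one up and one down, so we sum.
contraction = np.einsum('i,i->', p, A)
print(contraction)   # 1*4 + 2*3 + 3*2 + 4*1 = 20.0
```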
_33:21_: So this does hang together. Question? _33:31_: [Question about whether the duality needed to be defined both ways round.] _33:45_: So — given that we chose that to be the case, I suspect we couldn't not choose the other to be the case; _34:00_: I suspect the one implies the other. _34:05_: But you're right: they are two separate things. _34:11_: Looking at this: because we are applying a one-form to a vector here, we also end up applying a one-form to a vector there — _34:17_: we've just broken it down into applying a basis one-form to a basis vector. _34:23_: So what we're doing here is going back and forth between the geometrical objects — one-forms and vectors — and calculations in terms of numbers, which are components, _34:34_: all the while remembering that if we change the bases, then we change the components. _34:41_: Any other questions on that? _34:44_: OK, we're timing this well so far. Right. _34:58_: Oh yes, and also: here, _35:03_: I haven't mentioned bases — that, p̃(A), is just a number. _35:09_: But on this side, _35:12_: if we changed the basis vectors — and that changed the basis one-forms — _35:18_: then these numbers p_i and A^i would change. _35:22_: But the sum of the products would not change. _35:27_: Which is interesting. So the components are basis-dependent, _35:33_: but this inner product — this contraction, rather — is not. _35:37_: Which is surprising.
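That basis-independence can be checked directly. The sketch below anticipates the transformation law the lecture hasn't formally introduced yet (so take the two transformation lines as assumptions): vector components transform with a matrix, one-form components with its inverse transpose, and the contraction p_i A^i survives unchanged.

```python
import numpy as np

p = np.array([1.0, 2.0, 3.0])    # one-form components in the old basis
A = np.array([4.0, 3.0, 2.0])    # vector components in the old basis

# Any invertible matrix serves as a change of basis.
L = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 3.0]])

# Under the change of basis the two kinds of components transform
# oppositely (assumed transformation law, covered later in the course):
A_new = L @ A
p_new = np.linalg.inv(L).T @ p

# Both sets of components have changed...
print(np.allclose(A, A_new))    # False
# ...but the contraction p_i A^i has not.
old = np.einsum('i,i->', p, A)
new = np.einsum('i,i->', p_new, A_new)
print(np.isclose(old, new))     # True
```

Algebraically this is just p·(L⁻¹)ᵀᵀ L A = p·A: the two opposite transformations cancel inside the sum, which is why the contraction is a basis-independent number.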
_35:39_: I mean, you might not have guessed that that would be the case before we demonstrated it here. _35:51_: Similarly, if we were to do the same thing with a tensor: T(p̃, q̃, A) _36:06_: would end up as p_i q_j A^k T^{ij}_k, _36:13_: where, as you can see, I've got two of each index — one up, one down — _36:19_: and I have carefully staggered _36:21_: the indexes of the T to match the _36:25_: one-form, one-form, vector arguments _36:31_: of the tensor. _36:36_: There's a remark there in the notes about how you can define contraction in a slightly more general sense. _36:45_: I'm not going to go into that — I almost want to put it in a dangerous-bend section. _36:49_: It's useful to have it mentioned in there, but it's not something we depend on greatly. _36:57_: Just to summarize all this — _37:03_: well, I'm not going to read out the summary at the end of that section, 2.2.5. _37:07_: You can look at it yourself, and just go back over what we've covered here. _37:12_: We've covered a lot of fiddling around here — _37:14_: nothing very profound, but fiddly — _37:17_: and you will get very used to it if you work through your book of exercises. _37:28_: And there are other remarks there. _37:30_: I think that section has grown somewhat over the years, as I thought "oh, another thing one can say" and appended it to the bottom of the section. _37:37_: So there are lots of extra remarks one can make without taking up a lot of time in the lecture talking over them. _37:45_: OK, moving on.
_37:54_: One thing we haven't done yet — and I mention it only in passing — is that we don't have any notion of how long a vector is, _38:02_: or how long a one-form is. _38:04_: So the space of things we're talking about here has no lengths in itself at this point. _38:12_: And you can do a lot without talking about lengths. _38:16_: So adding lengths, and the definition of length, is an extra thing. _38:21_: It's vitally important for the use of differential geometry in GR, but it is an extra thing. _38:26_: You can do all sorts of maths without worrying about it. _38:33_: Here we're going to have the notion of the metric tensor. _38:37_: The way we introduce lengths is we pick a tensor — a (0,2) tensor, _38:48_: so a tensor that takes two vectors as arguments. _38:57_: We use that as our definition of length, and we call this tensor the metric. _39:02_: Metric meaning measurement, etymologically related to "measure". _39:07_: This tensor is where length enters the geometry. _39:12_: It's what says that's a centimetre rather than a parsec; it's what gives definition to the notion of length. _39:19_: The point where we introduce length is the idea that if you take a vector A _39:25_: and you stick it into both slots of the metric tensor, _39:29_: then the number you get out is what we're going to call the square of the length of the vector. _39:38_: OK, that's our definition of length. That is, by definition, the length of a vector. _39:57_: And you can do other things.
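The definition |A|^2 = g(A, A) = g_ij A^i A^j can be made concrete with two familiar metrics — the Euclidean metric, where it reproduces Pythagoras, and (as an assumption beyond this passage, though standard in GR) the Minkowski metric, where the "squared length" can be negative:

```python
import numpy as np

# A metric is a (0,2) tensor: feed it two vectors, get a number.
# In Euclidean 3-space with an orthonormal basis, g_ij is the identity,
# and g(A, A) reproduces the familiar squared length.
g = np.eye(3)
A = np.array([3.0, 4.0, 0.0])

length_sq = A @ g @ A            # g(A, A) = g_ij A^i A^j
assert np.isclose(length_sq, 25.0)

# A different metric gives a different notion of length: with the
# Minkowski metric diag(-1, 1, 1, 1), "length squared" can be negative.
eta = np.diag([-1.0, 1.0, 1.0, 1.0])
t = np.array([1.0, 0.0, 0.0, 0.0])   # a timelike vector
assert np.isclose(t @ eta @ t, -1.0)
```

The same vector components give different lengths under different metrics — which is exactly the sense in which the metric, not the vector space itself, is what carries the notion of length.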
_39:58_: Like talk about the angle between two vectors in this way. _40:04_: The notion of angle could come in here, but we're not going to worry about that at present because it's not something we need at this point. _40:11_: So — what do I say? Yeah, OK, let's stick with the slides for now. _40:21_: As usual, we can partially apply the arguments to the tensor. _40:31_: So if we drop just one vector argument into the tensor, _40:39_: what we have is a tensor with one free slot — one vector-shaped hole. _40:45_: In other words, a one-form. _40:48_: So what this does is define a one-form Ã which is associated with the vector _40:56_: A, via the metric tensor. _41:02_: Now, this isn't dual in quite the same way that the basis one-forms were dual to the basis vectors. _41:12_: This is a dual-like thing, _41:16_: which goes specifically via the metric tensor. OK. _41:23_: And we'll see a bit more of that in a moment. _41:33_: These quick questions are the selection of the exercises which are just "are you awake?" questions; they're annotated as such in the exercises _41:46_: at the end of the part — just quick things. I'm not going to bother with them for the moment. _42:05_: Now, this is a useful thing to go through, because it further illustrates _42:19_: the notation at work. So: _42:25_: g(A, B). _42:30_: It's a useful thing to illustrate.
_42:32_: I introduced this by talking about the length of a vector being — length squared being — what you get when you fill both of the metric's arguments with the same vector. _42:47_: But if you apply different vectors to those two slots, then what do you get? _42:56_: You get something like g(A^i e_i, B^j e_j) — _43:06_: again choosing different indexes in the two cases — and again A^i and B^j are just numbers, so because the tensor is linear in each argument, _43:16_: the A^i and B^j can pop out here, and that gives us A^i B^j g(e_i, e_j). _43:29_: OK. What can we do with that? _43:34_: Nothing quite yet. But if we go back and look at this: _43:39_: there's the one-form Ã which we get when we apply A _43:46_: to just one of the arguments of the tensor. _43:51_: And by the way, what do we get? _44:04_: How do I write this? How do I phrase this? _44:12_: Well, if we now ask what the components of that one-form are, _44:18_: we know how to do that: we write Ã applied to a basis vector e_j. _44:28_: But this Ã(e_j) is g(A, e_j). _44:42_: We can expand that as usual: g(A^i e_i, e_j), which is equal to A^i g(e_i, e_j) = A^i g_ij. _44:57_: So again, there are several important steps here, which will become fluent to you eventually. _45:06_: The component is what we get when we apply the one-form to one of the basis vectors. _45:14_: From the definition of Ã, we can expand A into A^i e_i and take the A^i out.
_45:22_: We end up with g applied to two basis vectors, which is just g_ij, the i-j component of that tensor. _45:31_: So what we get is that A_j _45:37_: is equal to A^i g_ij. _45:40_: In other words, the component version of this thing here is saying that, _45:50_: viewed this way, the metric tensor is a thing which turns vectors into one-forms. _45:57_: Viewed this way, the metric tensor is a thing which lowers the index, _46:03_: in the sense that it turns an A^i — _46:07_: where again we're equivocating between A^i, the components, and the vector A itself — _46:13_: into A_j, the components of a corresponding one-form. _46:20_: So when we talk about raising and lowering indexes, what we mean is _46:25_: using the metric to turn _46:28_: a vector into a corresponding one-form, or a thing with a raised index into a thing with a lowered index. _46:34_: OK. _46:35_: And that will become a familiar operation; you'll see it again and again. _46:41_: So, looking at this expression here, _46:46_: we can now write that as A^i B_i, _46:53_: and discover that this operation of applying the metric tensor to two vectors _47:03_: produces something which looks a lot like the inner product that you're familiar with, _47:08_: in the sense that it's A^1 B_1 + A^2 B_2 + A^3 B_3 and so on, _47:14_: but where the ones and twos come from, alternately, a vector and a corresponding one-form. OK. _47:28_: I don't expect you to be having fluent pictures of these in your heads yet; the point here is to get practised with the technology of components. _47:46_: Another thing that's important is this.
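The two facts just derived — A_j = g_ij A^i, and g(A, B) = A^i B_i — can be sketched in numpy with an illustrative (symmetric, non-diagonal) metric, just to check that lowering an index and then contracting agrees with feeding both vectors to the metric:

```python
import numpy as np

# Illustrative symmetric (0,2) metric g_ij and vector components.
g = np.array([[2.0, 1.0],
              [1.0, 3.0]])
A = np.array([1.0, -1.0])   # vector components A^i
B = np.array([4.0, 2.0])    # vector components B^i

# Lowering an index: A_j = g_ij A^i turns the components of the
# vector A into those of the associated one-form.
A_lower = g @ A

# g(A, B) = g_ij A^i B^j equals the inner-product-like sum A_j B^j,
# built from a one-form component and a vector component in each term.
assert np.isclose(A @ g @ B, A_lower @ B)
```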
_47:51_: So this g_ij, remember, is just a number: it is the i-j entry of the matrix of components. _48:00_: So g is a tensor; g_ij is effectively a matrix. _48:05_: But not all rank-2 tensors can be usefully represented as a square matrix, and not all matrices correspond to tensors. _48:19_: Anyway, that might become a little clearer in a moment. The other thing is this. _48:26_: This tensor g(A, B) takes two vector arguments. _48:32_: But we can imagine also a tensor which takes a vector and a one-form as arguments, and a tensor which takes two one-forms as arguments, and they are completely different tensors. _48:49_: But we will choose to link them together, and we'll actually think of them all as facets of the same tensor, although that's not mathematically _49:02_: correct. Because we will decide _49:05_: that this tensor here, which has components g^ij — raised, because these two arguments are one-forms — _49:15_: we will decide that that matrix is the matrix inverse _49:19_: of g. So that's us defining — let me bring the screen a little lower, so I can actually point to it rather than leaping about — we'll define _49:30_: this tensor here, g applied to two one-forms, _49:36_: via its components: we'll say that the components of that tensor are the matrix inverse of the components of that tensor. _49:45_: And what that means is that if we apply them — _49:53_: g^ij g_jk — we _49:56_: contract over one of these indexes. _50:01_: This operation — we discovered here, and I'll finish this momentarily, that this process of applying the tensor with lowered indexes lowers an index — so that contraction is g^i_k.
_50:19_: And if these two things are matrix inverses, then that is the identity matrix: _50:27_: it equals delta^i_k. _50:30_: In other words, if that matrix and that matrix are matrix inverses, then when you matrix-multiply them together — which, you will realize looking at it, is what's happening here in this business of summing over j — _50:48_: and if you stare at that for a bit, you will reassure yourself that that is what you do when you do matrix multiplication — _50:54_: so if we matrix-multiply that matrix by that matrix, and they're inverses, _50:58_: then this must be the identity matrix, _51:00_: which is that matrix. _51:02_: So, in other words, defining this also defines this tensor here by its components, _51:06_: which are just the components of the identity matrix. _51:10_: So these are all different tensors, but we have chosen that the tensors with different patterns of indexes have this relationship to each other. _51:21_: And what that also means — and I'll stop here — is that the metric tensor can also be an index-raising operation, by that route. _51:38_: That was a little bit garbled at the end; go through the notes to make sure it all hangs together. _51:44_: I'd hoped to get slightly further than that, but it means that we'll make a bit of a route march through the last section of this part next time, and aim to finish it.
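The whole relationship — g^ij defined as the matrix inverse of g_ij, the contraction g^ij g_jk = delta^i_k, and the metric raising indexes as well as lowering them — can be checked in a few lines of numpy, with the same illustrative metric as before:

```python
import numpy as np

# The (2,0) "inverse metric" g^ij is *defined* so that its matrix of
# components is the matrix inverse of g_ij. (Metric values illustrative.)
g = np.array([[2.0, 1.0],
              [1.0, 3.0]])      # g_ij
g_inv = np.linalg.inv(g)        # g^ij

# Contracting the two metrics gives the Kronecker delta:
# g^ij g_jk = delta^i_k, i.e. the identity matrix.
assert np.allclose(g_inv @ g, np.eye(2))

# Lowering an index and then raising it returns the original
# components, so g^ij really does undo what g_ij does.
A = np.array([1.0, -1.0])       # A^i
A_lower = g @ A                 # A_j = g_ij A^i
assert np.allclose(g_inv @ A_lower, A)
```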