Transcript of gr-l02 ==========

_0:09_: Hello and welcome to Lecture 2 of the GR course. This is lecture 2 out of 11. _0:17_: I trust you will all have apprised yourselves of a copy of the Part 2 notes. _0:23_: The plan is to get through this part, on vectors, tensors and functions, in a bit under three lectures. I may or may not manage that, but that's the plan.

_0:37_: There's not a huge amount of hugely new stuff here. Well, there is, but most of it you will probably have seen before in some form; probably not in this form, though, and the way we're approaching it might be a bit alien to you. _0:56_: The aim of this chapter is to get everyone onto the same page, really. So if you've seen this stuff before, good for you. If not, make sure that you keep up. _1:12_: As I mentioned last time, those of you whose background includes a fair amount of pure-ish maths might well find this deplorably informal; those of you whose background hasn't included much pure maths might find it rather bracing. _1:30_: So I hope it sits somewhere in between those two sets of expectations. _1:36_: But as I also mentioned last time, this is the first of the two middle parts, which are mostly mathematical, allowing us to get back onto the physics in the last chapter. _1:47_: The payoff of this maths is not merely maths: the payoff is that it allows us to talk about gravity, the physics of gravity, in a very powerful way.

_2:02_: OK. Any questions about that? Or speculations, or thoughts or feelings you wish to share? _2:11_: OK then, let's proceed.

_2:16_: So, as before, there are aims and objectives: fairly high-level things. There aren't very many aims for this section, because it is a fairly mechanical section, in the sense that it's about technique. _2:31_: There is stuff you're going to be learning here about how to do certain particular bits of maths, and you will use all of it later, but there aren't great physical insights awaiting you in the next three lectures. _2:44_: There are quite a lot of objectives, though, so there are quite a lot of things you'll be able to do after this part. As I said last time, the distinction is that the aims are the point and the objectives are the party tricks, the things that are fairly straightforwardly accessible. _3:02_: Thank you.

_3:06_: This first section, on linear algebra, will be some sort of revision for most of you. You will have seen a lot of these things before at earlier stages in your education. _3:18_: If not, go back to your notes from previous years and make sure you are fully up to speed with them. _3:25_: One thing that I think isn't on this slide, but which is going to come up again and again, is the notion of linearity, and linearity in this context means a quite specific thing. _3:38_: Linearity in this context means:
_3:45_: a function f(x), and it can be any function you think of, is linear if and only if f(ax) = a f(x). _4:09_: That's what linearity means in this context. It doesn't mean a straight-line graph, although that's also referred to as a linear graph. It means that if you multiply the argument by a scalar, the result is multiplied by the same scalar. _4:21_: But that's not true of every function. For example, f(x) = x^2 is not linear, because f(2x) = (2x)^2 = 4x^2, which is not equal to 2x^2. _4:40_: But you get the idea. _4:43_: I'm belabouring the point because this term 'linear' has possibly slightly different connotations in different areas.

_4:53_: But returning to these points here about vector spaces: I will come back again and again to this notion of a vector space. A vector space is anything which satisfies these five properties. I'm going to say a little bit more about those, but not too much more. _5:14_: The vectors that you're familiar with, the pointy things, the things you learned about in school, are an example of the elements of a vector space, hence the name. They're the prototype example of a vector space. _5:28_: You can add vectors together: this vector plus that vector gives another vector. OK, you know that, so that's a property. _5:36_: There exists a zero element, and if you add that zero to anything, you get the thing you started with back, so a + 0 = a: there exists an identity element. _5:49_: There is an inverse: for every vector there's a vector that points in the opposite direction, and the two add up to that identity. _5:58_: There is scalar multiplication: you can double a vector and you get another vector which is in the same direction and twice as long. _6:13_: If you multiply a vector by one, you get the same thing back. Nothing surprising here. _6:18_: And multiplication by scalars is distributive over addition. So there's nothing there that's surprising to you, I trust. _6:27_: Good. But I'm listing them because these are very general properties, and anything that satisfies those properties is a vector space in the sense we're going to be talking about. _6:43_: So hold on to that thought.

_6:48_: I'm also going to assume that you know about these various things here: linear independence, the dimensionality of a space, the idea of a basis set spanning a space, the existence of components, and so on. _7:05_: The Kronecker delta symbol you may have seen before. (I should stop switching back and forth.) The Kronecker delta, delta_ij, is defined as being one if i = j and zero if i is not equal to j. That's the definition of the Kronecker delta. _7:29_: And positive definite: this just means that the inner product of a vector with itself is not negative, OK?
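A compact restatement of the definitions just given, in my own notation (the Part 2 notes may write these differently; strictly, 'positive definite' also requires the inner product of a nonzero vector with itself to be strictly positive, not just non-negative):

```latex
\begin{align*}
  f(ax) &= a\,f(x) \quad \text{for every scalar } a
      && \text{(linearity)}\\
  f(x) = x^2:\quad f(2x) &= (2x)^2 = 4x^2 \neq 2\,f(x)
      && \text{(so $x^2$ is not linear)}\\
  \delta_{ij} &=
    \begin{cases}
      1 & i = j\\
      0 & i \neq j
    \end{cases}
      && \text{(Kronecker delta)}
\end{align*}
```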
_7:45_: And there is some stuff about matrix algebra: that matrices have an inverse, a trace, a determinant, and so on. _7:55_: You have come across these words before? Yeah, OK. Anyone who's feeling uncertain at this point: a quick trip to last year's notes would be a good idea. _8:07_: I'm not going to depend on an intimate knowledge of these things, or on your having them at the front of your head, but I am going to assume that you can look them up and remind yourself of the details when necessary. _8:16_: Are there any questions about that? OK.

_8:26_: So, matrix algebra. You know what matrix algebra means, good; I'm sure you know all of that. _8:40_: OK, quick question. Consider the set of square n-by-n matrices. Is that a vector space? Who says yes? And who says no? _8:53_: Excellent, right: it is a vector space. You don't think of square matrices as vectors, but the square matrices satisfy all those axioms. _9:04_: You can multiply by a scalar: a scalar times a square matrix is a square matrix. You can add two square matrices together and get a square matrix of the same rank. _9:14_: There is an all-zeros matrix, which can be added to any matrix to give the same matrix you started off with, so there exists an identity. _9:23_: There's a negative of every matrix. _9:29_: And matrix addition and scalar multiplication are distributive in the same way. Therefore, and that 'therefore' is important, the set of square n-by-n matrices is a vector space. _9:42_: That's for each n separately, so two-by-two matrices and three-by-three matrices are not in the same space: you can't add a two-by-two matrix to a three-by-three matrix. They're in different spaces, but each is separately a vector space. _9:58_: OK. Make sure you're comfortable with that notion. _10:07_: OK.
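As a concrete check of that claim, here is a minimal numerical sketch (not part of the lecture materials) running through the vector-space axioms just listed for 2-by-2 matrices with numpy; the particular matrices and the scalar are made up for illustration.

```python
# Minimal sketch: square 2x2 matrices satisfy the vector-space axioms listed above.
# The matrices A, B and the scalar a are arbitrary made-up examples.
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, -1.0], [5.0, 2.5]])
zero = np.zeros((2, 2))   # the all-zeros matrix (the identity of addition)
a = 3.0                   # an arbitrary scalar

# Closure: the sum and the scalar multiple are again 2x2 matrices.
assert (A + B).shape == (2, 2) and (a * A).shape == (2, 2)

# Adding the zero matrix gives back what you started with; every matrix has a negative.
assert np.allclose(A + zero, A)
assert np.allclose(A + (-A), zero)

# Multiplying by one changes nothing; scalar multiplication distributes over addition.
assert np.allclose(1.0 * A, A)
assert np.allclose(a * (A + B), a * A + a * B)

print("2x2 matrices pass these vector-space checks")
```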
_10:11_: Now I'm going to talk about tensors, vectors and one-forms. You have probably heard about tensors at some point before, and you may have had to wrangle with them a bit. _10:23_: Certainly I remember tensors being a slightly exotic thing that happened in bits of continuum mechanics and, I think, some slightly exotic classical mechanics. It was a thing that was clearly quite powerful, slightly magic: let's not think about it too much. _10:45_: That was fine then. Now, though, tensors really matter. Here is an important use of tensors, and we're going to be talking about them a lot from now on.

_10:58_: Tensors are... um. Before putting up the next slide, I'm going to say what tensors are. _11:17_: I don't want to put the next slide up quite yet, because I want to have something up first. _11:23_: I'm going to introduce tensors in a fairly axiomatic way, in a fairly mathematical way. _11:27_: I'm not going to give you examples and then say "oh, and now we call these things tensors"; I'm going to give the definitions first, because I think that although that can feel a rather mathematical approach, it does emphasise that tensors are fundamentally rather simple things. _11:47_: Simply by saying that a tensor, of each rank (rank in a moment), is an element of a vector space, I've already told you quite a lot about what tensors are.

_11:57_: OK, so a tensor T will have a rank (m, n); I'll say what this means in a minute. _12:08_: For each (m, n), the set of (m, n) tensors is a vector space. That means if you add two tensors of that rank together, you get another one of that rank; there's a zero element; and so on. _12:27_: And I'm going to give certain of these tensors, well, two sets of these tensors, special names. I always get nervous that I'll get these the wrong way round: the set of (1,0) tensors we're going to call vectors, and the set of (0,1) tensors we're going to call one-forms. _12:58_: This is a slightly unfortunate naming collision, because these are both elements of vector spaces, but it is usual to call this particular set of tensors vectors. _13:15_: So in future, when I talk about a vector space, I mean the very general thing that obeys the axioms of a vector space; when I talk about vectors, I mean one of these. _13:30_: OK, so the point is that for each m and n greater than or equal to zero, there is a set of objects called (m,n) tensors, and each of those sets satisfies the axioms of a vector space. These are two examples of such sets, with special names.

_13:54_: With those definitions: an (m,n) tensor is a function, linear in each argument, in the sense I mentioned before, which takes m one-forms and n vectors as arguments and maps them to a real number.

_14:16_: OK, question: what does 'one-form' mean? Right, and examples? I'll come to examples shortly, because I'm going to stick with the abstract terminology to begin with, and then I'm going to bring in some examples. _14:35_: Question: it takes n vectors and m one-forms, so shouldn't a (1,0) tensor be the one that takes a vector? Ah, that's a good point, yes. That does seem to be the wrong way around. It's not the wrong way round, but it does seem to be, enough that it will cause confusion. _14:54_: And this is why I checked which way round it is; I wrote it down because I always get it wrong. It doesn't actually matter hugely much, but well spotted: it does look as though it's the wrong way around. _15:04_: OK, but I'll come back to that.
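Written out compactly, the definition being assembled here is the following (my notation: Omega for the space of one-forms and V for the space of vectors; the Part 2 notes may use different symbols):

```latex
\begin{align*}
  T:\ \underbrace{\Omega \times \cdots \times \Omega}_{m\ \text{one-forms}}
      \times
      \underbrace{V \times \cdots \times V}_{n\ \text{vectors}}
      \ \longrightarrow\ \mathbb{R},
  \qquad \text{linear in each argument separately.}
\end{align*}
% Special names: (1,0) tensors are called vectors, (0,1) tensors are called
% one-forms, and for each fixed (m,n) the set of (m,n) tensors is itself a
% vector space.
```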
_15:08_: OK, so the point is, it's a function. _15:12_: Now, a function is a machine which turns one thing into another. This function here, f(x) = x^2, is a machine which takes a number and gives you back its square, so it's a function which maps the real line to the real line. _15:38_: That sounds like a slightly babyish way of thinking about functions, but it's a really good way of thinking about functions: functions are machines which take one or more things of one type and turn them into something else. _15:50_: Tensors take one-forms and vectors as input. Think of a machine with holes in the top which are one-form shaped or vector shaped; you turn the handle and out comes a number, not anything else: a number, something in R, the real line. _16:05_: And it's linear, so if you double one of the things you put in, the number you get out is twice what you started with. _16:14_: OK. Any questions? Those earlier ones were really good questions, and the answers are coming soon. Any other questions? OK.

_16:31_: So I think we want some pictures here. Here are some tensors. There's a slightly informal notation I'm going to be using here for a tensor and its arguments, and I'm going to try to be consistent with it. _16:57_: Oh, by the way, I'm going to write vectors typically with an overbar, and one-forms with a tilde. I'll be, well, not absolutely consistent with that, but fairly consistent.

_17:16_: OK. So this is going to be a rather suggestive notation. This tensor T is a (2,1) tensor, which means it takes three arguments: two one-form shaped arguments and one vector shaped argument, and it turns them into a real number. _17:46_: So if we give that tensor three arguments, the answer is a real number. That's what I mean when I talk about the definition. _18:05_: We're keeping this abstract at the moment; examples are coming.

_18:14_: But you can also partially apply things. So this here... a question? You're confused about what it means when you put the dot in? _18:27_: Yes. It's just intended to suggest a sort of empty slot, so the whole thing is a machine with those three holes in the top. It's not a formal notation; it's just to guide the eye. _18:45_: So that's a (2,1) tensor, with two one-form shaped holes and one vector shaped hole. If I put in just one one-form, _18:58_: what I have left is also a tensor. It is a thing which has one one-form shaped hole and one vector shaped hole, a one-form argument and a vector argument, so that S is a (1,1) tensor. _19:11_: So by partially applying, by partially filling in
_19:15_: the arguments of the tensor, we can turn one type of tensor, a (2,1) tensor, into a (1,1) tensor, in this example. _19:23_: We can do that more than once. Here we fill in one of the one-form arguments and the vector argument, and we have something which has a single one-form argument. _19:43_: A thing with just a single one-form argument is a (1,0) tensor: a vector. So we could write this as a vector. _19:56_: We could give names to these other ranks, you know, but we don't. There's no need to give the other things names, because the vectors and the one-forms are the things that we have to give names to in order to state the definition you saw on the last slide. _20:17_: Thank you.

_20:23_: And similarly, if in this example we take T of omega-tilde and sigma-tilde and don't fill in the vector shaped argument, that would be a thing which has zero open one-form arguments and one vector argument. _20:47_: In other words, that is a one-form. _20:54_: Because all a one-form is, is something which has zero one-form arguments and one vector shaped argument, such as that. _21:04_: OK. And as I say, we're keeping this abstract at the moment; examples are coming in a moment.

_21:13_: The last point relevant here: a one-form, as you recall, is a (0,1) tensor. It takes a single vector as argument and gives a number. _21:32_: A vector takes a single one-form as argument and gives a number. Now, there is nothing to say that those are the same number. These are just functions: functions which take this thing and turn it into a number. They don't have to be equal. _21:47_: We are always going to assume that they are equal. _21:50_: You can talk about this sort of stuff without that assumption; it makes things harder, and we don't need extra hardness, so in this context this will always be the case. We'll always impose that constraint. _22:04_: So this acts as a constraint on the functions that we allow here: these one-forms and vectors are not arbitrary functions in that sense; they'll always have that reciprocal property. _22:18_: We'll sometimes write that with these angle brackets. It just means either of those things, and we really use the angle brackets sometimes just to emphasise the symmetry of it. _22:27_: And that's true for all one-form and vector arguments in GR. A mathematician would not have that bit, but for what we're doing, that's always going to be the case. _22:42_: How are we doing so far? _22:47_: This makes a lot more sense the second time round; when you go through your notes afterwards, it will be illuminating.
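In symbols, the chain of partial applications just described, together with the assumed relation between a vector and a one-form acting on each other (the angle-bracket notation), reads roughly as follows; the particular letters are only as legible as the transcript allows:

```latex
\begin{align*}
  T(\,\cdot\,,\,\cdot\,,\,\cdot\,)               &\quad \text{a } (2,1) \text{ tensor}\\
  S \equiv T(\tilde p,\,\cdot\,,\,\cdot\,)       &\quad \text{a } (1,1) \text{ tensor}\\
  T(\tilde p,\,\cdot\,,\bar A)                   &\quad \text{a } (1,0) \text{ tensor, i.e.\ a vector}\\
  T(\tilde\omega,\tilde\sigma,\,\cdot\,)         &\quad \text{a } (0,1) \text{ tensor, i.e.\ a one-form}\\[4pt]
  \langle \tilde p, \bar A \rangle \equiv \tilde p(\bar A) &= \bar A(\tilde p)
  \quad \text{(the assumption made above)}
\end{align*}
```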
_22:56_: This is quite a mathematical way of laying down the definitions here, _23:05_: and I think it's worth stressing that this definition is doing a lot of work. All the things that I've said in the last ten minutes or so are just immediate consequences of this: a tensor is a function, linear in each argument, which takes these arguments and returns a real number. _23:25_: The only thing I've said that isn't an immediate consequence of that is this last point here, which is an extra. _23:34_: And all of it depends on the statement that the tensors are elements of a vector space. So all those other properties are being scooped up, being provided for free in a sense, by that definition. _23:46_: OK. So at this point you may well not have a picture; I don't expect you to have a picture of what tensors are yet. But you already know a lot about them. _23:57_: And that, I think, is where the clarity of a fairly axiomatic approach comes from: when you come back to think it through again, you already know a lot about them, even though you don't have a picture. _24:11_: So these definitions are doing a lot of work. That's key.

_24:23_: And I'll mention just a bit of terminology. I will at this point freely refer to contractions between vectors and one-forms, and when I say 'contraction' I mean just that: applying a vector to a one-form, contracting the vector A with the one-form p. That's what I mean by a contraction. _24:47_: There's more we can say about that, but we're not going to. Oh, a question.

_24:52_: In the example given there, is the q a one-form? Yes, it's a one-form, with a tilde over it. But it's also counted as one of the arguments? Yes, we count those as the m and the n, so that's the m and the n. _25:16_: And that's a (2,1) tensor? It's telling us that there are two one-form arguments and one vector argument, if that's what you meant. _25:26_: OK, yeah, so that's an example of a (2,1) tensor. And the tensor T in both of these cases is a (2,1) tensor, which by partial application we turn into a (1,1) tensor and a (1,0) tensor. OK?

_25:42_: Sorry, what were the other thoughts? Which brackets? And, in practice, the dots? _25:56_: Oh, the brackets here. Right, that's an important question, and it's just picking up that we've said T is a function. So, just as here we write f(x), a function symbol, brackets, argument, because f is a function, _26:22_: that's why we write the tensor that way: it's the same notation. So we're saying that tensors are functions: they are machines which turn one thing into another. _26:34_: In the case of f, it's a function which turns a real number into another real number.
_26:38_: In this case, it's a function which turns a number of one-forms and vectors into a real number.

_26:47_: The dot, or the dots? I think that's a similar question to the one over here. That's just a fairly informal notation, just to guide the eye, to show that this is a thing which has three arguments. _27:01_: And remember I said the idea of a function is of a machine which has multiple holes in the top: you put things in the top, turn the handle, and out comes something else. _27:12_: So in this case that tensor is a machine which has two one-form shaped arguments and one vector shaped argument. You put things in, turn the handle, and out comes the sausage.

_27:31_: Oh yes, that's because one is a one-form shaped argument and one is a vector shaped argument, and the notation there, an informal notation, is intended to show what we have when they're left empty. So we've pre-filled one of the arguments, and we have two empty arguments. _27:50_: Is that what the two behind are? Yeah, they're empty: two empty slots.

_28:03_: Yes, you can do that, because remember that these are tensors as well: that's just a (1,0) tensor, and that's a (0,1) tensor. _28:12_: And I think that's where the upside-downness comes from: a (1,0) tensor takes zero vectors and one one-form as input, and a (0,1) tensor takes zero one-forms and one vector as argument. _28:35_: So that's the answer to the upside-downness that appeared to be present in the definition of tensors. _28:44_: OK. And the question there; I'll call on you next, and then we had better move on.

_28:53_: Two one-forms and the vector, yeah; and I'm assuming those could be any one-forms and vector you like? _29:03_: Exactly, yeah: that's just an attempt to keep consistency; in each case it's the same notion of the same tensor. _29:12_: Yeah, here's a question. _29:17_: When we say... why is it not...? So, T is a (2,1) tensor because it has two slots, two holes, which are one-form shaped, and one which is vector shaped. _29:42_: Ah, but S, yes, true. So for S you've got the machine with one hole filled in. You then have two holes open: one one-form and one vector. And so S is a (1,1) tensor, _30:03_: because S, with that hole filled in, has one one-form and one vector as arguments.

_30:13_: But I should press on. So, a quick question. Given an arbitrary (1,1) tensor T, what's the value of T(2p, 3A)? _30:31_: How many folk, just quickly, would go for the first option? The second? The third? Impossible to say? _30:42_: Some of you haven't given an answer yet. Have a chat to your neighbour.
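A minimal numerical sketch, not part of the lecture materials, of the 'machine with slots' picture being discussed: a (2,1) tensor represented by its components in two dimensions, applied to two one-forms and a vector, and then partially applied. All the component values are made up, and the einsum index conventions are illustrative assumptions rather than anything from the lecture.

```python
# Minimal sketch of a (2,1) tensor as a multilinear machine, in 2 dimensions.
# T[a, b, c] are made-up components; p, q are one-form components; A is a vector.
import numpy as np

rng = np.random.default_rng(0)
T = rng.normal(size=(2, 2, 2))   # a (2,1) tensor: two one-form slots, one vector slot
p = np.array([1.0, 2.0])         # components of a one-form
q = np.array([0.5, -1.0])        # components of another one-form
A = np.array([3.0, 4.0])         # components of a vector

# Fill all three slots: two one-forms and one vector in, one real number out.
number = np.einsum('abc,a,b,c->', T, p, q, A)

# Fill only the first one-form slot: what is left is a (1,1) tensor S,
# a machine with one one-form slot and one vector slot still empty.
S = np.einsum('abc,a->bc', T, p)
assert np.isclose(np.einsum('bc,b,c->', S, q, A), number)

# Linearity in each slot: doubling one input doubles the output.
assert np.isclose(np.einsum('abc,a,b,c->', T, 2 * p, q, A), 2 * number)
print(float(number))
```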
_31:03_: And before going on: what's the rank of the object 2T(p, · )? What's the rank of that object? Have a think. And if S is a (0,2) tensor and S(A, B) is 5, what is S(B, A)? Have a chat about each of those questions.

_31:40_: OK. So, given an arbitrary (1,1) tensor T, what's the value of T(2p, 3A)? Who went for the first option? The second? The third? The fourth? _32:00_: I can't see everyone, and I don't think we got everybody putting their hands up, but I'll let that pass for now. _32:08_: It's this one, because, since T is linear in each argument, T(2p, 3A) is 2 times T(p, 3A), which is 2 times 3 times T(p, A). So two times three is six: it's 6 T(p, A), just because T is linear in each argument.

_32:37_: What's the rank of the object 2T(p, · )? Everyone shout out. _32:49_: Yes: it takes one vector as argument, so it's just a one-form.

_32:56_: And if S is a (0,2) tensor, and S(A, B) is 5 for a given A and B, what is S(B, A), with the arguments swapped round? Who says it's five? Who says it's minus five? Who says it's impossible to say? Who hasn't put their hand up yet? _33:20_: Who says five? Who says minus five? Who says it's impossible to say? _33:27_: It is impossible to say, because you don't know what the function is. It's a function which will turn two vectors into a number; I've told you that much, so the answer when both slots are filled in is a number. But I've said nothing about what this machine does to the two vectors. Maybe it ignores the second argument.

_33:56_: OK. So, if it is the case that S(A, B) and S(B, A) are equal for all A and B, then the tensor is known as symmetric. _34:08_: If it is the case that S(A, B) is equal to minus S(B, A), then the tensor is known as antisymmetric. But for most tensors, it's whatever it is. _34:19_: We won't see many tensors which are antisymmetric; we're mostly going to be dealing with symmetric tensors. _34:27_: But the distinction is important, and the fact that this machine does something to its arguments doesn't tell you what it does. _34:35_: OK? Anyone uncertain about that?
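The three quiz answers, written out (again in my notation, with p a one-form and A, B vectors; a generic (0,2) tensor need be neither symmetric nor antisymmetric, which is why S(B, A) could not be deduced from S(A, B) = 5 alone):

```latex
\begin{align*}
  T(2\tilde p,\, 3\bar A) &= 2 \cdot 3\; T(\tilde p, \bar A) = 6\, T(\tilde p, \bar A)
      && \text{(linear in each argument)}\\
  2\,T(\tilde p,\;\cdot\;) &: \text{one open vector slot, so a } (0,1) \text{ tensor, i.e.\ a one-form}\\
  S(\bar A, \bar B) &= S(\bar B, \bar A)\ \ \forall \bar A, \bar B
      && \text{(symmetric)}\\
  S(\bar A, \bar B) &= -\,S(\bar B, \bar A)\ \ \forall \bar A, \bar B
      && \text{(antisymmetric)}
\end{align*}
```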
_34:42_: You will have noticed that I put these slides up generally a little while after the fact: I don't put the slides up beforehand, but after each part is finished I put them in the lecture notes folder on the Moodle. _34:58_: And, just so it's explicitly said: I had aspired to get the audio recordings up before now. That hasn't happened, but I anticipate it this week, so they should be up soon. _35:15_: OK. Right.

_35:21_: The last thing about vectors is one more definition (examples are coming in a moment): the idea of the outer product, this circled cross. _35:41_: So, given two tensors of whatever rank you want, but let's in this example use two vectors, the outer product of these two vectors, V ⊗ W, is a tensor, which already tells you a lot about it, and it has two arguments. _36:02_: And the action of this outer product on those two arguments is the action of the first vector on the first argument times the action of the second vector on the second argument. Those are just real numbers multiplied together. _36:17_: So that 'times' there is just an ordinary times, the times you learned about in primary school; it's not an exotic times. The outer product is the exotic-looking thing; the other one is just the ordinary product of real numbers. _36:32_: I'm not going to mention outer products again for some weeks, but this is the place to introduce the term.

_36:44_: So, finally, some examples. _36:53_: This is an example of a vector: a nice simple vector in the plane. It's a vector which has a direction and a length. _37:07_: And the space it lives in is spanned by two basis vectors, e_1 and e_2: that's e_1, and that's e_2. _37:16_: And as you know (I'm stressing this just to reassure you that stuff you know is still true), the vector A can be broken down into components: it's some number, a real number, times one basis vector, plus (and that's a vector addition, because they're both vectors) some other number times the other basis vector. _37:41_: Nothing exotic there. What is slightly exotic is the placement of these indexes, A^1 and A^2, and they are there because these are the components of a vector. We'll see why that notation matters in a moment. So that's just a vector.

_38:04_: And we can write that as a column, (A^1, A^2), which is how you will remember writing a column vector in school. _38:29_: We can also imagine, in this context, a one-form being a row vector. _38:43_: And you may be familiar with the idea of row vectors and column vectors and turning one into the other and all of that stuff; in this context I am distinguishing vectors and one-forms. These two things exist in different spaces: you have to do something to turn one of them into the other. _39:03_: We can contract these quite straightforwardly, and the contraction of p and A (and apologies, there's a slip in the indices on the slide there) _39:31_: works like this: the components of one-forms are labelled with lowered indices, p_1 and p_2, and the contraction, as you know, is going to be p_1 A^1 plus p_2 A^2. _39:46_: And since these components are numbers, and that's ordinary, simple numerical multiplication, that is a number. In other words, the contraction of this one-form and this vector is a number.
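In symbols, the two things just introduced (with V, W, A vectors and p, q one-forms; the two-dimensional component expressions follow the slide):

```latex
\begin{align*}
  (\bar V \otimes \bar W)(\tilde p, \tilde q) &= \bar V(\tilde p)\,\bar W(\tilde q)
      && \text{(outer product: ordinary multiplication on the right)}\\
  \bar A &= A^1 \bar e_1 + A^2 \bar e_2,
  \qquad
  \tilde p = (p_1,\ p_2)\\
  \tilde p(\bar A) &= p_1 A^1 + p_2 A^2
      && \text{(the contraction: a real number)}
\end{align*}
```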
_40:03_: And our choice here _40:07_: is to define the contraction in this way. _40:12_: Now, as you can see here, that's p applied to A. We ask: what's A applied to p? _40:23_: Well, we can't really use our experience of row vectors and column vectors to answer that question, but of course we don't have to, because we know it's p applied to A. _40:37_: So we define that, A(p), to be the thing that we've just defined here, the contraction. _40:48_: OK, so that's an example of all the things that were mentioned up to this point. These are both elements of vector spaces: you can add two column vectors together and get a column vector, you can add two row vectors together and get a row vector, etcetera etcetera; all the other axioms apply; and we have defined a contraction. And so on.

_41:16_: And we can also define... ah, a question. _41:24_: In this case, why keep them separate? That is because, when you learn about vectors, row vectors and column vectors and all that vector stuff, you are told that a row vector is just a column vector turned on its side, and the tools for going from one to the other are very straightforward. _41:44_: In this case we could do that, but we're keeping them as separate things. So I haven't said how you get one from the other. _42:03_: I have said nothing about how you get that from that; that's a completely different thing, because I have chosen not to. _42:11_: So the difference is that we have abstained from the thing that we learn about in school, which is how you go from one to the other; there, we implicitly said that the two things are the same sort of thing. We are holding back from that and saying these are separate things. _42:33_: But I have linked them, by this process of defining the contraction between them.

_42:39_: And I can also define a tensor, which in this example is a square matrix. _42:58_: And now I can apply that tensor to a vector and get another vector, et cetera: the usual things. _43:17_: So the point here is that this tensor is definable in this way. It's a thing which takes a single one-form and a single vector: we could left-apply a row vector to it, and we could right-apply a column vector to it. _43:36_: And if we take p = (p_1, p_2), the matrix T, and the column (A^1, A^2), we can left-apply the row vector and right-apply the column vector and get a number. So this works; that is a number. _43:59_: And underlying that, if we partially apply it, if we give this tensor, this (1,1) tensor, just one thing, what we get is another vector, _44:08_: which is a thing which takes a one-form as argument and gives a number.
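As a minimal numpy sketch of this familiar example (made-up components, and only the finite-dimensional, component picture of the abstract definitions above): a row vector plays the one-form, a column vector plays the vector, and a square matrix plays a (1,1) tensor.

```python
# Minimal sketch: row vector = one-form, column vector = vector,
# square matrix = (1,1) tensor.  All component values are made up.
import numpy as np

A = np.array([3.0, 4.0])        # vector components A^i (a "column vector")
p = np.array([1.0, -2.0])       # one-form components p_i (a "row vector")
T = np.array([[2.0, 0.0],
              [1.0, 5.0]])      # (1,1) tensor components

# Contraction of the one-form with the vector: a single real number.
print(p @ A)                    # p_1 A^1 + p_2 A^2

# Fill both slots of the (1,1) tensor: row . matrix . column gives a number.
print(p @ T @ A)

# Partial application: fill only the vector slot.  What remains is a vector,
# i.e. a thing still waiting for one one-form argument.
print(T @ A)
```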
_44:13_: And similarly, if we left-apply a one-form, we also get another one-form, which is a thing which takes a vector as argument. _44:22_: So this stuff, row vectors, column vectors, matrices, which you're very familiar with, has been an example of three different types of tensors, which you didn't realise when you learned about it at school. _44:37_: So these ideas are not exotic; the terminology is exotic. And we're going to use the power of that terminology, and the power of those definitions, fairly freely hereafter. _44:50_: But the things that are examples of them are not exotic. They are things you are familiar with.

_45:01_: Another point here, and this will matter a lot in the next part, is that I want you to end up holding on to the idea of a vector as being two things. _45:16_: Hold on to the idea that it's a pointy thing, because that's a good thing to hold on to: it's a clear idea in your head, and you can keep hold of that as a notion. _45:28_: But I also want to stress that you must also be able to think of vectors as functions: think of a vector as a function which takes a one-form as argument and returns a number. _45:53_: So keep these two pictures in your head, swapping in and out as we go on, because they're both right. OK.

_46:07_: And, I jumped ahead of myself: so this shows that if we apply a one-form to our tensor, we get a thing which is another one-form, in this example. OK.

_46:32_: In general, we'll also talk about fields. In this context, a field is (and if there are any mathematicians in the audience, you might want to close your ears now) a thing which takes different values over a space. _46:52_: So a scalar field is something which takes different values in, say, the 3D space of this room. The pressure in the room is a scalar field: at different points of the room the air pressure is different, and it's just a number. _47:06_: A vector field is something which has different vector values at different points in the space. So the magnetic field of the Earth is a vector field: at different points in 3-space it takes different values, and at each of those points it has a vector value, a direction and a strength. _47:27_: OK, that's what 'field' means in this context.

_47:34_: And we can then go on to visualise these things. Vectors first: there's nothing complicated about visualising a vector; it's a pointy thing. _47:49_: But we're also going to visualise one-forms.
_47:57_: I'm saying this to help you: not so much because you need the picture in order to actually calculate with them, but because a good way to visualise one-forms is, well, in 2D, as a set of lines. _48:08_: They have a direction: this vector has a direction and a length; this set of lines has a direction, and it has a sort of pitch. So there is a magnitude to it, too. _48:31_: In this picture, that is a one-form which is in the same direction but has a higher magnitude: the lines are closer together. _48:40_: Why closer together? Because, in this picture, we can imagine the contraction of a vector and a one-form as laying the vector across the one-form's lines. _48:58_: This vector crosses two lines, it goes from one crossing to another, and so it gives an answer of two: the contraction of that one-form and that vector is 2. _49:07_: Against this denser one-form it crosses one, two, three, four, so it's 4: the contraction of the same vector with the larger one-form gives a different number.

_49:18_: Can you tell the direction of a one-form from this picture? Right, because it could point either way, yes. _49:24_: This is not a mathematically very precise picture. I think that when we think of vectors it's useful to have this notion of an arrow; you're just trying to get things into your head. And when I talk about one-forms it's useful to have a picture like that; maybe you could colour one side of the lines to fix the direction.

_49:49_: What if it ends somewhere in between two lines and one? Yes, so, obviously, I think we don't want to overstress the precision of this. That's just a picture that's useful. _50:01_: And it also means that things like this make sense: there's a one-form field. _50:12_: And it's like the gradient lines, the contour lines, on a map. So the gradient turns out to be modelled as a one-form. In this case 'uphill' has a direction, and the closer these lines are together, the steeper the slope is. _50:33_: And if you move like that, or like that, or like that, in each case you've crossed the same number of contour lines, and so you've risen the same amount; you've climbed the same height. _50:47_: So in that sense the contraction of each of those three different vectors with the one-form field is the same number.

_50:55_: So when I talk about vectors, think of pointy things; when we talk about one-form fields, think of the contour lines on a map. _51:05_: And with that picture, I'll let you go. I think we've done not too badly in getting just about through this part. I'll see you again on Friday, I think. Not in this room.
_51:17_: OK, all the other lectures are in this room, in other words.