Transcript of gr-sup1 ========== _0:07_: Well, folks. This is the first of, I think, two supervisions, _0:14_: but there's also a tutorial. _0:16_: It's never been clear to me what the difference is between these things, _0:19_: because it's not as if we do anything particularly different in each of them. _0:23_: I have at various times tried to think whether there is an exciting thing we could do in this context, _0:31_: and I've never really thought of one that convinced me. _0:36_: Because the point of all those exciting things to do in a tutorial or supervision _0:40_: is to find some way of encouraging a conversation. _0:44_: The basic idea is that there are things you don't understand; _0:50_: it's good to ask, and I can try and sort out those problems, which might be from the last lecture, _0:55_: or might be from lecture 1, because there might just have been a block at the very beginning of the whole thing, _1:00_: or maybe there's a little bit of tuning at the end that you just smoothed over. _1:06_: So this is the latest version of doing the useful thing here, _1:11_: which is that I wanted you to put up questions on the padlet, and so thank you to those who did that. _1:20_: There aren't any likes on those, though; no one liked any of them, which is a pity, _1:25_: because in this context that's quite a useful button: _1:30_: it says, this is an idea, I have this question too. _1:32_: So maybe when we do this for the next supervision or the tutorial, I'll encourage you again to do that, _1:38_: because I think they are a useful way of proceeding: _1:40_: it's always easier to ask a supplementary question than it is to ask the initial question with your hand up. _1:47_: So I think the way to proceed is to talk about these questions here, which I think _1:58_: illustrate quite small but important road bumps, _2:06_: and I hope we'll use those as a way of asking further questions, digging into other things that are a bit obscure. _2:14_: And if you have any, feel free to add things to the padlet even while we're talking, or just put your hand up and ask a question the old-fashioned way. _2:25_: OK, let's go through those: vectors and one-forms. _2:30_: Yes, and thinking about this, the way that I introduce vectors, one-forms and tensors does seem a bit self-referential. _2:39_: It seems as if it's building on itself before it has the opportunity. _2:45_: And I think that's partly because the definition _2:54_: of a tensor, as something which takes one-forms and vectors as arguments and so on, _3:03_: is presented to you before I've said what vectors and one-forms are. _3:10_: So it feels as if there's an inconsistency here, something missing in this definition, and I _3:15_: think it's just a matter of terminology.
_3:21_: I think it's just a matter of terminology, so let's go through that, _3:25_: in an order which doesn't confuse things. _3:31_: Can we see that? Nope. _3:41_: Oh God. _3:44_: So let's see. I think the first thing _3:51_: is to say that there exist sets of _3:58_: things which we'll call tensors. _4:01_: And there are multiple sets of these tensors, _4:04_: which have the structure, first of all, that they form vector spaces: _4:10_: each of those sets of tensors, labelled by a pair of numbers M and N, _4:29_: is a vector space. _4:31_: And if you remember from the beginning of Part 2, simply by saying that, _4:37_: I've said quite a lot about what these things called tensors are. _4:42_: I've said you can add them together to get another tensor, _4:47_: you can multiply by a scalar, _4:49_: there exists a unit... no, there's an inverse, and there exists a zero. _4:55_: So I've said a lot about these things just by writing that down. _5:01_: And I'm saying there are multiple sets of these, so for each of these M, N, _5:06_: and I haven't said what M and N are yet, _5:08_: this thing called a tensor space is a vector space. _5:13_: And then, for reasons which will emerge as important later on, _5:18_: I'm going to give special names _5:22_: to two of these spaces. _5:28_: (1,0) tensors _5:36_: are called vectors, and (0,1) tensors are _5:43_: called one-forms, and I'm doing nothing other than giving a conventional name _5:48_: to those two vector spaces. Oh, and _5:56_: (0,0) tensors, _6:01_: the members of the (0,0) vector space, _6:07_: we'll call scalars. _6:10_: We've done no maths there. _6:12_: We've just labelled things. _6:16_: And it's at this point that we can, in a sense, _6:20_: legitimately give the definition that we sort of started off with in the notes: that a tensor _6:30_: is a function. _6:35_: OK. So at that point I'm already giving quite a lot of information: it is a _6:43_: thing which takes things and turns them into other things, and that sounds like _6:48_: another baby definition of a function. _6:50_: It sounds trivial and silly, but it's a very general thing. _6:53_: We are used to thinking of functions; _6:57_: we learn functions in school. _6:60_: They are things like x squared: you take a number and it comes _7:06_: out as another number. _7:08_: And so it's easy to fall into the rut of thinking of a function as _7:12_: something which manipulates numbers. _7:14_: But a function in mathematical terms is a thing which maps things to other things. _7:18_: So it's a very general notion. _7:20_: So we're saying a tensor _7:23_: is a function. _7:27_: Components: _7:31_: the question of how to write the components of a tensor in terms of the metric is further down the road here. _7:36_: We've got several... a couple of chapters, or half a chapter, to go forward before we get to that.
_7:45_: So we haven't got nearly as far as talking about components yet, _7:48_: but we know that there will be components. _7:54_: Yes. In Part 2 we go on to talk about components quite quickly, _7:59_: and the fact that we are saying this is a vector space means that we're _8:03_: importing all of that technology. _8:04_: So we know that we will at some point be talking about components and bases; _8:10_: there will be a basis which spans each of these spaces. _8:17_: So yes, we will be talking about components eventually, because we've talked about vector spaces, _8:24_: but that's still a little while away, because we haven't yet really got a notion of what a tensor is. _8:34_: It's important to be able to do that, but not yet. _8:40_: So a tensor is a function, _8:41_: which is something that turns things into other things. _8:43_: So the question is: _8:44_: what does it turn into what? _8:50_: And the domain... _8:53_: sorry, the range, of all tensors _9:00_: is the real line. _9:04_: So from all the things that a tensor could be, _9:07_: we're now down to things which _9:11_: map to the real line. _9:12_: We map to R, not R to the n, just R. _9:15_: So a tensor is a thing which takes things _9:18_: and turns them into numbers. _9:20_: What's the domain of tensors? _9:30_: The domain of an (M,N) tensor _9:40_: is, and I'll write it this way so that I can _9:48_: link it to that question, _9:55_: a product of copies of the set of one-forms crossed with copies of the set of vectors. _10:20_: Now, I live in perpetual terror of getting these the wrong way around, _10:25_: and I may correct myself in a moment. _10:29_: So that's a formal way of saying that the _10:39_: tensor not only has the real line as its range, _10:44_: but that the things that go into it, the things it turns into a real number, are _10:51_: these things we've decided to call one-forms and vectors. _10:57_: You said that vectors and one-forms are just _10:60_: functions that take something into a real. _11:02_: Yes? But how can you have a tensor _11:05_: which is something that takes a real into a real? _11:12_: It's very weird that you say that _11:14_: the domain is one-forms, _11:16_: which are, in the end, real numbers, _11:17_: but you can't put a scalar into it. _11:23_: Ah, if I'm following you: this isn't the result of the _11:31_: application of the _11:34_: tensor, it's the function itself. _11:38_: This is why I'm being quite careful. This is a function which takes _11:42_: functions as arguments. _11:45_: That's why I was being _11:48_: very general about things and things: _11:52_: they only become scalars when you apply them to something. _11:56_: So what this means: _11:60_: a (1,0) tensor is called a vector. _12:02_: By this definition, _12:03_: that means that a (1,0) tensor _12:10_: is something which maps, _12:13_: in rather informal notation, _12:18_: the set of (0,1) tensors,
_12:26_: that is, the set of one-forms, to the real line; that's what we call a vector. _12:34_: So a tensor is something which maps: _12:35_: it's not the result of applying that function, _12:39_: but the function itself. _12:42_: That sounds very strange: _12:46_: a function which takes functions as arguments. _12:48_: But that, in a sense, is why we introduce... _12:52_: it occurs to me only now, talking about it, that perhaps this is why we introduce this _12:56_: terminology of vectors and one-forms _12:59_: quite promptly: because we want to _13:02_: immediately start thinking of those vectors and one-forms as pointy things. _13:07_: So in this sense, _13:09_: what the tensor is doing is taking... _13:15_: yes, that's why we jump from _13:18_: this rather formal way of thinking about it _13:20_: to the terminology where we talk about vectors and _13:24_: one-forms, because at this point we _13:26_: want to immediately start thinking _13:27_: of those as geometrical objects. _13:29_: So in that case a vector is something which takes _13:35_: a vector... no, a one-form, sorry, _13:38_: into a number. _13:52_: Does that... _13:54_: you're still not looking _13:55_: totally convinced, because you _13:56_: said that it takes a function. _13:58_: Yeah, but a scalar is a function. _13:60_: But can you put a scalar _14:02_: in as an argument? No. _14:04_: A (0,0) tensor is a function, _14:08_: yes, but can you put it in as _14:09_: an argument to a tensor? _14:10_: You can't, no. _14:12_: So, sorry, yes: _14:14_: this definition of tensor _14:16_: says that we're going to restrict _14:19_: ourselves. Tensors don't _14:22_: take arbitrary tensors as arguments. _14:24_: We've said not only that the _14:27_: range of the tensor is the real numbers, _14:31_: but that the _14:32_: arguments of tensors are going _14:34_: to be not just any old tensors, _14:36_: but specifically _14:39_: one-forms and vectors. _14:41_: So of all the things you could imagine, _14:43_: you don't get to give a (2,0) tensor _14:46_: as an argument to a tensor. _14:49_: We say so; we declare it. _14:51_: It could be otherwise: _14:52_: you could imagine a structure, _14:54_: and quite probably there are, in the heads _14:56_: of mathematicians, structures where _14:58_: you have what look like tensors _15:00_: which can take _15:01_: other tensors as arguments. _15:06_: We don't do that, _15:07_: so this is quite _15:10_: a narrow definition. _15:11_: We've narrowed this general definition a lot, _15:14_: first by restricting the range to the real line, _15:17_: secondly by restricting the arguments _15:20_: to only vectors and one-forms. _15:23_: And we have divided up the _15:27_: tensors into things which take _15:31_: one number of one-forms and another of vectors, _15:34_: so the (2,0), _15:40_: (2,1) and (0,2) tensors _15:43_: are all separate vector spaces. _15:45_: In principle, _15:46_: they have nothing to do with each other. _15:50_: At this point I'm worrying that I'm _15:53_: starting to talk around in _15:55_: circles, and that I'm making it _15:57_: more confusing than it is.
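A minimal sketch, in Python, of this "functions that take functions as arguments" idea. The arrays of numbers below are not part of the definition; they are just an assumed, purely illustrative way of getting concrete one-forms and vectors to play with.

    import numpy as np

    def make_one_form(w):
        """A (0,1) tensor: a linear function eating a vector and returning a real."""
        w = np.asarray(w, dtype=float)
        return lambda vec: float(w @ np.asarray(vec, dtype=float))

    def make_vector(v):
        """A (1,0) tensor: a function whose argument is itself a function (a one-form)."""
        v = np.asarray(v, dtype=float)
        return lambda one_form: one_form(v)

    p = make_one_form([1.0, 2.0, 0.0])   # p: vectors -> R
    V = make_vector([3.0, -1.0, 4.0])    # V: one-forms -> R

    print(p([3.0, -1.0, 4.0]))   # 1.0, the one-form applied to an ordinary vector
    print(V(p))                  # 1.0, the (1,0) tensor applied to the one-form itself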
_15:58_: I think the thing I want to get over is that, yes, _16:01_: the definition as presented in the notes does appear to be _16:06_: uncomfortably self-referential, _16:08_: and it seems to start talking about _16:10_: vectors before we say what vectors are. _16:13_: And I think what this might _16:18_: illustrate is that if you try to avoid that _16:22_: by avoiding the word vector until _16:24_: a little bit later, _16:26_: you end up with an explanation _16:27_: which is possibly, you know, _16:30_: a little more formally correct, _16:31_: but a bit more confusing. _16:36_: Does that feel better, in a sense? _16:44_: I'm not seeing any nods or _16:47_: shakes of heads or tears, so... _16:52_: Since I mentioned this situation here with _16:56_: the outer products, _16:58_: I think it's fairly natural to go on and _17:01_: talk about the tensor product here, _17:04_: because again, that's another _17:06_: thing that I think looks more _17:09_: exotic than it really is. _17:13_: It's a way of _17:17_: building up a tensor from other tensors. _17:21_: And the outer product, _17:23_: or the tensor product, appears in _17:25_: multiple contexts in mathematics. _17:26_: It basically means just _17:28_: putting two things together. _17:29_: In most of the cases where you form _17:31_: an outer product, _17:33_: you're just jamming things together; _17:35_: you're not interleaving them, _17:36_: you're not doing anything complicated; _17:37_: you just take the two halves, put them _17:40_: next to each other, and deal _17:42_: with the pair together. _17:43_: That's the intuition that often _17:45_: lies behind the notion of the tensor _17:48_: product, or outer product, in mathematics. _17:50_: So in this case, _17:57_: suppose we had _18:01_: the function f(x) = x squared, and _18:06_: the function g(x) equal to _18:11_: 2x + 1, for example. _18:16_: Each of those is a function which maps, _18:19_: and I'm making this up as I go along, so I may tie myself in knots, _18:23_: the real line to the real line. _18:24_: Nice simple functions, _18:25_: nothing exotic about them. _18:29_: But what about the function f _18:32_: outer product g, as a function? _18:36_: What does that mean? _18:40_: We're going to say that the definition of _18:42_: that outer product, in this context _18:45_: of functions, is that it _18:51_: is a function which has two arguments, _18:57_: and the definition _19:02_: of what this function does to _19:04_: those two arguments is straightforward. _19:06_: It is defined as the first function _19:10_: applied to the first argument, _19:14_: multiplied by the second function _19:16_: applied to the other argument, _19:19_: which in this case is x squared _19:22_: times 2y + 1. _19:26_: And there's nothing more to it than that. _19:30_: So the outer product in this context, _19:32_: when you apply it to functions, _19:33_: is just a way of combining them, _19:35_: because there are multiple ways you can combine _19:38_: functions, you know. You could talk about _19:41_: f composed with g, f ∘ g, _19:45_: of x, which is f applied to g of x, _19:49_: and so on. _19:51_: So there are multiple ways you can _19:53_: combine functions into other functions. _19:54_: Have you seen that notation, f ∘ g?
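A minimal sketch of the example just described, assuming nothing beyond the two functions defined above:

    # The outer product of two ordinary real functions f(x) = x**2 and
    # g(x) = 2x + 1 is the two-argument function (f ⊗ g)(x, y) = f(x) * g(y).

    def f(x):
        return x ** 2

    def g(x):
        return 2 * x + 1

    def outer(f, g):
        """Return the two-argument function (x, y) -> f(x) * g(y)."""
        return lambda x, y: f(x) * g(y)

    fg = outer(f, g)
    print(fg(3.0, 1.0))   # 9.0 * 3.0 = 27.0, i.e. x**2 * (2y + 1) at x = 3, y = 1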
_19:57_: Yeah. So this is just one of the ways you _20:00_: can combine functions, _20:03_: and it means nothing more than that. _20:05_: You make a two-argument function _20:09_: from two one-argument functions, _20:10_: specifically _20:11_: by applying the first function to the first argument _20:16_: and the second function to the second argument, _20:22_: and multiplying the numbers together. _20:24_: And similarly, that means that if you have a _20:29_: vector V, _20:30_: which is a thing that takes a one-form as argument, _20:35_: outer product with p̃, for example, _20:39_: which is a one-form and takes a vector as argument, then V ⊗ p̃ is a function _20:48_: which can take _20:52_: a one-form and a vector as arguments. _20:56_: Let's call those q̃ and A, say, _21:00_: and it is such that, by definition, _21:05_: we apply the first function _21:07_: to the first argument, _21:09_: the second function to the _21:12_: second argument, and, _21:14_: since these are both numbers, multiply them together. _21:18_: So we're not doing anything there _21:20_: except what we did in the previous case, _21:21_: but now with the specific functions _21:24_: which are vectors and one-forms. _21:27_: And you can do this, though we won't, _21:29_: with higher-order, _21:32_: higher-rank tensors as well. _21:34_: So you could have V ⊗ T, and _21:37_: if T is a two-argument tensor, _21:39_: then V ⊗ T would be a _21:41_: three-argument tensor. _21:43_: So this is a way of combining _21:44_: two rank-one objects _21:46_: into a rank-two object. _21:50_: And so what? _21:55_: What's happening here with the outer product is _21:59_: also the idea of, you know, _22:01_: jamming multiple spaces together, _22:04_: which then become the domain _22:06_: of the (M,N) tensor. _22:10_: So it's the same notion, _22:12_: a sort of straightforward jamming-together. _22:19_: So if we look at that _22:22_: particular question there: _22:25_: what we're doing here _22:29_: is just what we did above, _22:34_: but for the special case where the _22:36_: vectors we're talking about are... _22:37_: why is that blue, I wonder? _22:41_: Oh. The special case where the vectors _22:45_: in question are the basis vectors. _22:48_: So here we have this T. _22:51_: Now, there's a couple of things _22:55_: going on here, one of which is _22:56_: the question of components. _23:04_: This object e_l ⊗ e_m _23:13_: ⊗ e_n, that's a function _23:19_: which takes what as arguments? Someone? _23:27_: Three one-forms, yes. So let's call them _23:33_: p̃_1, p̃_2 and p̃_3, and the _23:40_: value of that function... it is therefore _23:48_: a (3,0) tensor, just purely because _23:52_: it takes three one-forms as arguments; _23:54_: that's the definition. And _23:57_: the value of that function _23:59_: applied to those arguments is e_l _24:01_: applied to p̃_1, times e_m applied to p̃_2, _24:07_: times e_n applied to p̃_3. _24:15_: But we're doing a little bit _24:16_: more here, by saying which tensor we actually want.
_24:20_: So we have defined a tensor there, _24:23_: a (3,0) tensor, for _24:27_: every possible value of l, m and n. _24:30_: The tensor that we actually _24:31_: want in this case _24:34_: is a linear combination of those. _24:39_: So it's e_1 ⊗ e_1 ⊗ e_1, plus _24:43_: some multiple of e_1 ⊗ e_1 ⊗ e_2, plus _24:48_: e_1 ⊗ e_1 ⊗ e_3, and so on. _24:51_: So it's a large number of terms. _24:55_: Oh, sorry, the way it is in the notes is different. _25:00_: That's OK. OK, switching, _25:05_: switching to what's written there in the first place: T^{lm}_n, _25:12_: so T^{lm}_n e_l ⊗ e_m ⊗ _25:16_: ω̃^n. _25:18_: And that's a linear _25:20_: combination of n times n times n terms. _25:24_: Where are they? _25:28_: Yeah, so there are n _25:32_: times n times n terms in that sum; _25:44_: a sum is intended in that expression. _25:44_: And that means that when you _25:46_: apply T to things, _25:54_: we just drop the arguments, _25:57_: the one-form and vector arguments, into the _26:01_: argument positions of each of _26:04_: these things, take the straightforward _26:07_: real-number products of _26:08_: the results, and add _26:10_: the results up. And in this case, _26:12_: there are two things happening here. _26:14_: First of all, in the top line, we're _26:17_: defining this tensor as the sum _26:20_: of n by n by n outer products, _26:24_: with a different coefficient, _26:26_: a different multiple, for each of _26:29_: those, which we can choose. _26:38_: And yes, I mean _26:41_: the same thing in both _26:42_: cases; it is _26:43_: the same thing, yes. _26:45_: So, rather than writing it out, _26:48_: and to slightly avoid confusion, _26:54_: I'm changing my mind _26:55_: between writing the cross, ⊗, and not; _26:57_: it looks a bit like an x. So yeah. _26:60_: So there are two things happening here. _27:03_: First, the product _27:05_: in most places is just simple multiplication _27:09_: of real numbers, _27:10_: so there's nothing _27:12_: exciting there; that's just 2 times 3. _27:15_: OK. Over there, _27:19_: that's our sum of multiple terms, with _27:23_: coefficients given by this matrix here. _27:25_: Essentially, _27:25_: we'll call it a matrix for the moment. _27:29_: These are the elements _27:32_: T^{lm}_n, with the summation convention. _27:36_: Well, right there _27:41_: we're thinking of it as a matrix; what it really is, _27:45_: we're going to discover when we... _27:48_: OK, so temporarily we're going _27:50_: to think of it as just a matrix. _27:53_: All that is, is a matrix _27:56_: of numbers, n by n by n numbers. _27:59_: And then when we ask what is the value _28:03_: of that tensor, what is _28:05_: the number to which this _28:08_: evaluates, _28:11_: you know, we just drop in _28:15_: one-forms and vectors. If we specifically drop in _28:19_: basis one-forms and basis vectors, then, _28:22_: by the definition of the outer product, we get this. _28:27_: Well, these are simple _28:30_: numerical applications. _28:33_: The next step here is to recall _28:37_: that our choice _28:38_: of one-forms, our choice of basis one-forms, _28:42_: is always going to be dual _28:47_: to our basis vectors, meaning that we _28:50_: choose that the basis one-forms
_28:56_: in this situation _28:57_: will always evaluate to the Kronecker delta: _29:02_: ω̃^i(e_l) is 0 unless l is equal to i, _29:05_: when it will be 1. _29:08_: So we can replace each of these terms _29:10_: by δ^i_l, δ^j_m, δ^n_k. _29:14_: At which point we can do the three sums. _29:17_: So if we now do the sum over, _29:21_: well, say l, _29:24_: so we add... there is a _29:25_: summation sign in _29:28_: front of that which sums l, _29:33_: m and n from 1 to n. When we do that _29:37_: for l, that term there will be 0 _29:41_: except when l is equal to i. _29:44_: So the only term that will survive from that _29:46_: sum is when l is equal to i. _29:50_: We do the sum over m, exactly the same: _29:53_: the only term that survives is when m is _29:55_: equal to j. And we do the sum over n: _29:58_: the only term that survives is the _29:60_: one where n is equal to k. _30:02_: So we discover, _30:04_: by virtue of the definition _30:06_: of the outer product, _30:08_: that when we do specifically this, and _30:10_: give not just arbitrary one-forms _30:13_: and vectors as arguments to the tensor, _30:16_: but specifically basis vectors and basis one-forms, _30:19_: what we get out, turning the handle, _30:21_: what we get out is this: _30:23_: the matrix we started off with, _30:27_: up there. And it's at this point we say, oh, _30:31_: that's not just a matrix: _30:33_: we're going to see that that's _30:35_: the components of the tensor. _30:36_: And in that context _30:39_: we write this matrix carefully with _30:42_: the indexes staggered. And so, although we started off saying, well, _30:47_: we've discovered that this _30:49_: set of numbers actually has a... _30:52_: we can think of it in a _30:54_: slightly enriched way, by thinking _30:57_: of it as the components, _30:59_: because that's what they are, _31:02_: of the tensor. _31:03_: So all tensors will have components. _31:07_: For any tensor _31:09_: that you do this to, _31:11_: you could do the same thing _31:13_: and get that answer there. _31:15_: So this doesn't depend on that: _31:20_: that's true for any tensor. _31:23_: What's special about this _31:24_: is that it's a particular way of _31:27_: building up a higher-rank tensor, _31:29_: which we can see to be consistent, _31:30_: because when we do this to that tensor, _31:33_: what we get out is what we started with. _31:36_: So there's more to it than that. _31:40_: Now let's see, can I write this the right way around? _31:45_: Not all tensors are the outer _31:48_: product of lower-rank tensors, _31:51_: but all tensors can be decomposed into _31:55_: a sum of outer products like this. _32:00_: Below equation 2.9, _32:01_: but this is a sum of n by _32:06_: n by n terms, because these are just _32:11_: indexes which run from 1 _32:14_: to n. So, if you recall, _32:24_: when we write the components T^{lm}_n, _32:32_: and T = T^{lm}_n e_l ⊗ e_m ⊗ ω̃^n, _32:38_: what we mean is a sum over _32:43_: l = 1 to n, and so on. _32:52_: OK, that's confusing, right? T _32:60_: ^{lm}_n e_l ⊗ e_m ⊗ ω̃^n; _33:05_: so remember that _33:13_: the summation convention is _33:15_: taking the summation for granted, _33:17_: so there are n... ah, I see, yes. _33:22_: So the problem is I've _33:23_: used n in two different ways. _33:33_: I've used n in two different ways, right? _33:37_: I hadn't even thought of that.
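A small numerical sketch of that component-recovery step, assuming the T^{lm}_n e_l ⊗ e_m ⊗ ω̃^n form written above. The dimension (3 here), the component values and the names are all made up purely for illustration.

    import numpy as np

    n = 3                                  # dimension, 3 just for illustration
    rng = np.random.default_rng(0)
    T_comp = rng.normal(size=(n, n, n))    # made-up components T^{lm}_n

    e = np.eye(n)   # basis vectors e_l (row l), written as component arrays
    w = np.eye(n)   # dual basis one-forms w^c, chosen so that w^c(e_l) = delta

    def T(p, q, A):
        """The tensor as a function: two one-forms p, q and one vector A -> real.
        Each outer-product term contributes e_l(p) * e_m(q) * w^c(A)."""
        total = 0.0
        for l in range(n):
            for m in range(n):
                for c in range(n):
                    # e_l acting on a one-form picks out its l-th component, p[l];
                    # w^c acting on the vector A gives its c-th component, w[c] @ A.
                    total += T_comp[l, m, c] * p[l] * q[m] * (w[c] @ A)
        return total

    # Feeding in the basis one-forms w^i, w^j and the basis vector e_k
    # kills every term except l = i, m = j, c = k, leaving T^{ij}_k.
    i, j, k = 0, 1, 2
    print(T(w[i], w[j], e[k]))   # same number as...
    print(T_comp[i, j, k])       # ...the component we started with

With the standard basis used here, the whole triple loop is equivalent to np.einsum('lmc,l,m,c->', T_comp, p, q, A).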
_33:38_: So yes, this n here is a _33:42_: dummy index, as opposed to this n here, _33:44_: which is the dimensionality _33:46_: of the vector space. _33:49_: Sorry, yes, _33:50_: I'll make a note to myself to rewrite that. _33:57_: Questions? _34:06_: A vector space, right, but which has more dimensions, right? So if the... _34:18_: Yes, that's another restriction here on what we... _34:24_: our initial definition _34:26_: of tensors was as functions which _34:28_: take things to other things. _34:30_: We restricted it in various _34:31_: ways on the previous page, _34:33_: but another restriction is that _34:36_: the vectors and one-forms that you _34:42_: put in as arguments here _34:44_: have to have the same dimensionality. _34:47_: That should be in our... _34:51_: vectors in an n-dimensional vector space, _34:54_: the same dimensionality as the tensor. _34:57_: So all of these one-forms and vectors are _34:60_: all in a four-dimensional vector space, _35:02_: the same four-dimensional vector space. _35:10_: They are all vector spaces, _35:13_: and each of the (M,N) _35:16_: sets of tensors _35:19_: corresponds to a vector space, _35:21_: and they all must have the _35:24_: same number of dimensions, _35:27_: otherwise this doesn't work. _35:28_: But that's a good point, _35:29_: which hadn't really occurred to me until now. _35:36_: In our case, _35:39_: we're doing GR here, not maths, _35:41_: so we are primarily interested in _35:44_: the case where the vector space _35:46_: in question is Minkowski _35:48_: space, or the four-dimensional _35:50_: space-time we're interested in. _35:51_: So I think in some parts of this explanation _35:57_: I may have implicitly assumed that _35:59_: n is equal to four all the way _36:01_: through. But that is an important _36:03_: point: it's yet another way in which, _36:04_: from the wide range _36:06_: of possibilities for tensors, we've _36:08_: narrowed things down; we've chopped _36:10_: off possibilities here and _36:11_: there, and that's a further constraint. _36:16_: So if you had a _36:21_: tensor, yes, in three dimensions, _36:26_: how would you...? _36:30_: Once you've chosen bases, right? Yeah. _36:35_: If you have, like, a choice _36:36_: of three basis vectors. _36:41_: Oh, I see, right. Yeah, right. _36:44_: Well, I think here, if _36:48_: I have heard you properly, then _36:55_: the way that you would... _36:60_: so what you're doing here is, this is for, _37:03_: you know, arbitrary i, j, k. _37:06_: So pick an i, j, k; when you _37:09_: apply those three, ω̃^1 and _37:12_: ω̃^2 and e_4, or whatever, when you apply those, _37:15_: then what you get is that particular component of T here. _37:21_: So if you were doing this by hand, _37:23_: as it were (you'd never do this by hand), _37:25_: you would work your way through _37:28_: all of the n dimensions here _37:31_: and get the n times n times n numbers here, _37:34_: and that would allow you to reconstruct _37:36_: that tensor. _37:38_: But if you change your mind about what _37:41_: the basis vectors and basis one-forms are, _37:44_: you have to do the calculation _37:45_: all over again, _37:46_: and you get different numbers here,
_37:52_: numbers that would be right for the _37:53_: different set of basis vectors _37:55_: and one-forms there. _37:60_: I think this might be touching on your question: _38:09_: it illustrates, in a way, or _38:11_: touches on, the way in which _38:14_: the components are basis dependent. _38:17_: You change your mind about the basis, _38:20_: so you change your mind about what arguments you feed in, _38:22_: so you end up getting different numbers here. _38:24_: But _38:27_: the tensor is the same in both cases. _38:29_: And that, in a way, _38:34_: is an answer to the other question, _38:38_: about geometrical objects: _38:40_: how you can say something has an underlying _38:42_: geometrical object associated with it. _38:46_: I think there's a couple of things _38:47_: I mean there. _38:50_: One is, when I say that, _38:54_: I mean this is not a basis-dependent thing. _38:59_: There's a thing here which isn't basis dependent. _39:00_: Clearly that discussion there _39:02_: would be highly basis dependent, _39:03_: the point being that if you change your mind _39:04_: about the basis or the one-forms, you change _39:07_: the components you get out. But _39:10_: what this ends up supporting is a notion _39:15_: of the underlying tensor, _39:17_: the T on the left-hand side there, _39:19_: which isn't basis dependent. _39:23_: So one sense of geometrical _39:27_: is just, in a somewhat negative sense, that by _39:31_: geometrical I mean not basis dependent. _39:34_: It's the same thing; _39:37_: it has a sort of continuity _39:40_: irrespective of the basis you choose. _39:42_: But the other sense in which I'm talking _39:46_: about things being geometrical is that, for _39:49_: all these things, the way _39:53_: that this machinery is built up, _39:56_: there is a fairly natural interpretation _39:58_: in terms of pointy things and _40:01_: planes that we can use to think with. _40:03_: So the idea of a vector as a direction _40:07_: and a size is a geometrical notion. _40:11_: The idea of a one-form as _40:15_: a foliation of the space, which _40:18_: has a direction, basically, and _40:20_: a size, is a geometrical notion. _40:25_: And that matters because where _40:27_: we want to end up _40:30_: is talking _40:32_: about general relativity in an _40:35_: approach where we're focusing _40:36_: on the shape of space-time. _40:39_: And the way we can do that _40:41_: is in a way which... _40:45_: there is a route to that which doesn't involve _40:51_: getting bogged down. _40:53_: The old-fashioned way _40:55_: of introducing general relativity and _40:57_: differential geometry is, you know, to _40:59_: start with components at the beginning. _41:01_: It says a vector is _41:02_: something which transforms like _41:04_: a vector, and I never liked that _41:06_: definition, because it really seems _41:08_: self-referential, _41:09_: in the sense that the _41:13_: idea there is that... _41:21_: let's not go there, because I don't want _41:24_: to start, you know, teaching you that. _41:25_: I think even talking about it _41:27_: is a slightly _41:27_: mad way of approaching this. _41:28_: But that was the old-fashioned _41:30_: way of doing it.
_41:30_: And then in the 70s, with Misner, Thorne _41:33_: and Wheeler in the vanguard of this _41:35_: change to the way differential _41:37_: geometry and general relativity were taught, _41:39_: they are the ones who emphasised _41:41_: the geometry-first approach, _41:43_: talking about things like tangent vectors and _41:48_: tensors defined in this rather abstract way, _41:51_: privileging them and introducing them first, _41:55_: and then from that just discovering _41:57_: components, so that the behaviour of _41:59_: components under transformation, the "transforms like a vector" business, _42:02_: comes out in the wash. _42:02_: If you like, it's a consequence. _42:05_: And there's another way you _42:07_: can talk about these things, _42:09_: called geometric algebra, _42:10_: which is, _42:14_: I don't want to say a fashion, _42:16_: but it's a bit of a hobby in some circles _42:18_: just now, and that really _42:20_: does say geometry above all, and finds a way of talking _42:25_: about how you can _42:28_: define an algebra for directions: _42:30_: how do you combine two directions _42:31_: to give another direction, _42:32_: and so on. And it's very lovely _42:34_: and possibly very powerful. _42:36_: And there are some folk who bang on the _42:38_: table and say this is how GR should be taught, _42:40_: but, you know, there's quite a lot of _42:42_: mathematical knowledge which has to _42:43_: be rewritten and rethought and so on _42:45_: for that to work, _42:46_: and this approach is OK. _42:48_: So if you are feeling _42:50_: underemployed, _42:52_: if you're feeling bored, _42:54_: then Google "geometric algebra" and _42:56_: you'll find some sort of lecture _42:58_: notes and things about that. _43:01_: It's very nice. _43:01_: Do not distract yourself with that, OK? _43:04_: But I always mention it because _43:07_: that's really going all the way _43:09_: along a line which says geometry _43:11_: first. And how do you _43:17_: calculate with geometry, _43:19_: with geometrical objects? _43:21_: Components are one way. _43:21_: The other way is what we do _43:22_: in this case, _43:24_: the way folk do it in this context. _43:31_: So yeah, I suppose in that _43:33_: sense I do mean two things by _43:35_: geometrical. I mean there's _43:37_: this general approach, which is _43:41_: shapes first, directions first. _43:42_: And when I say this is a geometrical _43:45_: object, I mean this is a _43:47_: component-independent object, _43:48_: a basis-independent object. _43:52_: And yes, that's your question, I think. _43:58_: Oh, I see. _44:02_: Can we go over, like, changing basis? _44:08_: Because I understand, like... _44:15_: more than... _44:18_: Alright, OK. Is there a particular part of the _44:23_: notes that you're thinking of there? _44:34_: OK. It's relating to... _44:52_: OK, so, _44:57_: is it the point _44:59_: before or after that statement _45:01_: that we're talking about here? _45:06_: Oh yeah, I think so. _45:20_: Right, so that section of the notes, let's see. _45:24_: Let's go back and see what I say there. _45:29_: Changing basis. OK, so... _45:36_: ah, here are the famous words:
_45:39_: "It is easy to show". Right. _45:49_: Thank you. _45:51_: It's easy to show... _45:55_: I'm going to try and come up _45:58_: with that "easy to show" live. _46:01_: I'm not convinced I can. _46:03_: Well, how do we do this? _46:13_: Well, let's have a go _46:15_: and see what happens. So: _46:21_: we have A^ī = Λ^ī_i A^i, _46:30_: and B^j̄ = Λ^j̄_j B^j. _46:35_: So if we then say T... _46:42_: let's say we write _46:49_: T as a _46:53_: (0,2) tensor, so it _46:55_: takes two vectors. So _47:01_: T_ij will be equal to _47:08_: T(e_i, e_j). Right. _47:13_: OK, and this might work now: if _47:17_: we rewrite that as T of _47:22_: e_ī Λ^ī_i and _47:28_: e_j̄ Λ^j̄_j. _47:34_: What I have done there is simply done this, _47:38_: but for the basis vectors e_i and e_j, _47:42_: so e_i here is e_i expressed in the other basis. _47:46_: We have an expression like this, but _47:49_: we said, oh yes, _47:51_: another restriction on the _47:52_: functions of vectors that tensors _47:54_: are, is that they're linear in each argument. _47:56_: We said that that's crucial. _47:59_: And what that means is we _48:02_: can take these numbers out. _48:04_: So we can write that as _48:06_: Λ^ī_i Λ^j̄_j _48:13_: times T(e_ī, _48:17_: e_j̄), _48:18_: which is Λ^ī_i Λ^j̄_j _48:25_: T_īj̄. _48:30_: Which I think is what was required. _48:35_: So all we've done there is use this _48:39_: general property of expanding a vector _48:42_: in terms of another basis, using _48:47_: the transformation matrix there, _48:50_: and use the linearity _48:54_: to take these numbers out, _48:57_: leaving an expression there which we _48:59_: recognise as a way of relating _49:02_: these components to the ī, j̄ _49:05_: components of T in the barred _49:09_: basis. And I haven't _49:13_: drawn on my memory for _49:15_: any of these things: _49:16_: in each of these cases _49:18_: I simply put in the i's and _49:20_: ī's in the only places they _49:22_: could go that were consistent _49:25_: with the summation convention. _49:30_: The answer to that question... _49:36_: OK, I'd finally better be quick, I'm sorry. _49:46_: Right: because part of the _49:48_: definition of tensors _49:51_: was that they were functions which _49:52_: are linear in each argument. _49:54_: And what does linear mean? _49:56_: Just to recap there: _49:57_: f is linear _50:02_: means that f(ax) is equal to a f(x). _50:11_: So the Λ here, _50:15_: each of the components of that matrix, _50:18_: and that's an n-by-n matrix, _50:21_: each of its components, is a number. _50:23_: So in this thing here, each of the _50:29_: Λ's is a number and _50:31_: therefore can come out. _50:34_: OK, I'll see you, I think, _50:36_: next Wednesday, and I _50:38_: think we're in a different room.
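Purely as a numerical sanity check of that derivation, here is a short sketch. The change-of-basis matrix L (standing in for Λ^ī_i), the component values and the dimension are all made up for illustration.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 4
    L = rng.normal(size=(n, n))    # Lambda^ibar_i, rows indexed by the barred index
    T = rng.normal(size=(n, n))    # components T_ij of a (0,2) tensor, unbarred basis
    A = rng.normal(size=n)         # vector components A^i
    B = rng.normal(size=n)         # vector components B^j

    Linv = np.linalg.inv(L)
    T_bar = Linv.T @ T @ Linv      # components T_ibar_jbar in the barred basis
    A_bar = L @ A                  # A^ibar = Lambda^ibar_i A^i
    B_bar = L @ B                  # B^jbar = Lambda^jbar_j B^j

    # The number T(A, B) must not care which basis the components were written in.
    print(A @ T @ B)                         # A^i T_ij B^j
    print(A_bar @ T_bar @ B_bar)             # A^ibar T_ibar_jbar B^jbar, same value

    # The derived transformation law, T_ij = Lambda^ibar_i Lambda^jbar_j T_ibar_jbar,
    # is just this matrix statement:
    print(np.allclose(T, L.T @ T_bar @ L))   # True

The first two printed numbers agree because the value of the tensor on two vectors is basis independent; the last line is the component transformation law itself, written in matrix form.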