Standard matrix of an orthogonal projection linear transformation
I am trying to solve part (b) of question 1 in:
I'm not exactly sure how I should approach part (b). My attempt so far has been to plug the elementary basis vectors $e_1, e_2, e_3$ into the given spanning vectors, to see what the linear transformation does to $e_1, e_2, e_3$, and then to form the standard matrix from there, but this hasn't worked so far. Any hints?
linear-algebra
Try to think about how you would (orthogonally) project a vector $x$ onto the span of $e_1$ and $e_2$. Once you've done this, see if you can write it as a matrix operation.
– tch, Jan 18 at 21:44
Do you mean project $e_1$ and $e_2$ with the span vectors? i.e. $((1,1,0)\cdot e_1)\,e_1$ and so forth with $e_2$?
– lohboys, Jan 18 at 21:48
Basically. I assume by $(1,1,0)\cdot e_1$ you mean the inner product?
– tch, Jan 18 at 22:08
Yes, the inner product. So by computing what I suggested, I should be able to figure out the standard matrix for the linear transformation?
– lohboys, Jan 18 at 22:11
Sorry, instead of $e_1$ and $e_2$ I should have said $v_1$ and $v_2$, your orthonormal basis vectors for the span of $V$.
– tch, Jan 18 at 22:19
asked Jan 18 at 21:42 by lohboys; edited Jan 18 at 22:10 by idriskameni
2 Answers
Here is a way. Note that for $y \in \mathbb{R}^3$ the projected vector $Ay$ is such that $y - Ay$ is orthogonal to $V$. In particular, if we form a matrix $B$ whose columns are $(1,1,0)^T$ and $(0,0,1)^T$ respectively, we get that
$$
B'(y - Ay) = 0.
$$
But $Ay = Bc$ for some $c$, whence
$$
B'(y - Bc) = 0 \iff B'y = B'Bc.
$$
Since $B$ has full column rank, $B'B$ is invertible, whence
$$
c = (B'B)^{-1}B'y \implies Ay = Bc = B(B'B)^{-1}B'y.
$$
So $A = B(B'B)^{-1}B'$.
answered Jan 18 at 22:15 – Foobaz John
This is quite a complex explanation for me; what exactly is $B'$ in this instance?
– lohboys, Jan 18 at 22:30
$B'$ denotes the transpose of the matrix $B$.
– Foobaz John, Jan 18 at 22:48
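As a quick sanity check, here is this formula worked out for the spanning vectors above, assuming $V = \operatorname{span}\{(1,1,0)^T, (0,0,1)^T\}$ as in the answer. With
$$
B = \begin{bmatrix} 1 & 0 \\ 1 & 0 \\ 0 & 1 \end{bmatrix},
\qquad
B'B = \begin{bmatrix} 2 & 0 \\ 0 & 1 \end{bmatrix},
\qquad
(B'B)^{-1} = \begin{bmatrix} \tfrac12 & 0 \\ 0 & 1 \end{bmatrix},
$$
the projection matrix comes out to
$$
A = B(B'B)^{-1}B' = \begin{bmatrix} \tfrac12 & \tfrac12 & 0 \\ \tfrac12 & \tfrac12 & 0 \\ 0 & 0 & 1 \end{bmatrix}.
$$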
To expand on my comments, since they were getting long.
Suppose we have a pair of orthonormal vectors $v_1$ and $v_2$. If we want to compute the orthogonal projection of $x$ onto the span of $v_1$ and $v_2$, we would first project $x$ onto $v_1$:
$$
P_1 x = (v_1^T x)\, v_1
$$
and then onto $v_2$:
$$
P_2 x = (v_2^T x)\, v_2.
$$
Therefore,
$$
Px = (v_1^T x)\, v_1 + (v_2^T x)\, v_2.
$$
We can write this as the sum of rank-1 outer products:
$$
Px = v_1 v_1^T x + v_2 v_2^T x = (v_1 v_1^T + v_2 v_2^T)\, x.
$$
Therefore,
$$
P = VV^T = \begin{bmatrix} v_1 & v_2 \end{bmatrix} \begin{bmatrix} v_1^T \\ v_2^T \end{bmatrix}.
$$
If $v_1$ and $v_2$ are not orthonormal, you can first orthonormalize them (using Gram–Schmidt) and then do this. Alternatively, you can use the formula Foobaz John gave.
answered Jan 19 at 0:18 – tch
If I'm interpreting this correctly, $P$ gives the standard matrix of this linear transformation, yes?
– lohboys, Jan 19 at 2:46
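As a quick check against the other answer, assume the same subspace $V = \operatorname{span}\{(1,1,0)^T, (0,0,1)^T\}$. Those two spanning vectors are already orthogonal to each other, so normalizing gives
$$
v_1 = \tfrac{1}{\sqrt{2}}(1,1,0)^T, \qquad v_2 = (0,0,1)^T,
$$
and the outer-product formula yields
$$
P = v_1 v_1^T + v_2 v_2^T =
\begin{bmatrix} \tfrac12 & \tfrac12 & 0 \\ \tfrac12 & \tfrac12 & 0 \\ 0 & 0 & 1 \end{bmatrix},
$$
the same matrix as $A = B(B'B)^{-1}B'$ from the other answer.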