Do Diagonal Matrices Always Commute?
Let $A$ be an $n \times n$ matrix and let $\Lambda$ be an $n \times n$ diagonal matrix. Is it always the case that $A\Lambda = \Lambda A$? If not, when is it the case that $A\Lambda = \Lambda A$?
If we restrict the diagonal entries of $\Lambda$ to be equal (i.e. $\Lambda = \text{diag}(a, a, \dots, a) = aI$), then it is clear that $A\Lambda = A(aI) = a(IA) = \Lambda A$. However, I can't seem to come up with an argument for the general case.
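A quick numerical experiment illustrates the question (a NumPy sketch; the matrices below are arbitrary illustrative choices): a scalar matrix $aI$ commutes with everything, while a diagonal matrix with distinct entries typically does not.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])        # arbitrary test matrix

scalar = 5.0 * np.eye(2)          # Lambda = diag(5, 5) = 5I
general = np.diag([2.0, 3.0])     # Lambda with distinct diagonal entries

# A scalar multiple of the identity commutes with A ...
print(np.allclose(A @ scalar, scalar @ A))    # True

# ... but a diagonal matrix with distinct entries need not.
print(np.allclose(A @ general, general @ A))  # False
```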
linear-algebra matrices
8
Have you tried simple examples?
– Friedrich Philipp, Mar 15 '16 at 1:18
7
When the diagonal matrix is on the right, it scales the columns of the matrix it is multiplying. When the diagonal matrix is on the left, it scales the rows. Since column-scaling and row-scaling are different operations, there are only very limited circumstances under which the matrices will commute.
– Nick Alger, Mar 15 '16 at 1:30
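The row-scaling versus column-scaling observation can be checked directly (a NumPy sketch; the all-ones matrix is an arbitrary choice that makes the scaling visible):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0]])       # all-ones matrix makes the scaling visible
L = np.diag([2.0, 3.0])

print(L @ A)   # scales the ROWS of A:    [[2, 2], [3, 3]]
print(A @ L)   # scales the COLUMNS of A: [[2, 3], [2, 3]]
```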
asked Mar 15 '16 at 1:13
Jeff Scott
6 Answers
If all the diagonal entries of $\Lambda$ are distinct, it commutes only with diagonal matrices.
In contrast, for each run of $k$ consecutive equal diagonal entries in $\Lambda,$ we may allow $A$ to have anything at all in the corresponding $k \times k$ square block with both corners on the main diagonal.
This means that the set of matrices that commute with $\Lambda$ has a minimum dimension $n$ and a maximum dimension $n^2.$ Suppose we have $r$ distinct diagonal entries, and there are $k_i$ copies of diagonal entry $\lambda_i.$ Each $k_i \geq 1,$ and we have
$$ k_1 + k_2 + \cdots + k_r = n. $$
Then by the block construction mentioned above, the dimension of the space of matrices that commute with $\Lambda$ is
$$ k_1^2 + k_2^2 + \cdots + k_r^2. $$
The minimum occurs when $r = n,$ so all $k_i = 1,$ and the dimension is $n.$
The maximum occurs when $r = 1$ and $k_1 = n$: the matrix is a scalar multiple of the identity matrix, and the dimension is $n^2.$
answered Mar 15 '16 at 2:19, edited Mar 15 '16 at 2:44
Will Jagy
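The dimension formula can be checked numerically by computing the nullity of the linear map $A \mapsto A\Lambda - \Lambda A$ on the space of $n \times n$ matrices (a NumPy sketch; the diagonal below is an arbitrary example with multiplicities $k = (2, 1, 3)$):

```python
import numpy as np

lam = np.diag([2.0, 2.0, 5.0, 7.0, 7.0, 7.0])   # k = (2, 1, 3), n = 6
n = lam.shape[0]

# Build the matrix of the linear map A -> A*Lam - Lam*A on the basis matrices E_ij.
M = np.zeros((n * n, n * n))
for i in range(n):
    for j in range(n):
        E = np.zeros((n, n))
        E[i, j] = 1.0
        M[:, i * n + j] = (E @ lam - lam @ E).ravel()

# Dimension of the commutant = nullity of the commutator map.
nullity = n * n - np.linalg.matrix_rank(M)
print(nullity)   # 2^2 + 1^2 + 3^2 = 14
```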
Do the copies have to be right next to each other? E.g. $\text{diag}(1,1,1,2,2,7,8,8)$?
– Hrit Roy, Feb 8 '18 at 5:25
@HritRoy, no. You can change your basis to put the copies next to each other if you want to, but it is not necessary.
– Vladimir Vargas, Apr 27 '18 at 22:22
It is possible that a diagonal matrix $\Lambda$ commutes with a matrix $A$ when $A$ is symmetric and $A\Lambda$ is also symmetric. We have
$$
\Lambda A = (A^{\top}\Lambda^{\top})^{\top} = (A\Lambda)^{\top} = A\Lambda.
$$
The above trivially holds when $A$ and $\Lambda$ are both diagonal.
answered Mar 15 '16 at 1:45, edited Mar 15 '16 at 1:50
user322903
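A sketch of this sufficient condition in NumPy (the particular symmetric $A$ below is an arbitrary choice constructed so that $A\Lambda$ happens to be symmetric):

```python
import numpy as np

L = np.diag([2.0, 2.0, 5.0])
A = np.array([[1.0, 3.0, 0.0],
              [3.0, 4.0, 0.0],
              [0.0, 0.0, 7.0]])   # symmetric, blocks aligned with L's repeated entries

assert np.allclose(A, A.T)            # A is symmetric
assert np.allclose(A @ L, (A @ L).T)  # A*L is symmetric too
print(np.allclose(A @ L, L @ A))      # True: the two conditions force commutation
```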
A diagonal matrix will not commute with every matrix.
$$
\begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}=\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$$
But:
$$\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}\begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix} = \begin{pmatrix} 0 & 2 \\ 0 & 0 \end{pmatrix}.$$
answered Mar 15 '16 at 1:25, edited Mar 15 '16 at 2:49
Gregory Simon
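The same counterexample, verified numerically (NumPy sketch):

```python
import numpy as np

D = np.diag([1.0, 2.0])
N = np.array([[0.0, 1.0],
              [0.0, 0.0]])

print(D @ N)   # [[0, 1], [0, 0]]
print(N @ D)   # [[0, 2], [0, 0]]  -- not equal, so D and N do not commute
```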
The claim is false in general. Take $A = \begin{bmatrix}1 & 2\\3 & 4\end{bmatrix}$ and
$\Lambda = \begin{bmatrix}2 & 0\\0 & 3\end{bmatrix}$. Then
$A\Lambda = \begin{bmatrix}2 & 6\\6 & 12\end{bmatrix}$ but
$\Lambda A = \begin{bmatrix}2 & 4\\9 & 12\end{bmatrix}.$
On a more useful note, you can look up commuting matrices on Wikipedia.
answered Mar 15 '16 at 1:31
cpiegore
We want to find $\Lambda$ such that $A\Lambda=\Lambda A$. Then we have
$$\left( \begin{array}{cccc}
\sum a_{1n}\lambda_{n1} & \sum a_{1n}\lambda_{n2} & \sum a_{1n}\lambda_{n3} & \cdots \\
\sum a_{2n}\lambda_{n1} & \sum a_{2n}\lambda_{n2} & \sum a_{2n}\lambda_{n3} & \cdots \\
\sum a_{3n}\lambda_{n1} & \sum a_{3n}\lambda_{n2} & \sum a_{3n}\lambda_{n3} & \cdots \\
\vdots & \vdots & \vdots & \ddots \end{array} \right)=\left( \begin{array}{cccc}
\sum a_{n1}\lambda_{1n} & \sum a_{n1}\lambda_{2n} & \sum a_{n1}\lambda_{3n} & \cdots \\
\sum a_{n2}\lambda_{1n} & \sum a_{n2}\lambda_{2n} & \sum a_{n2}\lambda_{3n} & \cdots \\
\sum a_{n3}\lambda_{1n} & \sum a_{n3}\lambda_{2n} & \sum a_{n3}\lambda_{3n} & \cdots \\
\vdots & \vdots & \vdots & \ddots \end{array} \right)$$
Hence for the equality to hold, both $A$ and $\Lambda$ must be symmetric, since $a_{in}=a_{ni}$ and $\lambda_{in}=\lambda_{ni}$ for $i=1,2,3,\dots$ A diagonal matrix is a special case of this.
answered Dec 16 '17 at 16:06
TheSimpliFire
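For a diagonal $\Lambda$ the entrywise comparison actually reduces to $a_{ij}\lambda_j = \lambda_i a_{ij}$, i.e. $a_{ij}(\lambda_j - \lambda_i) = 0$ for all $i, j$; symmetry of $A$ is not required. A NumPy sketch with an arbitrary non-symmetric $A$ whose nonzero entries sit only where $\lambda_i = \lambda_j$:

```python
import numpy as np

lam = np.array([2.0, 2.0, 5.0])
L = np.diag(lam)

# Non-symmetric A with nonzero entries only where lam_i == lam_j:
A = np.array([[1.0, 9.0, 0.0],
              [4.0, 3.0, 0.0],
              [0.0, 0.0, 6.0]])

cond = A * (lam[np.newaxis, :] - lam[:, np.newaxis])  # a_ij * (lam_j - lam_i)
print(np.allclose(cond, 0))       # True: the entrywise condition holds
print(np.allclose(A @ L, L @ A))  # True, although A is not symmetric
```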
Hello, what is the name of this theorem? Or where is it in a book? I want to reference it. Thanks a lot.
– Vladimir Vargas, Apr 27 '18 at 22:09
I am not aware of a theorem whose result is this, since it is just derived from the definition.
– TheSimpliFire, Apr 28 '18 at 8:20
"Both $A$ and $\Lambda$ must be symmetric" is not true as stated (it's not a necessary condition); also, $\Lambda$ is symmetric already (being diagonal).
– darij grinberg, Jan 12 at 22:45
The answer from @AOK is not true in general. For a diagonal matrix $\Lambda$ it is valid only when $A$ is a symmetric matrix.
answered Jan 12 at 22:22
BigFOX I