How to solve $AX=XB$ for the matrix $X$?












I have two symmetric $3\times 3$ matrices $A, B$. I am interested in solving the system
$$AX = XB.$$
Is there a standard way this is done? The matrices are not necessarily nonsingular.


















  • $X=0$ is a solution. – Michael, Dec 6 '18 at 6:53

  • @Michael Sorry, meant non-trivial, if any. – AspiringMat, Dec 6 '18 at 7:03

  • This is a special case of the Sylvester equation. – Tianlalu, Dec 6 '18 at 7:08

  • You can just write down the 9 linear equations you get and solve that homogeneous system of linear equations as usual. – Christoph, Dec 6 '18 at 11:16

  • Are your matrices in $\mathcal{M}_n(\mathbb{R})$ or $\mathcal{M}_n(\mathbb{C})$? – Bill O'Haran, Dec 7 '18 at 9:12
linear-algebra systems-of-equations matrix-equations symmetric-matrices sylvester-equation






edited Dec 6 '18 at 11:10 by user593746 · asked Dec 6 '18 at 6:50 by AspiringMat

3 Answers
You can use vectorization and Kronecker products.

Assume we have the equation
$$\mathbf{A}\mathbf{X}\mathbf{B}=\mathbf{C}.$$
Vectorizing both sides gives
$$(\mathbf{B}^T \otimes \mathbf{A})\operatorname{vec}(\mathbf{X}) = \operatorname{vec}(\mathbf{A}\mathbf{X}\mathbf{B}) = \operatorname{vec}(\mathbf{C}).$$

Now how can we modify it to get what we want?
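One way to answer that question (a sketch of mine, not part of the answer): write $AX - XB = AXI - IXB = 0$ and apply the vectorization identity to each term, giving the homogeneous system $(I \otimes A - B^T \otimes I)\operatorname{vec}(X) = 0$, whose null space can be read off numerically. The function name and tolerance below are my own choices:

```python
import numpy as np

def solve_AX_XB(A, B, tol=1e-10):
    """Return a basis for {X : AX = XB} via vectorization.

    With column-major vec, vec(AX) = (I kron A) vec(X) and
    vec(XB) = (B^T kron I) vec(X), so AX = XB becomes the homogeneous
    system (I kron A - B^T kron I) vec(X) = 0; its null space is
    extracted from the SVD.
    """
    n = A.shape[0]
    M = np.kron(np.eye(n), A) - np.kron(B.T, np.eye(n))
    _, s, Vt = np.linalg.svd(M)
    # Rows of Vt whose singular values (numerically) vanish span the null space.
    basis = Vt[s < tol]
    # Undo the column-major vectorization to recover each n x n solution.
    return [v.reshape(n, n, order="F") for v in basis]

# B is similar to A, so nontrivial solutions (e.g. the similarity P) exist.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
P = np.array([[1.0, 2.0], [0.0, 1.0]])
B = np.linalg.inv(P) @ A @ P
solutions = solve_AX_XB(A, B)
for X in solutions:
    assert np.allclose(A @ X, X @ B)
```

Here $A$ has two distinct eigenvalues shared with $B$, so the solution space is two-dimensional.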






edited Dec 31 '18 at 21:30 · answered Dec 6 '18 at 10:35 by mathreadler
  • This is the right way to solve this question. The problem is that few people have been introduced to Kronecker products, or are aware that this is a particular case of Sylvester's equation (en.wikipedia.org/wiki/Sylvester_equation). – Jean Marie, Dec 31 '18 at 20:01



















This is a partial answer, but for a generalized setting where $A,B\in M_n(\Bbb{K})$ for some field $\Bbb{K}$. Suppose that $A$ and $B$ are diagonalizable over $\Bbb K$. Let $a_1,a_2,\ldots,a_n$ be a complete set of eigenvectors of $A$ with eigenvalues $\lambda_1,\lambda_2,\ldots,\lambda_n$, and $b_1,b_2,\ldots,b_n$ a complete set of eigenvectors of $B^t$ with eigenvalues $\mu_1,\mu_2,\ldots,\mu_n$. Then the matrices $C_{i,j}=a_ib_j^t$ form a basis of $M_n(\Bbb{K})$.

Consider the linear map $T:M_n(\Bbb{K})\to M_n(\Bbb{K})$ given by $T(X)=AX-XB$. If $X=\sum_{i,j}x_{i,j}C_{i,j}$, then
$$T(X)=\sum_{i,j}x_{i,j}(\lambda_i-\mu_j)C_{i,j}.$$
In particular, $T(X)=0$ if and only if $x_{i,j}=0$ whenever $\lambda_i\neq \mu_j$. Therefore,
$$\ker T=\operatorname{span}\big\{C_{i,j}:\lambda_i=\mu_j\big\}.$$

In your case, the matrices $A$ and $B$ are real symmetric (you didn't specify the field, so I guess the field is the reals), and so they are diagonalizable. Therefore, you can use my approach.

However, if the field is $\Bbb{C}$, then a symmetric matrix may not be diagonalizable. I do not know the answer if $A$ or $B$ is not diagonalizable, but the wiki link given by Tianlalu may help.

After I made some calculations, I found the following result. Assume that the characteristic polynomials of $A$ and $B$ split into linear factors over $\Bbb{K}$. Suppose that $A$ has $k$ Jordan blocks of sizes $r_1,r_2,\ldots,r_k$ with eigenvalues $\lambda_1,\lambda_2,\ldots,\lambda_k$, and that $B$ has $l$ Jordan blocks of sizes $s_1,s_2,\ldots,s_l$ with eigenvalues $\mu_1,\mu_2,\ldots,\mu_l$. Then we have the following proposition.

Proposition: Let $T:M_n(\Bbb{K})\to M_n(\Bbb{K})$ be defined by $T(X)=AX-XB$, where $A$ and $B$ are fixed elements of $M_n(\Bbb{K})$. Then the linear map $T$ is diagonalizable over $\Bbb K$ if and only if both $A$ and $B$ are diagonalizable over $\Bbb{K}$.

Write $(a_i^1,a_i^2,\ldots,a_i^{r_i})$ for a generalized eigenvector sequence of $A$ in the $i$th block, that is,
$$Aa_i^p=\lambda_ia_i^p+a_i^{p+1}$$
for $p=1,2,\ldots,r_i$, with $a_i^{r_i+1}=0$. Similarly, write $(b_j^1,b_j^2,\ldots,b_j^{s_j})$ for a generalized eigenvector sequence of $B$ in the $j$th block, that is,
$$Bb_j^q=\mu_jb_j^q+b_j^{q+1}$$
for $q=1,2,\ldots,s_j$, with $b_j^{s_j+1}=0$. Then, for fixed $i=1,2,\ldots,k$ and $j=1,2,\ldots,l$, the span of the $C_{i,j}^{p,q}=a_i^p(b_j^q)^t$ with $p$ and $q$ ranging over $\{1,2,\ldots,r_i\}$ and $\{1,2,\ldots,s_j\}$ is a direct sum of generalized eigenspaces of $T$ with eigenvalue $\lambda_i-\mu_j$.

This is because
$$\big(T-(\lambda_i-\mu_j)\big)^hC_{i,j}^{p,q}=\sum_{r=0}^h(-1)^r\binom{h}{r}C_{i,j}^{p+h-r,q+r},$$
where $C_{i,j}^{p,q}=0$ if $p>r_i$ or $q>s_j$. Therefore, if $h>r_i-p+s_j-q$, then $$\big(T-(\lambda_i-\mu_j)\big)^hC_{i,j}^{p,q}=0.$$ In characteristic $0$, there are $m_{i,j}=\min\{r_i,s_j\}$ generalized eigenspaces, and their dimensions are
$$r_i+s_j-1,\ r_i+s_j-3,\ r_i+s_j-5,\ \ldots,\ r_i+s_j+1-2m_{i,j}.$$

This shows that
$$\operatorname{span}\big\{C^{r_i,s_j}_{i,j}:\lambda_i=\mu_j\big\}\subseteq\ker T\subseteq \operatorname{span}\big\{C_{i,j}^{p,q}:\lambda_i=\mu_j\big\}.$$
For $X\in M_n(\Bbb{K})$, we write $X=\sum_{i,j,p,q}x_{i,j}^{p,q}C_{i,j}^{p,q}$. If $X\in \ker T$, then $x_{i,j}^{p,q}=0$ whenever $\lambda_i\neq \mu_j$, and so
$$T(X)=\sum_{i,j,p,q}x_{i,j}^{p,q}(C_{i,j}^{p+1,q}-C_{i,j}^{p,q+1})=\sum_{i,j,p,q}(x_{i,j}^{p-1,q}-x_{i,j}^{p,q-1})C_{i,j}^{p,q},$$
with $x_{i,j}^{0,q}=x_{i,j}^{p,0}=0$. This is as far as I can go, but it tells me that, unless both $A$ and $B$ are diagonalizable, we have
$$\operatorname{span}\big\{C^{r_i,s_j}_{i,j}:\lambda_i=\mu_j\big\}\subsetneq \ker T\subsetneq \operatorname{span}\big\{C_{i,j}^{p,q}:\lambda_i=\mu_j\big\}.$$
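For the diagonalizable case, the kernel basis $\{C_{i,j} : \lambda_i = \mu_j\}$ can be built directly from eigenvectors. A minimal numerical sketch over $\Bbb R$ (the function name and tolerance are my own, using NumPy's `eig`):

```python
import numpy as np

def kernel_basis(A, B, tol=1e-8):
    """Span of {C_ij = a_i b_j^T : lam_i == mu_j}, assuming A and B
    are diagonalizable: a_i are eigenvectors of A, b_j of B^T."""
    lam, a = np.linalg.eig(A)    # columns of a: eigenvectors of A
    mu, b = np.linalg.eig(B.T)   # columns of b: eigenvectors of B^T
    return [np.outer(a[:, i], b[:, j])
            for i in range(len(lam))
            for j in range(len(mu))
            if abs(lam[i] - mu[j]) < tol]

A = np.diag([1.0, 2.0, 3.0])
B = np.diag([2.0, 5.0, 7.0])   # shares only the eigenvalue 2 with A
basis = kernel_basis(A, B)
for C in basis:
    assert np.allclose(A @ C, C @ B)
```

Since $A$ and $B$ share exactly one eigenvalue, the kernel here is one-dimensional.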






As stated in the comment from Christoph, here is a direct approach to solving $AX=XB$, given that both $A$ and $B$ are square matrices of order $k$.

Proposition. Let $A=(a_{ij})_{k\times k}$, $B=(b_{ij})_{k\times k}$, and let the unknown be $X=(x_{ij})_{k\times k}$. We also let
$$Y=(x_{11},x_{21},\cdots,x_{k1},x_{12},x_{22},\cdots,x_{k2},\cdots,x_{1k},x_{2k},\cdots,x_{kk})^{T}$$
be the column vector of dimension $k^2$. Then the solutions of $AX=XB$ are given by the solutions of the homogeneous equation $QY=0$, where $Q$ is the block matrix
$$Q=\begin{bmatrix}
Q_{11}&Q_{12}&\cdots&Q_{1k}\\
Q_{21}&Q_{22}&\cdots&Q_{2k}\\
\vdots&\vdots&\ddots&\vdots\\
Q_{k1}&Q_{k2}&\cdots&Q_{kk}\\
\end{bmatrix}$$
with $Q_{ii}=A-b_{ii}I$ (for $i=1,2,\dots,k$, where $I$ is the identity matrix of order $k$) and $Q_{ij}=-b_{ji}I$ (for $i\neq j$).

The proof follows by direct multiplication of matrices: we obtain the system of linear equations in the entries of $X$ and write it in the form $QY=0$. Since the homogeneous equation $QY=0$ always has solutions, we get all the solutions of $AX=XB$.

Example. Solve $AX=XB$, where $A=\begin{bmatrix}-4&1\\-9&2\end{bmatrix}$ and $B=\begin{bmatrix}-2&1\\-1&0\end{bmatrix}$.

From the proposition, we have
$$Q=\begin{bmatrix}
-2&1&1&0\\
-9&4&0&1\\
-1&0&-4&1\\
0&-1&-9&2\\
\end{bmatrix}\xrightarrow{\text{row operations}}
\begin{bmatrix}
1&0&4&-1\\
0&1&9&-2\\
0&0&0&0\\
0&0&0&0\\
\end{bmatrix}=P.$$
Since $PY=0$ has the same solutions as $QY=0$, we get
$$Y=(-4u+v,-9u+2v,u,v)^T,\qquad \forall u,v\in\Bbb C.$$
Therefore,
$$X=\begin{bmatrix}
-4u+v&u\\
-9u+2v&v\\
\end{bmatrix}=
u\begin{bmatrix}-4&1\\-9&0\end{bmatrix}+v\begin{bmatrix}1&0\\2&1\end{bmatrix},\qquad \forall u,v\in\Bbb C.$$

From the example we can see that the steps are:

• express $Q$ in terms of $A$ and $B$, and apply row operations to obtain $P$;

• solve $PY=0$ for $Y$;

• write down the entries of $X$ (corresponding to the entries of $Y$).
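As a quick sanity check (mine, not part of the answer), the general solution found in the example can be verified numerically for a few choices of $u, v$:

```python
import numpy as np

# Matrices from the worked example above.
A = np.array([[-4.0, 1.0], [-9.0, 2.0]])
B = np.array([[-2.0, 1.0], [-1.0, 0.0]])
# The two basis solutions read off from X = u*X1 + v*X2.
X1 = np.array([[-4.0, 1.0], [-9.0, 0.0]])
X2 = np.array([[1.0, 0.0], [2.0, 1.0]])

for u, v in [(1.0, 0.0), (0.0, 1.0), (2.5, -3.0)]:
    X = u * X1 + v * X2
    assert np.allclose(A @ X, X @ B)   # every such X solves AX = XB
```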







      3 Answers
      3






      active

      oldest

      votes








      3 Answers
      3






      active

      oldest

      votes









      active

      oldest

      votes






      active

      oldest

      votes









      2












      $begingroup$

      You can use vectorization and Kronecker products.



      Assume we have the equation
      $$bf AXB=C$$



      $$({bf B}^T otimes {bf A})text{vec}({bf X}) = text{vec}({bf AXB}) = text{vec}({bf C})$$



      Now how can we modify it to get what we want?






      share|cite|improve this answer











      $endgroup$













      • $begingroup$
        This is the right way to solve this question. The problem is that few people have been introduced to Kronecker products, and are aware that this is a particular case of Sylvester's equation (en.wikipedia.org/wiki/Sylvester_equation)
        $endgroup$
        – Jean Marie
        Dec 31 '18 at 20:01
















      2












      $begingroup$

      You can use vectorization and Kronecker products.



      Assume we have the equation
      $$bf AXB=C$$



      $$({bf B}^T otimes {bf A})text{vec}({bf X}) = text{vec}({bf AXB}) = text{vec}({bf C})$$



      Now how can we modify it to get what we want?






      share|cite|improve this answer











      $endgroup$













      • $begingroup$
        This is the right way to solve this question. The problem is that few people have been introduced to Kronecker products, and are aware that this is a particular case of Sylvester's equation (en.wikipedia.org/wiki/Sylvester_equation)
        $endgroup$
        – Jean Marie
        Dec 31 '18 at 20:01














      2












      2








      2





      $begingroup$

      You can use vectorization and Kronecker products.



      Assume we have the equation
      $$bf AXB=C$$



      $$({bf B}^T otimes {bf A})text{vec}({bf X}) = text{vec}({bf AXB}) = text{vec}({bf C})$$



      Now how can we modify it to get what we want?






      share|cite|improve this answer











      $endgroup$



      You can use vectorization and Kronecker products.



      Assume we have the equation
      $$bf AXB=C$$



      $$({bf B}^T otimes {bf A})text{vec}({bf X}) = text{vec}({bf AXB}) = text{vec}({bf C})$$



      Now how can we modify it to get what we want?







      share|cite|improve this answer














      share|cite|improve this answer



      share|cite|improve this answer








      edited Dec 31 '18 at 21:30

























      answered Dec 6 '18 at 10:35









      mathreadlermathreadler

      14.8k72160




      14.8k72160












      • $begingroup$
        This is the right way to solve this question. The problem is that few people have been introduced to Kronecker products, and are aware that this is a particular case of Sylvester's equation (en.wikipedia.org/wiki/Sylvester_equation)
        $endgroup$
        – Jean Marie
        Dec 31 '18 at 20:01


















      • $begingroup$
        This is the right way to solve this question. The problem is that few people have been introduced to Kronecker products, and are aware that this is a particular case of Sylvester's equation (en.wikipedia.org/wiki/Sylvester_equation)
        $endgroup$
        – Jean Marie
        Dec 31 '18 at 20:01
















      $begingroup$
      This is the right way to solve this question. The problem is that few people have been introduced to Kronecker products, and are aware that this is a particular case of Sylvester's equation (en.wikipedia.org/wiki/Sylvester_equation)
      $endgroup$
      – Jean Marie
      Dec 31 '18 at 20:01




      $begingroup$
      This is the right way to solve this question. The problem is that few people have been introduced to Kronecker products, and are aware that this is a particular case of Sylvester's equation (en.wikipedia.org/wiki/Sylvester_equation)
      $endgroup$
      – Jean Marie
      Dec 31 '18 at 20:01











      1












      $begingroup$

      This is a partial answer, but for a generalized setting where $A,Bin M_n(Bbb{K})$ for some field $Bbb{K}$. Suppose that $A$ and $B$ are diagonalizable over $Bbb K$. Let $a_1,a_2,ldots,a_n$ be a complete set of eigenvectors of $A$ with eigenvalues $lambda_1,lambda_2,ldots,lambda_n$, and $b_1,b_2,ldots,b_n$ a complete set of eigenvectors of $B^t$ with eigenvealues $mu_1,mu_2,ldots,mu_n$. Then, the matrices $C_{i,j}=a_ib_j^t$ form a basis of $M_n(Bbb{K})$.



      Consider the linear map $T:M_n(Bbb{K})to M_n(Bbb{K})$ given by $T(X)=AX-XB$. If $X=sum_{i,j}x_{i,j}C_{i,j}$, then
      $$T(X)=sum_{i,j}x_{i,j}(lambda_i-mu_j)C_{i,j}.$$
      In particular, $T(X)=0$ if and only if $x_{i,j}=0$ whenever $lambda_ineq mu_j$. Therefore,
      $$ker T=operatorname{span}big{C_{i,j}:lambda_i=mu_jbig}.$$



      In your case, the matrices $A$ and $B$ are real symmetric (you didn't specify the field, so I guess the field is the reals), and so they are diagonalizable. Therefore, you can use my approach.



      However, if the field is $Bbb{C}$, then a symmetric matrix may not be diagonalizable. I do not know the answer if $A$ or $B$ is not diagonalizable. But the wiki link given by Tianlalu may help.





      After I made some calculations, I found this result. Assume that the characteristic polynomials of $A$ and $B$ split into linear factors over $Bbb{K}$. Suppose that $A$ has $k$ Jordan blocks of sizes $r_1,r_2,ldots,r_k$ with eigenvalues $lambda_1,lambda_2,ldots,lambda_k$, and suppose that $B$ has $l$ Jordan blocks of sizes $s_1,s_2,ldots,s_l$ with eigenvalues $mu_1,mu_2,ldots,mu_l$. That is, we have the following proposition.




      Proposition: Let $T:M_n(Bbb{K})to M_n(Bbb{K})$ be defined by $T(X)=AX-XB$, where $A$ and $B$ are fixed elements of $M_n(Bbb{K})$. Then, the linear map $T$ is diagonalizable over $Bbb K$ if and only if both $A$ and $B$ are diagonalizable over $Bbb{K}$.




      Write $(a_i^1,a_i^2,ldots,a_i^{r_i})$ for a generalized eigenvector sequence of $A$ in the $i$th block, that is,
      $$Aa_i^p=lambda_ia_i^p+a_i^{p+1}$$
      for $p=1,2,ldots,r_i$, with $a_i^{r_i+1}=0$. Similarly, write $(b_j^1,b_j^2,ldots,b_j^{s_j})$ for a generalized eigenvector sequence of $B$ in the $j$th block, that is,
      $$Bb_j^q=mu_jb_j^q+b_j^{q+1}$$
      for $q=1,2,ldots,s_j$, with $b_j^{s_j+1}=0$. Then, for fixed $i=1,2,ldots,k$ and $j=1,2,ldots,l$, the span of $C_{i,j}^{p,q}=a_i^p(b_j^q)^t$ with $p$ and $q$ ranging from ${1,2,ldots,r_i}$ and ${1,2,ldots,s_j}$ is a direct sum of generalized eigenspaces of $T$ with eigenvalue $lambda_i-mu_j$.

      This is because
      $$big(T-(lambda_i-mu_j)big)^hC_{i,j}^{p,q}=sum_{r=0}^h(-1)^rbinom{h}{r}C_{i,j}^{p+h-r,q+r},$$
      where $C_{i,j}^{p,q}=0$ if $p>r_i$ or $q>s_j$. Therefore, if $h>r_i-p+s_j-q$, then $$big(T-(lambda_i-mu_j)big)^hC_{i,j}^{p,q}=0.$$ In characteristic $0$, there are $m_{i,j}=min{r_i,s_j}$ generalized eigenspaces and the dimensions of the generalized eigenspaces are
      $$r_i+s_j-1,r_i+s_j-3,r_i+s_j-5,ldots,r_i+s_j+1-2m_{i,j}.$$



      This shows that
      $$operatorname{span}{C^{r_i,s_j}_{i,j}:lambda_i=mu_jbig}subseteqker Tsubseteq operatorname{span}{C_{i,j}^{p,q}:lambda_i=mu_jbig}.$$
      For $Xin M_n(Bbb{K})$, we write $X=sum_{i,j,p,q}x_{i,j}^{p,q}C_{i,j}$. If $Xin ker T$, then $x_{i,j}^{p,q}=0$ whenever $lambda_ineq mu_j$, and so
      $$T(X)=sum_{i,j,p,q}x_{i,j}^{p,q}(C_{i,j}^{p+1,q}-C_{i,j}^{p,q+1})=sum_{i,j}^{p,q}(x_{i,j}^{p-1,q}-x_{i,j}^{p,q-1})C_{i,j}^{p,q},$$
      with $x_{i,j}^{0,q}=x_{i,j}^{p,0}=0$. This is as far as I can go, but it tells me that, unless both $A$ and $B$ are diagonalizable, we have
      $$operatorname{span}{C^{r_i,s_j}_{i,j}:lambda_i=mu_jbig}subsetneq ker Tsubsetneq operatorname{span}{C_{i,j}^{p,q}:lambda_i=mu_jbig}.$$






      share|cite|improve this answer











      $endgroup$


















        1












        $begingroup$

        This is a partial answer, but for a generalized setting where $A,Bin M_n(Bbb{K})$ for some field $Bbb{K}$. Suppose that $A$ and $B$ are diagonalizable over $Bbb K$. Let $a_1,a_2,ldots,a_n$ be a complete set of eigenvectors of $A$ with eigenvalues $lambda_1,lambda_2,ldots,lambda_n$, and $b_1,b_2,ldots,b_n$ a complete set of eigenvectors of $B^t$ with eigenvealues $mu_1,mu_2,ldots,mu_n$. Then, the matrices $C_{i,j}=a_ib_j^t$ form a basis of $M_n(Bbb{K})$.



        Consider the linear map $T:M_n(Bbb{K})to M_n(Bbb{K})$ given by $T(X)=AX-XB$. If $X=sum_{i,j}x_{i,j}C_{i,j}$, then
        $$T(X)=sum_{i,j}x_{i,j}(lambda_i-mu_j)C_{i,j}.$$
        In particular, $T(X)=0$ if and only if $x_{i,j}=0$ whenever $lambda_ineq mu_j$. Therefore,
        $$ker T=operatorname{span}big{C_{i,j}:lambda_i=mu_jbig}.$$



        In your case, the matrices $A$ and $B$ are real symmetric (you didn't specify the field, so I guess the field is the reals), and so they are diagonalizable. Therefore, you can use my approach.



        However, if the field is $Bbb{C}$, then a symmetric matrix may not be diagonalizable. I do not know the answer if $A$ or $B$ is not diagonalizable. But the wiki link given by Tianlalu may help.





        After I made some calculations, I found this result. Assume that the characteristic polynomials of $A$ and $B$ split into linear factors over $Bbb{K}$. Suppose that $A$ has $k$ Jordan blocks of sizes $r_1,r_2,ldots,r_k$ with eigenvalues $lambda_1,lambda_2,ldots,lambda_k$, and suppose that $B$ has $l$ Jordan blocks of sizes $s_1,s_2,ldots,s_l$ with eigenvalues $mu_1,mu_2,ldots,mu_l$. That is, we have the following proposition.




        Proposition: Let $T:M_n(Bbb{K})to M_n(Bbb{K})$ be defined by $T(X)=AX-XB$, where $A$ and $B$ are fixed elements of $M_n(Bbb{K})$. Then, the linear map $T$ is diagonalizable over $Bbb K$ if and only if both $A$ and $B$ are diagonalizable over $Bbb{K}$.




        Write $(a_i^1,a_i^2,ldots,a_i^{r_i})$ for a generalized eigenvector sequence of $A$ in the $i$th block, that is,
        $$Aa_i^p=lambda_ia_i^p+a_i^{p+1}$$
        for $p=1,2,ldots,r_i$, with $a_i^{r_i+1}=0$. Similarly, write $(b_j^1,b_j^2,ldots,b_j^{s_j})$ for a generalized eigenvector sequence of $B$ in the $j$th block, that is,
        $$Bb_j^q=mu_jb_j^q+b_j^{q+1}$$
        for $q=1,2,ldots,s_j$, with $b_j^{s_j+1}=0$. Then, for fixed $i=1,2,ldots,k$ and $j=1,2,ldots,l$, the span of $C_{i,j}^{p,q}=a_i^p(b_j^q)^t$ with $p$ and $q$ ranging from ${1,2,ldots,r_i}$ and ${1,2,ldots,s_j}$ is a direct sum of generalized eigenspaces of $T$ with eigenvalue $lambda_i-mu_j$.

        This is because
        $$big(T-(lambda_i-mu_j)big)^hC_{i,j}^{p,q}=sum_{r=0}^h(-1)^rbinom{h}{r}C_{i,j}^{p+h-r,q+r},$$
        where $C_{i,j}^{p,q}=0$ if $p>r_i$ or $q>s_j$. Therefore, if $h>r_i-p+s_j-q$, then $$big(T-(lambda_i-mu_j)big)^hC_{i,j}^{p,q}=0.$$ In characteristic $0$, there are $m_{i,j}=min{r_i,s_j}$ generalized eigenspaces and the dimensions of the generalized eigenspaces are
        $$r_i+s_j-1,r_i+s_j-3,r_i+s_j-5,ldots,r_i+s_j+1-2m_{i,j}.$$



        This shows that
        $$operatorname{span}{C^{r_i,s_j}_{i,j}:lambda_i=mu_jbig}subseteqker Tsubseteq operatorname{span}{C_{i,j}^{p,q}:lambda_i=mu_jbig}.$$
        For $Xin M_n(Bbb{K})$, we write $X=sum_{i,j,p,q}x_{i,j}^{p,q}C_{i,j}$. If $Xin ker T$, then $x_{i,j}^{p,q}=0$ whenever $lambda_ineq mu_j$, and so
        $$T(X)=sum_{i,j,p,q}x_{i,j}^{p,q}(C_{i,j}^{p+1,q}-C_{i,j}^{p,q+1})=sum_{i,j}^{p,q}(x_{i,j}^{p-1,q}-x_{i,j}^{p,q-1})C_{i,j}^{p,q},$$
        with $x_{i,j}^{0,q}=x_{i,j}^{p,0}=0$. This is as far as I can go, but it tells me that, unless both $A$ and $B$ are diagonalizable, we have
        $$operatorname{span}{C^{r_i,s_j}_{i,j}:lambda_i=mu_jbig}subsetneq ker Tsubsetneq operatorname{span}{C_{i,j}^{p,q}:lambda_i=mu_jbig}.$$






        share|cite|improve this answer











        $endgroup$
















          1












          1








          1





          $begingroup$

          This is a partial answer, but for a generalized setting where $A,Bin M_n(Bbb{K})$ for some field $Bbb{K}$. Suppose that $A$ and $B$ are diagonalizable over $Bbb K$. Let $a_1,a_2,ldots,a_n$ be a complete set of eigenvectors of $A$ with eigenvalues $lambda_1,lambda_2,ldots,lambda_n$, and $b_1,b_2,ldots,b_n$ a complete set of eigenvectors of $B^t$ with eigenvealues $mu_1,mu_2,ldots,mu_n$. Then, the matrices $C_{i,j}=a_ib_j^t$ form a basis of $M_n(Bbb{K})$.



          Consider the linear map $T:M_n(Bbb{K})to M_n(Bbb{K})$ given by $T(X)=AX-XB$. If $X=sum_{i,j}x_{i,j}C_{i,j}$, then
          $$T(X)=sum_{i,j}x_{i,j}(lambda_i-mu_j)C_{i,j}.$$
          In particular, $T(X)=0$ if and only if $x_{i,j}=0$ whenever $lambda_ineq mu_j$. Therefore,
          $$ker T=operatorname{span}big{C_{i,j}:lambda_i=mu_jbig}.$$



          In your case, the matrices $A$ and $B$ are real symmetric (you didn't specify the field, so I guess the field is the reals), and so they are diagonalizable. Therefore, you can use my approach.



          However, if the field is $Bbb{C}$, then a symmetric matrix may not be diagonalizable. I do not know the answer if $A$ or $B$ is not diagonalizable. But the wiki link given by Tianlalu may help.





          After I made some calculations, I found this result. Assume that the characteristic polynomials of $A$ and $B$ split into linear factors over $Bbb{K}$. Suppose that $A$ has $k$ Jordan blocks of sizes $r_1,r_2,ldots,r_k$ with eigenvalues $lambda_1,lambda_2,ldots,lambda_k$, and suppose that $B$ has $l$ Jordan blocks of sizes $s_1,s_2,ldots,s_l$ with eigenvalues $mu_1,mu_2,ldots,mu_l$. That is, we have the following proposition.




          Proposition: Let $T:M_n(Bbb{K})to M_n(Bbb{K})$ be defined by $T(X)=AX-XB$, where $A$ and $B$ are fixed elements of $M_n(Bbb{K})$. Then, the linear map $T$ is diagonalizable over $Bbb K$ if and only if both $A$ and $B$ are diagonalizable over $Bbb{K}$.




          Write $(a_i^1,a_i^2,ldots,a_i^{r_i})$ for a generalized eigenvector sequence of $A$ in the $i$th block, that is,
          $$Aa_i^p=lambda_ia_i^p+a_i^{p+1}$$
          for $p=1,2,ldots,r_i$, with $a_i^{r_i+1}=0$. Similarly, write $(b_j^1,b_j^2,ldots,b_j^{s_j})$ for a generalized eigenvector sequence of $B$ in the $j$th block, that is,
          $$Bb_j^q=mu_jb_j^q+b_j^{q+1}$$
          for $q=1,2,ldots,s_j$, with $b_j^{s_j+1}=0$. Then, for fixed $i=1,2,ldots,k$ and $j=1,2,ldots,l$, the span of $C_{i,j}^{p,q}=a_i^p(b_j^q)^t$ with $p$ and $q$ ranging from ${1,2,ldots,r_i}$ and ${1,2,ldots,s_j}$ is a direct sum of generalized eigenspaces of $T$ with eigenvalue $lambda_i-mu_j$.

          This is because
          $$big(T-(lambda_i-mu_j)big)^hC_{i,j}^{p,q}=sum_{r=0}^h(-1)^rbinom{h}{r}C_{i,j}^{p+h-r,q+r},$$
          where $C_{i,j}^{p,q}=0$ if $p>r_i$ or $q>s_j$. Therefore, if $h>r_i-p+s_j-q$, then $$big(T-(lambda_i-mu_j)big)^hC_{i,j}^{p,q}=0.$$ In characteristic $0$, there are $m_{i,j}=min{r_i,s_j}$ generalized eigenspaces and the dimensions of the generalized eigenspaces are
          $$r_i+s_j-1,r_i+s_j-3,r_i+s_j-5,ldots,r_i+s_j+1-2m_{i,j}.$$



          This shows that
          $$operatorname{span}{C^{r_i,s_j}_{i,j}:lambda_i=mu_jbig}subseteqker Tsubseteq operatorname{span}{C_{i,j}^{p,q}:lambda_i=mu_jbig}.$$
          For $Xin M_n(Bbb{K})$, we write $X=sum_{i,j,p,q}x_{i,j}^{p,q}C_{i,j}$. If $Xin ker T$, then $x_{i,j}^{p,q}=0$ whenever $lambda_ineq mu_j$, and so
          $$T(X)=sum_{i,j,p,q}x_{i,j}^{p,q}(C_{i,j}^{p+1,q}-C_{i,j}^{p,q+1})=sum_{i,j}^{p,q}(x_{i,j}^{p-1,q}-x_{i,j}^{p,q-1})C_{i,j}^{p,q},$$
          with $x_{i,j}^{0,q}=x_{i,j}^{p,0}=0$. This is as far as I can go, but it tells me that, unless both $A$ and $B$ are diagonalizable, we have
          $$operatorname{span}{C^{r_i,s_j}_{i,j}:lambda_i=mu_jbig}subsetneq ker Tsubsetneq operatorname{span}{C_{i,j}^{p,q}:lambda_i=mu_jbig}.$$






          share|cite|improve this answer











          $endgroup$



          This is a partial answer, but for a generalized setting where $A,Bin M_n(Bbb{K})$ for some field $Bbb{K}$. Suppose that $A$ and $B$ are diagonalizable over $Bbb K$. Let $a_1,a_2,ldots,a_n$ be a complete set of eigenvectors of $A$ with eigenvalues $lambda_1,lambda_2,ldots,lambda_n$, and $b_1,b_2,ldots,b_n$ a complete set of eigenvectors of $B^t$ with eigenvealues $mu_1,mu_2,ldots,mu_n$. Then, the matrices $C_{i,j}=a_ib_j^t$ form a basis of $M_n(Bbb{K})$.



          Consider the linear map $T:M_n(Bbb{K})to M_n(Bbb{K})$ given by $T(X)=AX-XB$. If $X=sum_{i,j}x_{i,j}C_{i,j}$, then
          $$T(X)=sum_{i,j}x_{i,j}(lambda_i-mu_j)C_{i,j}.$$
          In particular, $T(X)=0$ if and only if $x_{i,j}=0$ whenever $lambda_ineq mu_j$. Therefore,
          $$ker T=operatorname{span}big{C_{i,j}:lambda_i=mu_jbig}.$$



          In your case, the matrices $A$ and $B$ are real symmetric (you didn't specify the field, so I guess the field is the reals), and so they are diagonalizable. Therefore, you can use my approach.



          However, if the field is $Bbb{C}$, then a symmetric matrix may not be diagonalizable. I do not know the answer if $A$ or $B$ is not diagonalizable. But the wiki link given by Tianlalu may help.





          After I made some calculations, I found this result. Assume that the characteristic polynomials of $A$ and $B$ split into linear factors over $Bbb{K}$. Suppose that $A$ has $k$ Jordan blocks of sizes $r_1,r_2,ldots,r_k$ with eigenvalues $lambda_1,lambda_2,ldots,lambda_k$, and suppose that $B$ has $l$ Jordan blocks of sizes $s_1,s_2,ldots,s_l$ with eigenvalues $mu_1,mu_2,ldots,mu_l$. That is, we have the following proposition.




Proposition: Let $T:M_n(\Bbb{K})\to M_n(\Bbb{K})$ be defined by $T(X)=AX-XB$, where $A$ and $B$ are fixed elements of $M_n(\Bbb{K})$. Then the linear map $T$ is diagonalizable over $\Bbb K$ if and only if both $A$ and $B$ are diagonalizable over $\Bbb{K}$.




Write $(a_i^1,a_i^2,\ldots,a_i^{r_i})$ for a generalized eigenvector sequence of $A$ in the $i$th block, that is,
$$Aa_i^p=\lambda_ia_i^p+a_i^{p+1}$$
for $p=1,2,\ldots,r_i$, with $a_i^{r_i+1}=0$. Similarly, write $(b_j^1,b_j^2,\ldots,b_j^{s_j})$ for a generalized eigenvector sequence of $B^t$ in the $j$th block, that is,
$$B^tb_j^q=\mu_jb_j^q+b_j^{q+1}$$
for $q=1,2,\ldots,s_j$, with $b_j^{s_j+1}=0$. Then, for fixed $i=1,2,\ldots,k$ and $j=1,2,\ldots,l$, the span of the matrices $C_{i,j}^{p,q}=a_i^p(b_j^q)^t$, with $p$ ranging over $\{1,2,\ldots,r_i\}$ and $q$ over $\{1,2,\ldots,s_j\}$, is a direct sum of generalized eigenspaces of $T$ with eigenvalue $\lambda_i-\mu_j$.

This is because
$$\big(T-(\lambda_i-\mu_j)\big)^hC_{i,j}^{p,q}=\sum_{r=0}^h(-1)^r\binom{h}{r}C_{i,j}^{p+h-r,q+r},$$
where $C_{i,j}^{p,q}=0$ if $p>r_i$ or $q>s_j$. Therefore, if $h>r_i-p+s_j-q$, then $$\big(T-(\lambda_i-\mu_j)\big)^hC_{i,j}^{p,q}=0.$$ In characteristic $0$, there are $m_{i,j}=\min\{r_i,s_j\}$ generalized eigenspaces, and their dimensions are
$$r_i+s_j-1,\ r_i+s_j-3,\ r_i+s_j-5,\ \ldots,\ r_i+s_j+1-2m_{i,j}.$$



This shows that
$$\operatorname{span}\big\{C^{r_i,s_j}_{i,j}:\lambda_i=\mu_j\big\}\subseteq\ker T\subseteq \operatorname{span}\big\{C_{i,j}^{p,q}:\lambda_i=\mu_j\big\}.$$
For $X\in M_n(\Bbb{K})$, write $X=\sum_{i,j,p,q}x_{i,j}^{p,q}C_{i,j}^{p,q}$. If $X\in \ker T$, then $x_{i,j}^{p,q}=0$ whenever $\lambda_i\neq \mu_j$, and so
$$T(X)=\sum_{i,j,p,q}x_{i,j}^{p,q}(C_{i,j}^{p+1,q}-C_{i,j}^{p,q+1})=\sum_{i,j,p,q}(x_{i,j}^{p-1,q}-x_{i,j}^{p,q-1})C_{i,j}^{p,q},$$
with $x_{i,j}^{0,q}=x_{i,j}^{p,0}=0$. This is as far as I can go, but it tells me that, unless both $A$ and $B$ are diagonalizable, we have
$$\operatorname{span}\big\{C^{r_i,s_j}_{i,j}:\lambda_i=\mu_j\big\}\subsetneq \ker T\subsetneq \operatorname{span}\big\{C_{i,j}^{p,q}:\lambda_i=\mu_j\big\}.$$







edited Dec 6 '18 at 19:06

answered Dec 6 '18 at 11:05

user593746

As stated in the comment from Christoph, here is a direct approach to solving $AX=XB$ when both $A$ and $B$ are square matrices of order $k$.




Proposition. Let $A=(a_{ij})_{k\times k}$, $B=(b_{ij})_{k\times k}$, and let the unknown be $X=(x_{ij})_{k\times k}$. Also let $$Y=(x_{11},x_{21},\cdots,x_{k1},x_{12},x_{22},\cdots,x_{k2},\cdots,x_{1k},x_{2k},\cdots,x_{kk})^{T}$$
be the column vector of dimension $k^2$. Then the solutions of $AX=XB$ are given by the solutions of the homogeneous equation $QY=0$, where $Q$ is the block matrix
$$Q=\begin{bmatrix}
Q_{11}&Q_{12}&\cdots&Q_{1k}\\
Q_{21}&Q_{22}&\cdots&Q_{2k}\\
\vdots&\vdots&\ddots&\vdots\\
Q_{k1}&Q_{k2}&\cdots&Q_{kk}\\
\end{bmatrix}$$

with $Q_{ii}=A-b_{ii}I$ (where $i=1,2,\dots,k$ and $I$ is the identity matrix of order $k$) and $Q_{ij}=-b_{ji}I$ for $i\neq j$.




The proof follows by direct matrix multiplication: writing out $AX=XB$ entry by entry gives a system of linear equations in the entries of $X$, which takes the form $QY=0$. Since the homogeneous system $QY=0$ always has solutions, this yields all solutions of $AX=XB$.





Example. Solve $AX=XB$, where $A=\begin{bmatrix}-4&1\\-9&2\end{bmatrix}$ and $B=\begin{bmatrix}-2&1\\-1&0\end{bmatrix}$.



From the proposition, we have
$$Q=\begin{bmatrix}
-2&1&1&0\\
-9&4&0&1\\
-1&0&-4&1\\
0&-1&-9&2\\
\end{bmatrix}\xrightarrow{\text{row operations}}
\begin{bmatrix}
1&0&4&-1\\
0&1&9&-2\\
0&0&0&0\\
0&0&0&0\\
\end{bmatrix}=P.$$

Since $PY=0$ has the same solutions as $QY=0$, we get
$$Y=(-4u+v,-9u+2v,u,v)^T,\qquad \forall u,v\in\Bbb C.$$
Therefore,
$$X=\begin{bmatrix}
-4u+v&u\\
-9u+2v&v\\
\end{bmatrix}=
u\begin{bmatrix}-4&1\\-9&0\end{bmatrix}+v\begin{bmatrix}1&0\\2&1\end{bmatrix},\qquad \forall u,v\in\Bbb C.$$





From the example we can see that the steps are:




• express $Q$ in terms of $A$ and $B$ and apply row operations to obtain $P$,


• solve $PY=0$ for $Y$,


• read off the entries of $X$ from the corresponding entries of $Y$.
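These steps can be carried out mechanically. Below is a minimal NumPy sketch; note that the block matrix $Q$ of the proposition is exactly the Kronecker expression $I\otimes A - B^T\otimes I$ for column-major vectorization, and the SVD is one common way to extract a null-space basis:

```python
import numpy as np

def solve_sylvester_homogeneous(A, B, tol=1e-10):
    """Return a basis (list of k-by-k matrices) of {X : AX = XB}.

    Column-major stacking vec(X) matches the vector Y in the proposition,
    and vec(AX - XB) = (kron(I, A) - kron(B.T, I)) vec(X) = Q vec(X).
    """
    k = A.shape[0]
    Q = np.kron(np.eye(k), A) - np.kron(B.T, np.eye(k))
    # Null space of Q: right singular vectors with (numerically) zero
    # singular value.
    _, s, Vt = np.linalg.svd(Q)
    basis_vecs = Vt[s <= tol]
    # order="F" un-stacks column by column, matching Y's layout.
    return [v.reshape((k, k), order="F") for v in basis_vecs]

# The matrices from the worked example above.
A = np.array([[-4.0, 1.0], [-9.0, 2.0]])
B = np.array([[-2.0, 1.0], [-1.0, 0.0]])
basis = solve_sylvester_homogeneous(A, B)
print(len(basis))                                      # 2-dimensional space
print(all(np.allclose(A @ X, X @ B) for X in basis))   # True
```

The two basis matrices returned are (up to an invertible change of basis) the same solution family $u\begin{bmatrix}-4&1\\-9&0\end{bmatrix}+v\begin{bmatrix}1&0\\2&1\end{bmatrix}$ found by hand above, since the SVD merely picks an orthonormal basis of the same null space.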







                  edited Dec 7 '18 at 5:16

























                  answered Dec 6 '18 at 16:36









Tianlalu
