Two equivalent norms on $\mathcal{L}(E)^n$












Let $E$ be a complex Hilbert space, and $\mathcal{L}(E)$ be the algebra of all bounded linear operators on $E$.



For ${\bf A}:=(A_1,\dots,A_n) \in \mathcal{L}(E)^n$ we recall the definitions of the following two norms on $\mathcal{L}(E)^n$:
$$\|{\bf A}\|=\sup_{\|x\|=1}\bigg(\sum_{k=1}^n\|A_kx\|^2\bigg)^{\frac{1}{2}},$$



and
$$\omega_e({\bf A})=\sup_{\|x\|=1}\bigg(\sum_{k=1}^n|\langle A_kx\;|\;x\rangle|^2\bigg)^{1/2}.$$




It is not difficult to prove that $\omega_e({\bf A}) \leq \|{\bf A}\|$. Are $\|\cdot\|$ and $\omega_e(\cdot)$ two equivalent norms on $\mathcal{L}(E)^n$? If so, I hope to find $\alpha$ such that
$$\alpha \|{\bf A}\|\leq \omega_e({\bf A}) \leq \|{\bf A}\|.$$
Note that if $n=1$, it is well known that
$$\frac{1}{2}\|A\|\leq \omega(A)\leq\|A\|.$$
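For concreteness, the easy inequality can be sanity-checked numerically, using the identity $\|{\bf A}\| = \big\|\sum_{k=1}^n A_k^*A_k\big\|^{1/2}$; below is a minimal sketch in Python/NumPy (the helper names `joint_norm` and `omega_e_sample` are my own, not standard):

```python
import numpy as np

def joint_norm(As):
    # ||A|| = || sum_k A_k^* A_k ||^{1/2}: the largest eigenvalue of a
    # positive semi-definite matrix, so this value is exact.
    S = sum(A.conj().T @ A for A in As)
    return float(np.sqrt(np.linalg.eigvalsh(S).max()))

def omega_e_sample(As, trials=5000, seed=0):
    # omega_e(A) estimated from below by sampling unit vectors x and
    # taking the max of (sum_k |<A_k x, x>|^2)^{1/2}.
    d = As[0].shape[0]
    rng = np.random.default_rng(seed)
    best = 0.0
    for _ in range(trials):
        x = rng.standard_normal(d) + 1j * rng.standard_normal(d)
        x /= np.linalg.norm(x)
        val = np.sqrt(sum(abs(np.vdot(x, A @ x)) ** 2 for A in As))
        best = max(best, float(val))
    return best

rng = np.random.default_rng(42)
As = [rng.standard_normal((4, 4)) for _ in range(3)]
# Any sampled value of omega_e is <= the true sup, hence <= ||A||.
print(omega_e_sample(As) <= joint_norm(As))  # True
```

Since the sampled $\omega_e$ is a lower bound for the true supremum, the printed comparison must hold for any tuple of matrices.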




Thank you for your help.










functional-analysis hilbert-spaces

asked Dec 11 '17 at 6:01 by Student, edited Dec 24 '17 at 5:48
          1 Answer
          Yes, these two norms are equivalent. Recall that the following two norms are equivalent:



$$ \Vert \cdot \Vert_{\max}\colon \mathbb{R}^n \rightarrow \mathbb{R}, \qquad \Vert (x_1, \dots, x_n )\Vert_{\max} = \max_{j\in \{1, \dots, n \}} \vert x_j \vert $$



          and



$$ \Vert \cdot \Vert_{2}\colon \mathbb{R}^n \rightarrow \mathbb{R}, \qquad \Vert (x_1, \dots, x_n )\Vert_{2} = \left( \sum_{j=1}^n \vert x_j \vert^2 \right)^{1/2}.$$



In fact, for all $y\in \mathbb{R}^n$ the following holds (these are the optimal constants):



$$ \Vert y \Vert_{\max} \leq \Vert y \Vert_2 \leq \sqrt{n}\, \Vert y \Vert_{\max}.$$
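These finite-dimensional bounds are easy to check numerically; a quick sketch (assuming NumPy, with illustrative tolerances):

```python
import numpy as np

rng = np.random.default_rng(0)
for _ in range(200):
    n = int(rng.integers(1, 12))
    y = rng.standard_normal(n)
    y_max = np.abs(y).max()    # ||y||_max
    y_two = np.linalg.norm(y)  # ||y||_2
    assert y_max <= y_two + 1e-12
    assert y_two <= np.sqrt(n) * y_max + 1e-12

# Both constants are attained: y = e_1 gives ||y||_2 = ||y||_max,
# and y = (1, ..., 1) gives ||y||_2 = sqrt(n) ||y||_max.
```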



          Note that



$$ \Vert A \Vert = \sup_{\Vert x \Vert = 1} \Vert (A_1x, \dots , A_n x)\Vert_2 $$



          and



$$ \omega_e(A) = \sup_{\Vert x \Vert = 1} \Vert (\vert \langle A_1 x,x\rangle \vert, \dots , \vert \langle A_n x, x \rangle\vert )\Vert_2. $$



          Furthermore, as pointed out by @Student we have



$$ \Vert A \Vert \leq 2 \sup_{\Vert x \Vert =1} \vert \langle Ax , x \rangle \vert .$$



This follows from the fact that for normal operators $B$ we have $\Vert B \Vert = \sup_{\Vert x \Vert=1} \vert \langle Bx, x \rangle \vert$ and the following computation:



$$ \Vert A \Vert
\leq \frac{1}{2} \big( \Vert A + A^\star \Vert + \Vert A - A^\star \Vert \big)
= \frac{1}{2} \Big( \sup_{\Vert x \Vert = 1} \vert \langle ( A + A^\star)x, x \rangle \vert + \sup_{\Vert x \Vert = 1} \vert \langle ( A - A^\star)x, x \rangle \vert\Big)
\leq 2 \sup_{\Vert x \Vert = 1} \vert \langle Ax, x \rangle \vert.$$



          Putting everything together yields



$$ \Vert A \Vert = \sup_{\Vert x \Vert = 1} \Vert (A_1x, \dots , A_n x)\Vert_2
\leq \sqrt{n} \sup_{\Vert x \Vert = 1} \Vert (A_1x, \dots , A_n x)\Vert_{\max} \leq 2 \sqrt{n} \sup_{\Vert x \Vert = 1} \Vert (\vert \langle A_1x, x \rangle \vert, \dots , \vert \langle A_n x, x \rangle \vert)\Vert_{\max}
\leq 2 \sqrt{n} \sup_{\Vert x \Vert = 1} \Vert (\vert \langle A_1x, x \rangle \vert, \dots , \vert \langle A_n x, x \rangle \vert)\Vert_{2}
= 2\sqrt{n}\, \omega_e(A). $$



Therefore, $\alpha= \frac{1}{2\sqrt{n}}$ will do the job.
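As an illustration of the resulting bound $\Vert A \Vert \leq 2\sqrt{n}\,\omega_e(A)$, here is a small numerical sketch on a hypothetical pair of $2\times 2$ shift matrices (my own example, assuming NumPy; $\Vert A \Vert$ is computed exactly via $\Vert \sum_k A_k^\star A_k \Vert^{1/2}$, and $\omega_e$ on a grid):

```python
import numpy as np

# Hypothetical test pair (n = 2): the two 2x2 shift matrices.
A1 = np.array([[0.0, 1.0], [0.0, 0.0]])
A2 = np.array([[0.0, 0.0], [1.0, 0.0]])
As = [A1, A2]
n = len(As)

# ||A|| is exact: largest eigenvalue of sum_k A_k^* A_k, square-rooted.
S = sum(A.T @ A for A in As)
joint = np.sqrt(np.linalg.eigvalsh(S).max())  # here S = I, so ||A|| = 1

# For these matrices |<A_k x, x>| = |x_1||x_2|, so the sup over the unit
# sphere is attained on real vectors x = (cos t, sin t).
t = np.linspace(0.0, np.pi / 2, 10001)
vals = np.sqrt(2.0) * np.abs(np.cos(t) * np.sin(t))
omega = vals.max()  # sqrt(2)/2 up to grid error

assert joint <= 2 * np.sqrt(n) * omega + 1e-6  # ||A|| <= 2 sqrt(n) omega_e(A)
```

Here $\omega_e(A)=\sqrt{2}/2$ while the lower bound $\Vert A\Vert/(2\sqrt{2})\approx 0.354$, so the inequality holds with room to spare for this pair.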



Added: I was asked to address the optimality of the constants in the inequality. The estimate $\omega_e(A) \leq \Vert A \Vert$ is sharp, as we can take $A=(\mathrm{Id}, \dots, \mathrm{Id})$, and for this choice we get equality.



The other estimate is more difficult. I only managed to prove optimality under the additional assumption $\dim_{\mathbb{C}}(E)\geq n+1$. In case someone has some insight for lower dimensions, please let me know.



If we assume $\dim_{\mathbb{C}}(E)\geq n+1$, then it suffices to consider the case $E=\mathbb{C}^{n+1}$ with the standard scalar product (otherwise choose a subspace of dimension $n+1$ and identify it with $\mathbb{C}^{n+1}$). We choose $A_k\colon \mathbb{C}^{n+1} \rightarrow \mathbb{C}^{n+1}$ such that
$$ A_k (x_1, \dots, x_{n+1})= (0, \dots,0 , x_1, 0, \dots, 0) $$
where $x_1$ is in the $k$-th slot. The corresponding matrix consists of all zeros and a one in the $k$-th row of the first column, e.g. for $n=2$, $k=2$,
$$ A_2 = \begin{pmatrix}
0 & 0 & 0 \\
1 & 0 & 0 \\
0 & 0 & 0
\end{pmatrix}.$$

Now we set
$$ A=(A_2, \dots, A_{n+1} ).$$
We have
$$ \Vert A_k x \Vert^2 = \langle (0, \dots, x_1, \dots, 0),(0, \dots, x_1, \dots, 0)\rangle = \vert x_1 \vert^2 .$$
Thus, we get
$$ \Vert A \Vert = \sup_{\Vert x \Vert = 1} \sqrt{n}\, \vert x_1 \vert = \sqrt{n} .$$
On the other hand, we have
$$ \vert \langle A_k x, x \rangle \vert^2 = \vert \langle (0, \dots, x_1, \dots, 0), (x_1, \dots, x_{n+1}) \rangle \vert^2 = \vert x_1 \vert^2 \cdot \vert x_k \vert^2 $$
and hence, for $\Vert x \Vert=1$,
$$ \left(\sum_{k=2}^{n+1} \vert \langle A_k x, x \rangle \vert^2\right)^{\frac{1}{2}}
= \left(\vert x_1 \vert^2 \cdot \sum_{k=2}^{n+1} \vert x_k \vert^2 \right)^{\frac{1}{2}}
= \left(\vert x_1 \vert^2 \cdot (1- \vert x_1 \vert^2)\right)^{\frac{1}{2}}
= \vert x_1 \vert \cdot \sqrt{1- \vert x_1 \vert^2}.$$

Hence, we get
$$ \omega_e(A) = \sup_{\Vert x \Vert=1} \vert x_1 \vert \cdot \sqrt{1- \vert x_1 \vert^2}
= \sup_{\vert x_1 \vert\leq 1} \vert x_1 \vert \cdot \sqrt{1- \vert x_1 \vert^2}
= \frac{1}{2}. $$

Thus, we finally get
$$ \frac{1}{2\sqrt{n}} \Vert A \Vert = \frac{1}{2\sqrt{n}} \cdot \sqrt{n} = \frac{1}{2} = \omega_e(A). $$
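The sharpness computation above can be verified numerically; a sketch assuming NumPy, with $n=3$ and $E=\mathbb{C}^4$ (the grid estimate of $\omega_e$ uses the family $x=(\cos t,\sin t,0,0)$, on which the sup is attained):

```python
import numpy as np

# Sharpness example with n = 3, E = C^4: A_k has a single 1 in the
# k-th row of the first column (k = 2, ..., n+1 in 1-based indexing).
n = 3
d = n + 1
As = []
for k in range(1, n + 1):  # 0-based row index k = rows 2..n+1
    A = np.zeros((d, d))
    A[k, 0] = 1.0
    As.append(A)

# ||A|| = || sum_k A_k^* A_k ||^{1/2} = sqrt(n), exactly.
S = sum(A.T @ A for A in As)  # = n * e_1 e_1^T
joint = np.sqrt(np.linalg.eigvalsh(S).max())
assert np.isclose(joint, np.sqrt(n))

# omega_e(A): on x = (cos t, sin t, 0, 0) the value is
# |x_1| sqrt(1 - |x_1|^2) = cos t sin t, maximized at t = pi/4.
t = np.linspace(0.0, np.pi / 2, 10001)
x = np.zeros((t.size, d))
x[:, 0], x[:, 1] = np.cos(t), np.sin(t)
vals = np.sqrt(sum(np.abs(np.einsum('ij,tj,ti->t', A, x, x)) ** 2 for A in As))
omega = vals.max()
assert np.isclose(omega, 0.5, atol=1e-4)  # omega_e(A) = 1/2 = ||A|| / (2 sqrt(n))
```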






• @Student I added some details. Is it fine or should I elaborate a bit more?
  – Severin Schraven, Dec 20 '17 at 11:06










• @Student Thank you for pointing out my mistakes. I'll correct them as soon as possible.
  – Severin Schraven, Dec 21 '17 at 11:38










• Dear Professor Severin. I think that since the inequalities $$ \Vert y \Vert_{\max} \leq \Vert y \Vert_2 \leq \sqrt{n}\, \Vert y \Vert_{\max}$$ are sharp, i.e. $1,\sqrt{n}$ are the optimal constants, the inequalities $$\frac{1}{2\sqrt{n}}\|\mathbf{A}\|\leq \omega_e(\mathbf{A})\leq \|\mathbf{A}\|$$ are sharp (the constants $\frac{1}{2\sqrt{n}}$ and $1$ are the best). Do you agree with me? Thanks a lot.
  – Student, Jan 10 at 10:15








• Dear Severin, I would like to thank you for your effort to help me. For some time I have been trying to find an example. According to ([arXiv][1], page 25), we can take $A_k=\frac{1}{\sqrt{n}}T$ for all $k\in \{1,\cdots,n\}$ with $$T=\begin{pmatrix}0&0\\1&0\end{pmatrix}.$$ [1]: arxiv.org/pdf/math/0410492.pdf
  – Student, Jan 11 at 6:05






• We obtain
  $$w(A_1,\cdots,A_n)=\sqrt{n}\,w\Big(\tfrac{1}{\sqrt{n}}T\Big)=w(T)=\frac{1}{2},$$ however, $$\frac{1}{2\sqrt{n}}\left\|\sum_{k=1}^nA_k^*A_k \right\|^{1/2}=\frac{1}{2\sqrt{n}}\sqrt{n}\,\Big\|\tfrac{1}{\sqrt{n}}T\Big\|=\frac{1}{2\sqrt{n}}.$$
  – Student, Jan 11 at 6:06













          Your Answer





          StackExchange.ifUsing("editor", function () {
          return StackExchange.using("mathjaxEditing", function () {
          StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
          StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
          });
          });
          }, "mathjax-editing");

          StackExchange.ready(function() {
          var channelOptions = {
          tags: "".split(" "),
          id: "69"
          };
          initTagRenderer("".split(" "), "".split(" "), channelOptions);

          StackExchange.using("externalEditor", function() {
          // Have to fire editor after snippets, if snippets enabled
          if (StackExchange.settings.snippets.snippetsEnabled) {
          StackExchange.using("snippets", function() {
          createEditor();
          });
          }
          else {
          createEditor();
          }
          });

          function createEditor() {
          StackExchange.prepareEditor({
          heartbeatType: 'answer',
          autoActivateHeartbeat: false,
          convertImagesToLinks: true,
          noModals: true,
          showLowRepImageUploadWarning: true,
          reputationToPostImages: 10,
          bindNavPrevention: true,
          postfix: "",
          imageUploader: {
          brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
          contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
          allowUrls: true
          },
          noCode: true, onDemand: true,
          discardSelector: ".discard-answer"
          ,immediatelyShowMarkdownHelp:true
          });


          }
          });














          draft saved

          draft discarded


















          StackExchange.ready(
          function () {
          StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f2561166%2ftwo-equivalent-norms-on-mathcallen%23new-answer', 'question_page');
          }
          );

          Post as a guest















          Required, but never shown

























          1 Answer
          1






          active

          oldest

          votes








          1 Answer
          1






          active

          oldest

          votes









          active

          oldest

          votes






          active

          oldest

          votes









          2





          +50







          $begingroup$

          Yes, these two norms are equivalent. Recall that the following two norms are equivalent:



          $$ Vert cdot Vert_{max}: mathbb{R}^n rightarrow mathbb{R}, Vert (x_1, dots, x_n )Vert_{max} = max_{jin {1, dots, n }} vert x_j vert $$



          and



          $$ Vert cdot Vert_{2}: mathbb{R}^n rightarrow mathbb{R}, Vert (x_1, dots, x_n )Vert_{2} = left( sum_{j=1}^n vert x_j vert^2 right)^{1/2}.$$



          In fact, we have that for all $yin mathbb{R}^n$ holds (this are the optimal constants)



          $$ Vert y Vert_{max} leq Vert y Vert_2 leq sqrt{n} Vert y Vert_{max}.$$



          Note that



          $$ Vert A Vert = sup_{Vert x Vert = 1} Vert (A_1x, dots , A_n x)Vert_2 $$



          and



          $$ omega_e(A) = sup_{Vert x Vert = 1} Vert (vert langle A_1 x,xrangle vert dots , vert langle A_n x, x ranglevert )Vert_2 $$



          Furthermore, as pointed out by @Student we have



          $$ Vert A Vert leq 2 sup_{Vert x Vert =1} vert langle Ax , x rangle vert .$$



          This follows from the fact that for normal operators $B$ we have $Vert B Vert = sup_{Vert x Vert=1} vert langle Bx, x rangle vert$ and the following computation



          $$ Vert A Vert
          leq frac{1}{2} ( Vert A + A^star Vert + Vert A - A^star Vert )
          = frac{1}{2} ( sup_{Vert x Vert = 1} vert langle ( A + A^star)x, x rangle vert + sup_{Vert x Vert = 1} vert langle ( A - A^star)x, x rangle vert)
          leq 2 sup_{Vert x Vert = 1} vert langle Ax, x rangle vert.$$



          Putting everything together yields



          $$ Vert A Vert = sup_{Vert x Vert = 1} Vert (A_1x, dots , A_n x)Vert_2
          leq sqrt{n} sup_{Vert x Vert = 1} Vert (A_1x, dots , A_n x)Vert_{max} leq 2 sqrt{n} sup_{Vert x Vert = 1} Vert (vert langle A_1x, x rangle vert, dots , vert langle A_n x, x rangle vert)Vert_{max}
          leq 2 sqrt{n} sup_{Vert x Vert = 1} Vert (vert langle A_1x, x rangle vert, dots , vert langle A_n x, x rangle vert)Vert_{2}
          = 2sqrt{n} omega_e(A). $$



          Therefore, $alpha= frac{1}{2sqrt{n}}$ will do the job.



          Added: I was asked to adress the optimality of the constants in the inequality. The estimate $omega_e(A) leq Vert A Vert$ is sharp, as we can take $A=(Id, dots, Id)$ and for this choice we get equality.



          The other estimate is more difficult. I only managed to prove optimality under the additional assumption $dim_mathbb{C}(E)geq n+1$. In case someone has some insight for lower dimension please let me know.



          If we assume $dim_mathbb{C}(E)geq n+1$, then it suffices to consider the case $E=mathbb{C}^{n+1}$ with the standard scalar product (otherwise choose a subspace of dimension $n+1$ and identify it with $mathbb{C}^{n+1}$). We choose $A_k: mathbb{C}^{n+1} rightarrow mathbb{C}^{n+1}$ such that
          $$ A_k (x_1, dots, x_{n+1})= (0, dots,0 , x_1, 0, dots, 0) $$
          where $x_1$ is in the kth slot. The corresponding matrix consists of all zeros and a one in the kth row of the first column, e.g. for $n=3, k=2$
          $$ A_2 = begin{pmatrix}
          0 & 0 & 0 \
          1 & 0 & 0 \
          0 & 0 & 0
          end{pmatrix}.$$

          Now we set
          $$ A=(A_2, dots, A_{n+1} )$$
          We have
          $$ Vert A_k x Vert^2 = langle (0, dots, x_1, dots, 0),(0, dots, x_1, dots, 0)rangle = vert x_1 vert^2 $$
          Thus, we get
          $$ Vert A Vert = sup_{Vert x Vert = 1} sqrt{n} vert x_1 vert = sqrt{n} $$
          On the other hand we have
          $$ vert langle A_k x, x rangle vert^2 = vert langle (0, dots, x_1, dots, 0), (x_1, dots, x_{n+1}) rangle vert^2 = vert x_1 vert^2 cdot vert x_k vert^2 $$
          And hence, for $Vert x Vert=1$
          $$ left(sum_{k=2}^{n+1} vert langle A_k x, x rangle vert^2right)^frac{1}{2}
          = left(vert x_1 vert^2 cdot sum_{k=2}^{n+1} vert x_k vert^2 right)^frac{1}{2}
          = left(vert x_1 vert^2 cdot (1- vert x_1 vert^2)right)^frac{1}{2}
          = vert x_1 vert cdot sqrt{1- vert x_1 vert^2}$$

          Hence, we get
          $$ omega_e(A) = sup_{Vert x Vert=1} vert x_1 vert cdot sqrt{1- vert x_1 vert^2}
          = sup_{vert x_1 vertleq 1} vert x_1 vert cdot sqrt{1- vert x_1 vert^2}
          = frac{1}{2} $$

          Thus, we finally get
          $$ frac{1}{2sqrt{n}} Vert A Vert = frac{1}{2sqrt{n}} cdot sqrt{n} = frac{1}{2} = omega_e(A) $$






          share|cite|improve this answer











          $endgroup$













          • $begingroup$
            @Student I added some details. Is it fine or should I elaborate a bit more?
            $endgroup$
            – Severin Schraven
            Dec 20 '17 at 11:06










          • $begingroup$
            @Student Thank you for pointing out my mistakes. I'll correct them as soon as possible.
            $endgroup$
            – Severin Schraven
            Dec 21 '17 at 11:38










          • $begingroup$
            Dear Professor Severin. I think that since the inequalities $$ Vert y Vert_{max} leq Vert y Vert_2 leq sqrt{n} Vert y Vert_{max},$$ are sharp, i.e. $1,sqrt{n}$ are the optimal constants, then the inequalities $$frac{1}{2sqrt{n}}|mathbf{A}|leq omega_e(mathbf{A})leq |mathbf{A}|,$$ are sharp (the constants $frac{1}{2sqrt{n}}$ and $1$ are the best). Do you agree with me? Thanks a lot.
            $endgroup$
            – Student
            Jan 10 at 10:15








          • 1




            $begingroup$
            Dear Severin, I would like to thank you for your effort in order to help me. Since a period I'm trying to find an example. According to ([arXiv][1], page 25), if we take $A_k=frac{1}{sqrt{n}}T$ for all $kin {1,cdots,n}$ with $$T=begin{pmatrix}0&0\1&0end{pmatrix}.$$ [1]: arxiv.org/pdf/math/0410492.pdf
            $endgroup$
            – Student
            Jan 11 at 6:05






          • 1




            $begingroup$
            We obtain: Hence, $$w(A_1,cdots,A_n)=sqrt{n}w(frac{1}{sqrt{n}}T)=w(T)=frac{1}{2},$$ however, $$displaystylefrac{1}{2sqrt{n}}left|displaystylesum_{k=1}^nA_k^*A_k right|^{1/2}=frac{1}{2sqrt{n}}sqrt{n}|frac{1}{sqrt{n}}T|=frac{1}{2sqrt{n}}.$$
            $endgroup$
            – Student
            Jan 11 at 6:06


















          2





          +50







          $begingroup$

          Yes, these two norms are equivalent. Recall that the following two norms are equivalent:



          $$ Vert cdot Vert_{max}: mathbb{R}^n rightarrow mathbb{R}, Vert (x_1, dots, x_n )Vert_{max} = max_{jin {1, dots, n }} vert x_j vert $$



          and



          $$ Vert cdot Vert_{2}: mathbb{R}^n rightarrow mathbb{R}, Vert (x_1, dots, x_n )Vert_{2} = left( sum_{j=1}^n vert x_j vert^2 right)^{1/2}.$$



          In fact, we have that for all $yin mathbb{R}^n$ holds (this are the optimal constants)



          $$ Vert y Vert_{max} leq Vert y Vert_2 leq sqrt{n} Vert y Vert_{max}.$$



          Note that



          $$ Vert A Vert = sup_{Vert x Vert = 1} Vert (A_1x, dots , A_n x)Vert_2 $$



          and



          $$ omega_e(A) = sup_{Vert x Vert = 1} Vert (vert langle A_1 x,xrangle vert dots , vert langle A_n x, x ranglevert )Vert_2 $$



          Furthermore, as pointed out by @Student we have



          $$ Vert A Vert leq 2 sup_{Vert x Vert =1} vert langle Ax , x rangle vert .$$



          This follows from the fact that for normal operators $B$ we have $Vert B Vert = sup_{Vert x Vert=1} vert langle Bx, x rangle vert$ and the following computation



          $$ Vert A Vert
          leq frac{1}{2} ( Vert A + A^star Vert + Vert A - A^star Vert )
          = frac{1}{2} ( sup_{Vert x Vert = 1} vert langle ( A + A^star)x, x rangle vert + sup_{Vert x Vert = 1} vert langle ( A - A^star)x, x rangle vert)
          leq 2 sup_{Vert x Vert = 1} vert langle Ax, x rangle vert.$$



          Putting everything together yields



          $$ Vert A Vert = sup_{Vert x Vert = 1} Vert (A_1x, dots , A_n x)Vert_2
          leq sqrt{n} sup_{Vert x Vert = 1} Vert (A_1x, dots , A_n x)Vert_{max} leq 2 sqrt{n} sup_{Vert x Vert = 1} Vert (vert langle A_1x, x rangle vert, dots , vert langle A_n x, x rangle vert)Vert_{max}
          leq 2 sqrt{n} sup_{Vert x Vert = 1} Vert (vert langle A_1x, x rangle vert, dots , vert langle A_n x, x rangle vert)Vert_{2}
          = 2sqrt{n} omega_e(A). $$



          Therefore, $alpha= frac{1}{2sqrt{n}}$ will do the job.



          Added: I was asked to adress the optimality of the constants in the inequality. The estimate $omega_e(A) leq Vert A Vert$ is sharp, as we can take $A=(Id, dots, Id)$ and for this choice we get equality.



          The other estimate is more difficult. I only managed to prove optimality under the additional assumption $dim_mathbb{C}(E)geq n+1$. In case someone has some insight for lower dimension please let me know.



          If we assume $dim_mathbb{C}(E)geq n+1$, then it suffices to consider the case $E=mathbb{C}^{n+1}$ with the standard scalar product (otherwise choose a subspace of dimension $n+1$ and identify it with $mathbb{C}^{n+1}$). We choose $A_k: mathbb{C}^{n+1} rightarrow mathbb{C}^{n+1}$ such that
          $$ A_k (x_1, dots, x_{n+1})= (0, dots,0 , x_1, 0, dots, 0) $$
          where $x_1$ is in the kth slot. The corresponding matrix consists of all zeros and a one in the kth row of the first column, e.g. for $n=3, k=2$
          $$ A_2 = begin{pmatrix}
          0 & 0 & 0 \
          1 & 0 & 0 \
          0 & 0 & 0
          end{pmatrix}.$$

          Now we set
          $$ A=(A_2, dots, A_{n+1} )$$
          We have
          $$ Vert A_k x Vert^2 = langle (0, dots, x_1, dots, 0),(0, dots, x_1, dots, 0)rangle = vert x_1 vert^2 $$
          Thus, we get
          $$ Vert A Vert = sup_{Vert x Vert = 1} sqrt{n} vert x_1 vert = sqrt{n} $$
          On the other hand we have
          $$ vert langle A_k x, x rangle vert^2 = vert langle (0, dots, x_1, dots, 0), (x_1, dots, x_{n+1}) rangle vert^2 = vert x_1 vert^2 cdot vert x_k vert^2 $$
          And hence, for $Vert x Vert=1$
          $$ left(sum_{k=2}^{n+1} vert langle A_k x, x rangle vert^2right)^frac{1}{2}
          = left(vert x_1 vert^2 cdot sum_{k=2}^{n+1} vert x_k vert^2 right)^frac{1}{2}
          = left(vert x_1 vert^2 cdot (1- vert x_1 vert^2)right)^frac{1}{2}
          = vert x_1 vert cdot sqrt{1- vert x_1 vert^2}$$

          Hence, we get
          $$ omega_e(A) = sup_{Vert x Vert=1} vert x_1 vert cdot sqrt{1- vert x_1 vert^2}
          = sup_{vert x_1 vertleq 1} vert x_1 vert cdot sqrt{1- vert x_1 vert^2}
          = frac{1}{2} $$

          Thus, we finally get
          $$ frac{1}{2sqrt{n}} Vert A Vert = frac{1}{2sqrt{n}} cdot sqrt{n} = frac{1}{2} = omega_e(A) $$






          share|cite|improve this answer











          $endgroup$













          • $begingroup$
            @Student I added some details. Is it fine or should I elaborate a bit more?
            $endgroup$
            – Severin Schraven
            Dec 20 '17 at 11:06










          • $begingroup$
            @Student Thank you for pointing out my mistakes. I'll correct them as soon as possible.
            $endgroup$
            – Severin Schraven
            Dec 21 '17 at 11:38










          • $begingroup$
            Dear Professor Severin. I think that since the inequalities $$ Vert y Vert_{max} leq Vert y Vert_2 leq sqrt{n} Vert y Vert_{max},$$ are sharp, i.e. $1,sqrt{n}$ are the optimal constants, then the inequalities $$frac{1}{2sqrt{n}}|mathbf{A}|leq omega_e(mathbf{A})leq |mathbf{A}|,$$ are sharp (the constants $frac{1}{2sqrt{n}}$ and $1$ are the best). Do you agree with me? Thanks a lot.
            $endgroup$
            – Student
            Jan 10 at 10:15








          • 1




            $begingroup$
            Dear Severin, I would like to thank you for your effort in order to help me. Since a period I'm trying to find an example. According to ([arXiv][1], page 25), if we take $A_k=frac{1}{sqrt{n}}T$ for all $kin {1,cdots,n}$ with $$T=begin{pmatrix}0&0\1&0end{pmatrix}.$$ [1]: arxiv.org/pdf/math/0410492.pdf
            $endgroup$
            – Student
            Jan 11 at 6:05






          • 1




            $begingroup$
            We obtain: Hence, $$w(A_1,cdots,A_n)=sqrt{n}w(frac{1}{sqrt{n}}T)=w(T)=frac{1}{2},$$ however, $$displaystylefrac{1}{2sqrt{n}}left|displaystylesum_{k=1}^nA_k^*A_k right|^{1/2}=frac{1}{2sqrt{n}}sqrt{n}|frac{1}{sqrt{n}}T|=frac{1}{2sqrt{n}}.$$
            $endgroup$
            – Student
            Jan 11 at 6:06
















          2





          +50







          2





          +50



          2




          +50



          $begingroup$

          Yes, these two norms are equivalent. Recall that the following two norms are equivalent:



          $$ Vert cdot Vert_{max}: mathbb{R}^n rightarrow mathbb{R}, Vert (x_1, dots, x_n )Vert_{max} = max_{jin {1, dots, n }} vert x_j vert $$



          and



          $$ Vert cdot Vert_{2}: mathbb{R}^n rightarrow mathbb{R}, Vert (x_1, dots, x_n )Vert_{2} = left( sum_{j=1}^n vert x_j vert^2 right)^{1/2}.$$



          In fact, we have that for all $yin mathbb{R}^n$ holds (this are the optimal constants)



          $$ Vert y Vert_{max} leq Vert y Vert_2 leq sqrt{n} Vert y Vert_{max}.$$



          Note that



          $$ Vert A Vert = sup_{Vert x Vert = 1} Vert (A_1x, dots , A_n x)Vert_2 $$



          and



          $$ omega_e(A) = sup_{Vert x Vert = 1} Vert (vert langle A_1 x,xrangle vert dots , vert langle A_n x, x ranglevert )Vert_2 $$



          Furthermore, as pointed out by @Student we have



          $$ Vert A Vert leq 2 sup_{Vert x Vert =1} vert langle Ax , x rangle vert .$$



          This follows from the fact that for normal operators $B$ we have $Vert B Vert = sup_{Vert x Vert=1} vert langle Bx, x rangle vert$ and the following computation



          $$ Vert A Vert
          leq frac{1}{2} ( Vert A + A^star Vert + Vert A - A^star Vert )
          = frac{1}{2} ( sup_{Vert x Vert = 1} vert langle ( A + A^star)x, x rangle vert + sup_{Vert x Vert = 1} vert langle ( A - A^star)x, x rangle vert)
          leq 2 sup_{Vert x Vert = 1} vert langle Ax, x rangle vert.$$



          Putting everything together yields



          $$ Vert A Vert = sup_{Vert x Vert = 1} Vert (A_1x, dots , A_n x)Vert_2
          leq sqrt{n} sup_{Vert x Vert = 1} Vert (A_1x, dots , A_n x)Vert_{max} leq 2 sqrt{n} sup_{Vert x Vert = 1} Vert (vert langle A_1x, x rangle vert, dots , vert langle A_n x, x rangle vert)Vert_{max}
          leq 2 sqrt{n} sup_{Vert x Vert = 1} Vert (vert langle A_1x, x rangle vert, dots , vert langle A_n x, x rangle vert)Vert_{2}
          = 2sqrt{n} omega_e(A). $$



          Therefore, $alpha= frac{1}{2sqrt{n}}$ will do the job.



          Added: I was asked to adress the optimality of the constants in the inequality. The estimate $omega_e(A) leq Vert A Vert$ is sharp, as we can take $A=(Id, dots, Id)$ and for this choice we get equality.



          The other estimate is more difficult. I only managed to prove optimality under the additional assumption $dim_mathbb{C}(E)geq n+1$. In case someone has some insight for lower dimension please let me know.



          If we assume $dim_mathbb{C}(E)geq n+1$, then it suffices to consider the case $E=mathbb{C}^{n+1}$ with the standard scalar product (otherwise choose a subspace of dimension $n+1$ and identify it with $mathbb{C}^{n+1}$). We choose $A_k: mathbb{C}^{n+1} rightarrow mathbb{C}^{n+1}$ such that
          $$ A_k (x_1, dots, x_{n+1})= (0, dots,0 , x_1, 0, dots, 0) $$
          where $x_1$ is in the kth slot. The corresponding matrix consists of all zeros and a one in the kth row of the first column, e.g. for $n=3, k=2$
          $$ A_2 = begin{pmatrix}
          0 & 0 & 0 \
          1 & 0 & 0 \
          0 & 0 & 0
          end{pmatrix}.$$

          Now we set
          $$ A=(A_2, dots, A_{n+1} )$$
          We have
          $$ Vert A_k x Vert^2 = langle (0, dots, x_1, dots, 0),(0, dots, x_1, dots, 0)rangle = vert x_1 vert^2 $$
          Thus, we get
          $$ Vert A Vert = sup_{Vert x Vert = 1} sqrt{n} vert x_1 vert = sqrt{n} $$
          On the other hand we have
          $$ vert langle A_k x, x rangle vert^2 = vert langle (0, dots, x_1, dots, 0), (x_1, dots, x_{n+1}) rangle vert^2 = vert x_1 vert^2 cdot vert x_k vert^2 $$
          And hence, for $Vert x Vert=1$
          $$ left(sum_{k=2}^{n+1} vert langle A_k x, x rangle vert^2right)^frac{1}{2}
          = left(vert x_1 vert^2 cdot sum_{k=2}^{n+1} vert x_k vert^2 right)^frac{1}{2}
          = left(vert x_1 vert^2 cdot (1- vert x_1 vert^2)right)^frac{1}{2}
          = vert x_1 vert cdot sqrt{1- vert x_1 vert^2}$$

          Hence, we get
          $$ omega_e(A) = sup_{Vert x Vert=1} vert x_1 vert cdot sqrt{1- vert x_1 vert^2}
          = sup_{vert x_1 vertleq 1} vert x_1 vert cdot sqrt{1- vert x_1 vert^2}
          = frac{1}{2} $$

          Thus, we finally get
          $$ frac{1}{2sqrt{n}} Vert A Vert = frac{1}{2sqrt{n}} cdot sqrt{n} = frac{1}{2} = omega_e(A) $$






          share|cite|improve this answer











          $endgroup$



          Yes, these two norms are equivalent. Recall that the following two norms are equivalent:



          $$ Vert cdot Vert_{max}: mathbb{R}^n rightarrow mathbb{R}, Vert (x_1, dots, x_n )Vert_{max} = max_{jin {1, dots, n }} vert x_j vert $$



          and



          $$ Vert cdot Vert_{2}: mathbb{R}^n rightarrow mathbb{R}, Vert (x_1, dots, x_n )Vert_{2} = left( sum_{j=1}^n vert x_j vert^2 right)^{1/2}.$$



          In fact, we have that for all $yin mathbb{R}^n$ holds (this are the optimal constants)



          $$ Vert y Vert_{max} leq Vert y Vert_2 leq sqrt{n} Vert y Vert_{max}.$$



          Note that



          $$ Vert A Vert = sup_{Vert x Vert = 1} Vert (A_1x, dots , A_n x)Vert_2 $$



          and



          $$ omega_e(A) = sup_{Vert x Vert = 1} Vert (vert langle A_1 x,xrangle vert dots , vert langle A_n x, x ranglevert )Vert_2 $$



Furthermore, as pointed out by @Student, for any single operator $A$ we have

$$ \Vert A \Vert \leq 2 \sup_{\Vert x \Vert =1} \vert \langle Ax , x \rangle \vert .$$

This follows from the decomposition $A = \frac{1}{2}(A+A^\star) + \frac{1}{2}(A-A^\star)$, the fact that for normal operators $B$ we have $\Vert B \Vert = \sup_{\Vert x \Vert=1} \vert \langle Bx, x \rangle \vert$ (note that $A+A^\star$ is self-adjoint and $A-A^\star$ is skew-adjoint, hence both are normal), and the computation

$$ \Vert A \Vert
\leq \frac{1}{2} \left( \Vert A + A^\star \Vert + \Vert A - A^\star \Vert \right)
= \frac{1}{2} \left( \sup_{\Vert x \Vert = 1} \vert \langle ( A + A^\star)x, x \rangle \vert + \sup_{\Vert x \Vert = 1} \vert \langle ( A - A^\star)x, x \rangle \vert\right)
\leq 2 \sup_{\Vert x \Vert = 1} \vert \langle Ax, x \rangle \vert.$$



Putting everything together yields

$$ \Vert A \Vert = \sup_{\Vert x \Vert = 1} \Vert (A_1x, \dots , A_n x)\Vert_2
\leq \sqrt{n} \sup_{\Vert x \Vert = 1} \Vert (A_1x, \dots , A_n x)\Vert_{\max}
\leq 2 \sqrt{n} \sup_{\Vert x \Vert = 1} \Vert (\vert \langle A_1x, x \rangle \vert, \dots , \vert \langle A_n x, x \rangle \vert)\Vert_{\max}
\leq 2 \sqrt{n} \sup_{\Vert x \Vert = 1} \Vert (\vert \langle A_1x, x \rangle \vert, \dots , \vert \langle A_n x, x \rangle \vert)\Vert_{2}
= 2\sqrt{n}\, \omega_e(A). $$

Therefore, $\alpha= \frac{1}{2\sqrt{n}}$ will do the job.
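A numerical sketch of the two norms (not from the original answer; the dimensions and sample count are ad hoc choices). Since $\sum_k \Vert A_k x\Vert^2 = \langle (\sum_k A_k^\star A_k)x, x\rangle$, the norm $\Vert A \Vert$ equals $\Vert \sum_k A_k^\star A_k \Vert^{1/2}$ and is exactly computable, while $\omega_e(A)$ is estimated from below by sampling unit vectors, so only the direction $\omega_e(A) \leq \Vert A \Vert$ is rigorously checkable here:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 3, 2  # n operators on C^d (arbitrary choices)
A = [rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d)) for _ in range(n)]

# ||A|| = || sum_k A_k^* A_k ||^{1/2}, a positive self-adjoint operator's norm
S = sum(Ak.conj().T @ Ak for Ak in A)
norm_A = np.sqrt(np.linalg.norm(S, 2))

# Lower bound for omega_e(A) by sampling random unit vectors x
w_est = 0.0
for _ in range(20000):
    x = rng.standard_normal(d) + 1j * rng.standard_normal(d)
    x /= np.linalg.norm(x)
    val = np.sqrt(sum(abs(np.vdot(x, Ak @ x)) ** 2 for Ak in A))
    w_est = max(w_est, val)

# The easy direction omega_e(A) <= ||A|| holds for every sampled x
assert w_est <= norm_A + 1e-9
print(f"||A|| = {norm_A:.4f}, sampled omega_e >= {w_est:.4f}, "
      f"lower bound ||A||/(2 sqrt(n)) = {norm_A / (2 * np.sqrt(n)):.4f}")
```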



Added: I was asked to address the optimality of the constants in the inequality. The estimate $\omega_e(A) \leq \Vert A \Vert$ is sharp, as we can take $A=(\mathrm{Id}, \dots, \mathrm{Id})$ and for this choice we get equality.

The other estimate is more difficult. I only managed to prove optimality under the additional assumption $\dim_\mathbb{C}(E)\geq n+1$. In case someone has some insight for lower dimensions, please let me know.



If we assume $\dim_\mathbb{C}(E)\geq n+1$, then it suffices to consider the case $E=\mathbb{C}^{n+1}$ with the standard scalar product (otherwise choose a subspace of dimension $n+1$ and identify it with $\mathbb{C}^{n+1}$). We choose $A_k\colon \mathbb{C}^{n+1} \rightarrow \mathbb{C}^{n+1}$ such that
$$ A_k (x_1, \dots, x_{n+1})= (0, \dots,0 , x_1, 0, \dots, 0), $$
where $x_1$ sits in the $k$th slot. The corresponding matrix consists of all zeros and a one in the $k$th row of the first column; e.g. for $n=2$, $k=2$ (so that the matrix is $(n+1)\times(n+1) = 3\times 3$)
$$ A_2 = \begin{pmatrix}
0 & 0 & 0 \\
1 & 0 & 0 \\
0 & 0 & 0
\end{pmatrix}.$$

Now we set
$$ A=(A_2, \dots, A_{n+1} ).$$
We have
$$ \Vert A_k x \Vert^2 = \langle (0, \dots, x_1, \dots, 0),(0, \dots, x_1, \dots, 0)\rangle = \vert x_1 \vert^2. $$
Thus, we get
$$ \Vert A \Vert = \sup_{\Vert x \Vert = 1} \sqrt{n}\, \vert x_1 \vert = \sqrt{n}. $$
On the other hand, we have
$$ \vert \langle A_k x, x \rangle \vert^2 = \vert \langle (0, \dots, x_1, \dots, 0), (x_1, \dots, x_{n+1}) \rangle \vert^2 = \vert x_1 \vert^2 \cdot \vert x_k \vert^2, $$
and hence, for $\Vert x \Vert=1$,
$$ \left(\sum_{k=2}^{n+1} \vert \langle A_k x, x \rangle \vert^2\right)^\frac{1}{2}
= \left(\vert x_1 \vert^2 \cdot \sum_{k=2}^{n+1} \vert x_k \vert^2 \right)^\frac{1}{2}
= \left(\vert x_1 \vert^2 \cdot (1- \vert x_1 \vert^2)\right)^\frac{1}{2}
= \vert x_1 \vert \cdot \sqrt{1- \vert x_1 \vert^2}.$$

Hence, we get
$$ \omega_e(A) = \sup_{\Vert x \Vert=1} \vert x_1 \vert \cdot \sqrt{1- \vert x_1 \vert^2}
= \sup_{\vert x_1 \vert\leq 1} \vert x_1 \vert \cdot \sqrt{1- \vert x_1 \vert^2}
= \frac{1}{2}. $$

Thus, we finally get
$$ \frac{1}{2\sqrt{n}} \Vert A \Vert = \frac{1}{2\sqrt{n}} \cdot \sqrt{n} = \frac{1}{2} = \omega_e(A). $$
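As a sanity check (not part of the original answer), this construction can be verified numerically; $n=4$ is an arbitrary choice. Here $A_k = e_k e_1^T$, and $\omega_e(A)$ is evaluated at the maximizer $x$ with $x_1 = x_2 = 1/\sqrt{2}$ identified in the closed-form computation above:

```python
import numpy as np

n = 4            # number of operators; the space is C^{n+1}
d = n + 1
# A_k maps (x_1, ..., x_{n+1}) to x_1 * e_k, i.e. A_k = e_k e_1^T, for k = 2, ..., n+1
A = [np.outer(np.eye(d)[k], np.eye(d)[0]) for k in range(1, d)]

# ||A|| = || sum_k A_k^* A_k ||^{1/2} = || n * e_1 e_1^T ||^{1/2} = sqrt(n)
S = sum(Ak.T @ Ak for Ak in A)
norm_A = np.sqrt(np.linalg.norm(S, 2))
assert abs(norm_A - np.sqrt(n)) < 1e-12

# the value |x_1| sqrt(1 - |x_1|^2) = 1/2 is attained at x_1 = x_2 = 1/sqrt(2)
x = np.zeros(d)
x[0] = x[1] = 1 / np.sqrt(2)
val = np.sqrt(sum(abs(np.vdot(x, Ak @ x)) ** 2 for Ak in A))
assert abs(val - 0.5) < 1e-12

print(norm_A, val)  # sqrt(n) and 1/2, so omega_e(A) = ||A|| / (2 sqrt(n)) is attained
```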







edited Jan 12 at 13:10

answered Dec 19 '17 at 9:10

Severin Schraven
• @Student I added some details. Is it fine or should I elaborate a bit more? – Severin Schraven Dec 20 '17 at 11:06

• @Student Thank you for pointing out my mistakes. I'll correct them as soon as possible. – Severin Schraven Dec 21 '17 at 11:38

• Dear Professor Severin. I think that since the inequalities $$ \Vert y \Vert_{\max} \leq \Vert y \Vert_2 \leq \sqrt{n}\, \Vert y \Vert_{\max}$$ are sharp, i.e. $1,\sqrt{n}$ are the optimal constants, then the inequalities $$\frac{1}{2\sqrt{n}}\|\mathbf{A}\|\leq \omega_e(\mathbf{A})\leq \|\mathbf{A}\|$$ are sharp (the constants $\frac{1}{2\sqrt{n}}$ and $1$ are the best). Do you agree with me? Thanks a lot. – Student Jan 10 at 10:15

• Dear Severin, I would like to thank you for your effort to help me. For a while I have been trying to find an example. According to ([arXiv][1], page 25), we can take $A_k=\frac{1}{\sqrt{n}}T$ for all $k\in \{1,\cdots,n\}$ with $$T=\begin{pmatrix}0&0\\1&0\end{pmatrix}.$$ [1]: arxiv.org/pdf/math/0410492.pdf – Student Jan 11 at 6:05

• We then obtain $$w(A_1,\cdots,A_n)=\sqrt{n}\,w\!\left(\frac{1}{\sqrt{n}}T\right)=w(T)=\frac{1}{2},$$ however, $$\frac{1}{2\sqrt{n}}\left\|\sum_{k=1}^nA_k^*A_k \right\|^{1/2}=\frac{1}{2\sqrt{n}}\sqrt{n}\left\|\frac{1}{\sqrt{n}}T\right\|=\frac{1}{2\sqrt{n}}.$$ – Student Jan 11 at 6:06