Derivative of cross product
Let $f:V_1\times\dots\times V_n \to W$ be multilinear. Then $f$ is differentiable and
$$df(a_1,\dots,a_n)(h_1+\dots+h_n)=f(h_1,a_2,\dots,a_n)+f(a_1,h_2,a_3,\dots,a_n)+\dots+f(a_1,\dots,a_{n-1},h_n)\tag{1}$$
Problem: Calculate the derivative of the cross product $\mathbb R^3\times\mathbb R^3 \to \mathbb R^3$.
Solution: Let $f$ be the cross product. It is easy to show by direct calculation that $f$ is multilinear, so we can use (1). We calculate the derivative at $(u,v)=((u_1,u_2,u_3),(v_1,v_2,v_3))$ and obtain
$$df(u,v)(h_1+h_2)=h_1\times v + u\times h_2\tag{2}$$
Let $e_i$, $i=1,2,3$, be the unit vectors in $\mathbb R^3$. We calculate
$$e_1\times v=\begin{pmatrix}0\\ -v_3 \\ v_2\end{pmatrix},\qquad e_2\times v=\begin{pmatrix}v_3\\ 0 \\ -v_1\end{pmatrix},\qquad e_3\times v=\begin{pmatrix}-v_2 \\ v_1 \\ 0\end{pmatrix}$$
For $u$ we use the identity $u\times e_i=-e_i\times u$. So we get the following derivative matrix:
$$df(u,v)=\begin{pmatrix}0&v_3&-v_2&0&-u_3&u_2 \\ -v_3 & 0 & v_1 & u_3 & 0 & -u_1 \\ v_2 & -v_1 & 0 & -u_2 & u_1 & 0\end{pmatrix}$$
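As a sanity check (my addition, not part of the original post), this $3\times 6$ matrix can be compared against a central finite-difference approximation of the derivative at a sample point:

```python
import numpy as np

def cross(x):
    # f(u, v) = u x v, with x = (u, v) stacked into a single 6-vector
    return np.cross(x[:3], x[3:])

def jacobian_analytic(u, v):
    # The 3x6 matrix derived above
    u1, u2, u3 = u
    v1, v2, v3 = v
    return np.array([
        [0.0,  v3, -v2, 0.0, -u3,  u2],
        [-v3, 0.0,  v1,  u3, 0.0, -u1],
        [ v2, -v1, 0.0, -u2,  u1, 0.0],
    ])

u = np.array([1.0, 2.0, 3.0])
v = np.array([-1.0, 0.5, 2.0])
x0 = np.concatenate([u, v])

# Central finite differences, one column of the Jacobian at a time
eps = 1e-6
J_num = np.zeros((3, 6))
for j in range(6):
    e = np.zeros(6)
    e[j] = eps
    J_num[:, j] = (cross(x0 + e) - cross(x0 - e)) / (2 * eps)

assert np.allclose(J_num, jacobian_analytic(u, v), atol=1e-8)
```

Since the cross product is bilinear, the central difference is exact up to floating-point roundoff.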
Question:
1. Apparently, the $+$ means writing the result of $f$ as a column, so it is not the $+$ from, e.g., $\mathbb R^3$. Should I be able to see that from the definition above, without the example? How should I know the dimension of the image of $df$ without it being specified somehow?
2. Which $+$ is used in $df(a_1,\dots,a_n)(h_1+h_2)$? What do $h_1$, $h_2$, and $h_1+h_2$ look like?
derivatives multilinear-algebra cross-product
If $f$ is $p$-linear, then $f(x_1 + h_1, \ldots, x_p + h_p) = f(x_1, \ldots, x_p) + \sum_i f(x_1, \ldots, h_i, \ldots, x_p) + E(h_1, \ldots, h_p),$ where $E$ is the sum of all terms in which $f$ is evaluated at at least two different $h_i$. It follows that $E(h_1, \ldots, h_p) = o(\|h\|)$, and hence $f'(x_1, \ldots, x_p)(h_1, \ldots, h_p) = \sum_i f(x_1, \ldots, h_i, \ldots, x_p).$ This proof does not require any sort of finite-dimensionality. Q.E.D.
– Will M.
Jan 27 at 1:17
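For the cross product ($p=2$), the remainder in this expansion is exactly $E(h_1,h_2)=h_1\times h_2$, which is quadratic in $h$ and hence $o(\|h\|)$. A small numerical illustration (my addition, not from the comment itself):

```python
import numpy as np

rng = np.random.default_rng(0)
u, v = rng.standard_normal(3), rng.standard_normal(3)
h1, h2 = rng.standard_normal(3), rng.standard_normal(3)

# Exact expansion of the bilinear map f(u, v) = u x v:
# f(u + h1, v + h2) = f(u, v) + h1 x v + u x h2 + h1 x h2
lhs = np.cross(u + h1, v + h2)
rhs = np.cross(u, v) + np.cross(h1, v) + np.cross(u, h2) + np.cross(h1, h2)
assert np.allclose(lhs, rhs)

# The remainder E(t*h1, t*h2) = t^2 * (h1 x h2) scales quadratically in t,
# so dividing by t still tends to 0: the o(|h|) property.
for t in [1e-1, 1e-2, 1e-3]:
    E = np.cross(t * h1, t * h2)
    print(t, np.linalg.norm(E) / t)   # decreases linearly in t
```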
asked Jan 17 at 12:58 by xotix, edited Jan 17 at 13:09
1 Answer
For a multilinear mapping, it suffices to consider its Fréchet derivative. Let $W$ be an $n$-dimensional vector space, and each $V_i$ be an $m_i$-dimensional vector space, $i=1,2,\dots,N$. Let $f:V_1\times V_2\times\cdots\times V_N\to W$ be multilinear. Then for all $\left(v_1,v_2,\dots,v_N\right)\in V_1\times V_2\times\cdots\times V_N$, the Fréchet derivative of $f$ at this point, denoted by $({\rm d}f)(v_1,v_2,\dots,v_N)$, is a linear mapping
$$
({\rm d}f)(v_1,v_2,\dots,v_N):V_1\times V_2\times\cdots\times V_N\to W.
$$
By multilinearity, it follows that
\begin{align}
&({\rm d}f)(v_1,v_2,\dots,v_N)(h_1,h_2,\dots,h_N)\\
&=f(h_1,v_2,\dots,v_N)\\
&+f(v_1,h_2,\dots,v_N)\\
&+\cdots\\
&+f(v_1,v_2,\dots,h_N).
\end{align}
Recall that, if $g$ is linear, its entry-wise form reads
$$
g_i(v)=\sum_j a_{ij}v_j,
$$
and if $g$ is bilinear, its entry-wise form reads
$$
g_i(v_1,v_2)=\sum_{j_1,j_2}a_{ij_1j_2}v_{1j_1}v_{2j_2}.
$$
Inductively, the multilinear $f$ above admits the entry-wise form
$$
f_i(v_1,v_2,\dots,v_N)=\sum_{j_1=1}^{m_1}\sum_{j_2=1}^{m_2}\cdots\sum_{j_N=1}^{m_N}a_{ij_1j_2\dots j_N}v_{1j_1}v_{2j_2}\dots v_{Nj_N}
$$
for $i=1,2,\dots,n$, where each $v_{kj_k}$ denotes the $j_k$-th entry of $v_k\in V_k$, and the $a_{ij_1j_2\dots j_N}$ are the coefficients of $f$.
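For instance (my addition, specializing to the bilinear case $N=2$), the cross product's coefficients $a_{ij_1j_2}$ are the Levi-Civita symbols $\varepsilon_{ij_1j_2}$, and the entry-wise sum is a direct `einsum` contraction:

```python
import numpy as np

# Levi-Civita tensor: eps[i, j, k] = epsilon_{ijk}
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

u = np.array([1.0, 2.0, 3.0])
v = np.array([-1.0, 0.5, 2.0])

# f_i(u, v) = sum_{j,k} a_{ijk} u_j v_k  -- the entry-wise bilinear form
f = np.einsum('ijk,j,k->i', eps, u, v)
assert np.allclose(f, np.cross(u, v))
```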
Thanks to this entry-wise form, we may write down the entry-wise form of ${\rm d}f$ as well:
\begin{align}
&({\rm d}f)_i(v_1,v_2,\dots,v_N)(h_1,h_2,\dots,h_N)\\
&=\sum_{j_1=1}^{m_1}\sum_{j_2=1}^{m_2}\cdots\sum_{j_N=1}^{m_N}a_{ij_1j_2\dots j_N}h_{1j_1}v_{2j_2}\dots v_{Nj_N}\\
&+\sum_{j_1=1}^{m_1}\sum_{j_2=1}^{m_2}\cdots\sum_{j_N=1}^{m_N}a_{ij_1j_2\dots j_N}v_{1j_1}h_{2j_2}\dots v_{Nj_N}\\
&+\cdots\\
&+\sum_{j_1=1}^{m_1}\sum_{j_2=1}^{m_2}\cdots\sum_{j_N=1}^{m_N}a_{ij_1j_2\dots j_N}v_{1j_1}v_{2j_2}\dots h_{Nj_N}.
\end{align}
In other words, once the $a_{ij_1j_2\dots j_N}$ are known, the entry-wise form of ${\rm d}f$ can be written down directly as above.
Finally, the "$+$" in the original post, i.e., $(h_1+h_2+\cdots+h_N)$, is a notational convention in some contexts; it means exactly $(h_1,h_2,\dots,h_N)$ here. When there is no ambiguity, either expression can be used according to one's preference.
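As a concrete check (my addition; again using the cross product as the $N=2$ case with coefficients $a_{ijk}=\varepsilon_{ijk}$), the two-term entry-wise sum above reproduces the $h_1\times v+u\times h_2$ formula from the question:

```python
import numpy as np

# Levi-Civita coefficients a_{ijk} of the cross product
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

u = np.array([1.0, 2.0, 3.0])
v = np.array([-1.0, 0.5, 2.0])
h1 = np.array([0.3, -0.7, 0.1])
h2 = np.array([0.9, 0.2, -0.4])

# (df)_i(u, v)(h1, h2) = sum_{j,k} a_{ijk} h1_j v_k + sum_{j,k} a_{ijk} u_j h2_k
df = np.einsum('ijk,j,k->i', eps, h1, v) + np.einsum('ijk,j,k->i', eps, u, h2)
assert np.allclose(df, np.cross(h1, v) + np.cross(u, h2))
```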
answered Jan 20 at 7:53 by hypernova