Matrix calculus for statistics
I want to calculate the variance of a sum of random variables, $X_1+X_2$.



Doing statistics, I have to learn maths starting from the end, and it is quite difficult, yet very interesting (please bear in mind that I have only very basic maths skills).



For now I am doing this calculation manually with the formula $\mathrm{var}(X_1+X_2) = \mathrm{var}(X_1) + \mathrm{var}(X_2) + 2\,\mathrm{cov}(X_1,X_2)$.



But I am now facing much larger sums (some with minus signs), and being able to do this with matrix calculations would save me a lot of time (and would be very satisfying too).



I searched many resources on matrix calculus but couldn't find anything usable at my level.



How can I do this calculation from the variance-covariance matrix
$$
\begin{pmatrix}
\mathrm{var}(X_1) & \mathrm{cov}(X_1,X_2) \\
\mathrm{cov}(X_1,X_2) & \mathrm{var}(X_2)
\end{pmatrix}
$$

[preferably extended to subtractions and $n$ terms, like $X_1+X_2-X_3$]?



NB: this is not a statistics question and doesn't belong on stats.stackexchange; I want to understand the thought process of turning a scalar calculation into a matrix one.
Tags: matrices, descriptive-statistics
asked Nov 22 at 12:31 by Dan Chaltiel, edited Nov 22 at 15:10 by Jean-Claude Arbaut
2 Answers
Accepted answer (3 votes), answered Nov 22 at 13:00 by Jean-Claude Arbaut, edited Nov 22 at 13:25
The variance-covariance matrix of $X$ is $\frac1n(X-\bar X)^T(X-\bar X)$.

Now, you want to compute the variance of the vector $u=X\beta$ for some vector $\beta$. This variance is

$$Var(u)=\frac1n(u-\bar u)^T(u-\bar u)=\frac1n(X\beta-\bar X\beta)^T(X\beta-\bar X\beta)\\
=\frac1n\beta^T(X-\bar X)^T(X-\bar X)\beta=\beta^T Var(X)\beta$$
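This identity is easy to check numerically. A minimal NumPy sketch (the data matrix `X` and the weight vector `beta` below are made-up examples; `beta = (1, 1, -1)` encodes $X_1+X_2-X_3$):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))       # n observations of 3 variables X1, X2, X3
beta = np.array([1.0, 1.0, -1.0])    # weights for X1 + X2 - X3

# Sample variance-covariance matrix, 1/n convention as in the answer
Xc = X - X.mean(axis=0)
S = Xc.T @ Xc / len(X)               # same as np.cov(X.T, bias=True)

# Variance of the combination, two ways
var_matrix = beta @ S @ beta         # beta^T Var(X) beta
var_direct = (X @ beta).var()        # population variance of u = X beta

print(np.isclose(var_matrix, var_direct))  # True
```

Note that `.var()` uses the $1/n$ (population) convention by default, matching the $\frac1n$ in the formula; with the $1/(n-1)$ convention both sides simply scale by the same factor, so the identity still holds.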
          • +1 Elegant solution
            – caverac
            Nov 22 at 13:12










          • Great answer, thanks! Learned a lot with this!
            – Dan Chaltiel
            Nov 22 at 13:28


















Answer (2 votes), answered Nov 22 at 12:56 by caverac
The key point here is that

$$
\mathbb{V}{\rm ar}[X] = \mathbb{C}{\rm ov}[X, X]
$$

so that you can express your first expression as

\begin{eqnarray}
\mathbb{V}{\rm ar}[a_1 X_1 + a_2 X_2] &=& a_1^2\mathbb{V}{\rm ar}[X_1] + a_2^2\mathbb{V}{\rm ar}[X_2] + 2 a_1a_2 \mathbb{C}{\rm ov}[X_1, X_2] \\
&=& a_1^2 \mathbb{C}{\rm ov}[X_1, X_1] + a_2^2 \mathbb{C}{\rm ov}[X_2, X_2] + 2a_1a_2 \mathbb{C}{\rm ov}[X_1, X_2] \\
&=& a_1^2 \mathbb{C}{\rm ov}[X_1, X_1] + a_2^2 \mathbb{C}{\rm ov}[X_2, X_2] + a_1a_2 \mathbb{C}{\rm ov}[X_1, X_2] + a_2a_1 \mathbb{C}{\rm ov}[X_2, X_1] \\
&=& \sum_{i=1}^2 \sum_{j=1}^2 a_i a_j \mathbb{C}{\rm ov}[X_i, X_j]
\end{eqnarray}

In general

\begin{eqnarray}
\mathbb{V}{\rm ar}[a_1 X_1 + \cdots + a_n X_n] &=& \sum_{i=1}^n \sum_{j=1}^n a_i a_j \mathbb{C}{\rm ov}[X_i, X_j]
\end{eqnarray}
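This double sum is exactly the quadratic form $a^T \Sigma a$, where $\Sigma$ is the variance-covariance matrix; a subtraction is handled by putting $-1$ in $a$. A small sketch comparing the two forms (the covariance matrix `Sigma` below is a made-up example):

```python
import numpy as np

# Hypothetical variance-covariance matrix of (X1, X2, X3)
Sigma = np.array([[4.0, 1.0, 0.5],
                  [1.0, 9.0, 2.0],
                  [0.5, 2.0, 1.0]])
a = np.array([1.0, 1.0, -1.0])   # coefficients for X1 + X2 - X3

# Double sum: sum_i sum_j a_i a_j Cov[X_i, X_j]
double_sum = sum(a[i] * a[j] * Sigma[i, j]
                 for i in range(3) for j in range(3))

# Quadratic form: a^T Sigma a
quad_form = a @ Sigma @ a

print(double_sum, quad_form)     # both 11.0
```

Here the variances contribute $4+9+1=14$ and the covariance terms contribute $2(1 - 0.5 - 2) = -3$, giving $11$ either way.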
          • 1




            +1, but I think the OP wanted to see how this is done with matrices, that is how to write it as $a^TVar(X)a$.
            – Jean-Claude Arbaut
            Nov 22 at 12:57










