What precisely does it mean to borrow information?

I often hear people talk about information borrowing or information sharing in Bayesian hierarchical models, but I can't seem to get a straight answer about what this actually means or whether it is unique to Bayesian hierarchical models. I sort of get the idea: some levels in your hierarchy share a common parameter. I have no idea how this translates to "information borrowing", though.




  1. Is "information borrowing"/"information sharing" just a buzzword people like to throw around?


  2. Is there an example with closed-form posteriors that illustrates this sharing phenomenon?


  3. Is this unique to a Bayesian analysis? Generally, when I see examples of "information borrowing", they are just mixed models. Maybe I learned these models in an old-fashioned way, but I don't see any sharing.



I am not interested in starting a philosophical debate about methods. I am just curious about the use of this term.

machine-learning bayesian multilevel-analysis terminology hierarchical-bayesian






asked 5 hours ago
EliK

  • For your question 2., you may find this link illuminating: tjmahr.com/plotting-partial-pooling-in-mixed-effects-models.
    – Isabella Ghement
    3 hours ago


















2 Answers

Consider a simple problem like estimating the means of multiple groups. If your model treats them as completely unrelated, then the only information you have about each mean is the information within that group. If your model treats the means as somewhat related (as in a mixed-effects model), then the estimates will be more precise, because information from the other groups informs (regularizes, shrinks toward a common mean) the estimate for a given group. That's an example of 'borrowing information'.
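
As a concrete version of this (and of question 2 above), consider the conjugate normal-normal model: theta_j ~ N(mu, tau^2) for each group mean, and ybar_j | theta_j ~ N(theta_j, sigma^2 / n_j) for the observed group sample means. The posterior mean of theta_j is then available in closed form as a precision-weighted average of the group's own sample mean and the shared prior mean mu. The Python sketch below assumes mu, tau and sigma are known, purely to keep everything in closed form; all names and numbers are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)

    # Assumed model (for illustration only):
    #   theta_j ~ N(mu, tau^2),  ybar_j | theta_j ~ N(theta_j, sigma^2 / n_j)
    mu, tau, sigma = 0.0, 1.0, 2.0   # hyperparameters, taken as known here
    n = np.array([5, 5, 50])         # per-group sample sizes
    theta = rng.normal(mu, tau, size=n.size)      # true (unobserved) group means
    ybar = rng.normal(theta, sigma / np.sqrt(n))  # observed group sample means

    # Closed-form posterior mean: a precision-weighted average of each
    # group's own data (ybar_j) and the shared prior mean (mu).
    prec_data = n / sigma**2          # precision contributed by the group's data
    prec_prior = 1.0 / tau**2         # precision contributed by the shared prior
    post_mean = (prec_data * ybar + prec_prior * mu) / (prec_data + prec_prior)

    # The small groups are shrunk harder toward mu: they "borrow" more
    # information from the ensemble than the well-sampled third group.
    for nj, y, pm in zip(n, ybar, post_mean):
        print(f"n={nj:3d}  sample mean={y:+.2f}  posterior mean={pm:+.2f}")

Note how the two small groups are pulled much more strongly toward mu than the group with n = 50: the less data a group has of its own, the more it borrows from the ensemble.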






answered 4 hours ago, edited 39 mins ago
Glen_b

This is a term that comes specifically from empirical Bayes (EB); in fact, the concept it refers to does not exist in true Bayesian inference. The original term was "borrowing strength", coined by John Tukey back in the 1960s and popularized further by Bradley Efron and Carl Morris in a series of statistical articles on Stein's paradox and parametric EB in the 1970s and 1980s. Many people now use "information borrowing" or "information sharing" as synonyms for the same concept. The reason you may hear it in the context of mixed models is that mixed models have an EB interpretation.



EB has many applications and applies to many statistical models, but the context is always that you have a large number of (possibly independent) cases and you are trying to estimate a particular parameter (such as the mean or variance) in each case. In Bayesian inference, you make posterior inferences about the parameter based on both the observed data for each case and the prior distribution for that parameter. In EB inference, the prior distribution for the parameter is instead estimated from the whole collection of data cases, after which inference proceeds as for Bayesian inference. Hence, when you estimate the parameter for a particular case, you use both the data for that case and the estimated prior distribution, and the latter represents the "information" or "strength" that you borrow from the whole ensemble of cases when making inference about one particular case.
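
That two-step recipe can be written down in a few lines. The Python sketch below is an illustrative empirical-Bayes calculation, not code from any particular package: it assumes equal per-case sample sizes, a known sampling standard deviation sigma, and simple moment estimators for the prior, and every name in it is hypothetical.

    import numpy as np

    rng = np.random.default_rng(1)

    # Illustrative setup: J cases, each summarized by a sample mean ybar_j
    # with known sampling sd sigma/sqrt(n). Equal n per case for simplicity.
    J, n, sigma = 200, 4, 1.0
    theta = rng.normal(0.5, 0.3, size=J)          # unknown per-case means
    ybar = rng.normal(theta, sigma / np.sqrt(n))  # one observed summary per case

    # Step 1 (the EB step): estimate the prior N(mu, tau^2) from ALL cases.
    mu_hat = ybar.mean()
    tau2_hat = max(ybar.var(ddof=1) - sigma**2 / n, 0.0)  # excess over sampling noise

    # Step 2: proceed as in ordinary Bayes, using the estimated prior.
    w = (sigma**2 / n) / (sigma**2 / n + tau2_hat)  # shrinkage weight toward mu_hat
    eb = (1 - w) * ybar + w * mu_hat

    # Each eb_j uses its own data plus a little information from every other
    # case (through mu_hat and tau2_hat): the "borrowed strength".
    print(f"MSE of raw per-case means: {np.mean((ybar - theta) ** 2):.4f}")
    print(f"MSE of EB shrunken means:  {np.mean((eb - theta) ** 2):.4f}")

Running this typically shows a smaller mean squared error for the shrunken estimates than for the raw per-case means, which is exactly the improvement that the Stein and EB literature describes as borrowed strength.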



Now you can see why EB has "borrowing" but true Bayes does not. In true Bayes, the prior distribution already exists and so doesn't need to be begged or borrowed. In EB, the prior distribution has to be created from the observed data itself. When we make inference about a particular case, we use all the observed information from that case and a little bit of information from each of the other cases. We say it is only "borrowed" because the information is given back when we move on to make inference about the next case.



The idea of "information borrowing" is used heavily in statistical genomics, where each "case" is usually a gene or a genomic feature (Smyth, 2004; Phipson et al, 2016).



References

Efron, B., and Morris, C. (1977). Stein's paradox in statistics. Scientific American 236(5), 119-127. http://statweb.stanford.edu/~ckirby/brad/other/Article1977.pdf

Smyth, G. K. (2004). Linear models and empirical Bayes methods for assessing differential expression in microarray experiments. Statistical Applications in Genetics and Molecular Biology 3(1), Article 3. http://www.statsci.org/smyth/pubs/ebayes.pdf

Phipson, B., Lee, S., Majewski, I. J., Alexander, W. S., and Smyth, G. K. (2016). Robust hyperparameter estimation protects against hypervariable genes and improves power to detect differential expression. Annals of Applied Statistics 10, 946-963. http://dx.doi.org/10.1214/16-AOAS920






answered 1 hour ago, edited 33 mins ago
Gordon Smyth

  • I don't think this interpretation is correct. For example, mixed-effects models borrow information, yet can be analyzed in a traditional Bayesian context.
    – Cliff AB
    30 mins ago










  • @CliffAB If you dig into mixed model analyses, you will find that the analysis is virtually always empirical Bayes rather than true Bayes. Most authors will of course say they are doing Bayes when it is actually EB, because most authors don't make the distinction. If you think you can give an example of a true Bayes mixed model analysis, then I invite you to do so.
    – Gordon Smyth
    13 mins ago












