Access buckets across projects in GCP using Hive
I have two projects in my GCP account, and both of them have buckets. In one of the projects I have a Dataproc cluster on which I am running Hive, and from that Hive I want to access the buckets of the other project. I have tried granting ACL permissions on my bucket, but I still get an error when I execute a CREATE TABLE command from Hive:

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: java.io.IOException Error accessing: bucket: bucketname, object: folder/filename.

How can I access my bucket using Hive?
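For reference, the failing statement is of this general shape; the table name, columns, bucket, and path below are placeholders rather than the originals:

    -- Hypothetical external table over data in the other project's bucket;
    -- 'gs://other-project-bucket/folder/' stands in for the real location.
    CREATE EXTERNAL TABLE my_table (
      id INT,
      name STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION 'gs://other-project-bucket/folder/';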
hadoop hive google-cloud-platform google-cloud-storage

asked Nov 23 '18 at 9:03 by Sneha K

  • Have you taken a look at the documentation? It may have what you're looking for: cloud.google.com/dataproc/docs/concepts/connectors/…
    – Maxim
    Nov 25 '18 at 10:07

  • Does the Dataproc service account name@[YOUR_PROJECT_ID].iam.gserviceaccount.com have the right permissions on that bucket?
    – MonicaPC
    Nov 28 '18 at 0:41

  • @MonicaPC I had to grant the appropriate permissions on the bucket to my service account.
    – Sneha K
    Nov 30 '18 at 5:19

  • @Sneha K I'm glad it worked. Could you post the answer to your question, for the benefit of the community?
    – Maxim
    Nov 30 '18 at 5:40
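MonicaPC's comment points at the decisive check. One way to see which service account a Dataproc cluster runs as is the gcloud command below; the cluster name and region are placeholders to adapt:

    # Hypothetical cluster name and region; substitute your own.
    # An empty result means the cluster uses the default Compute Engine
    # service account, [PROJECT_NUMBER]-compute@developer.gserviceaccount.com.
    gcloud dataproc clusters describe my-cluster --region=us-central1 \
        --format='value(config.gceClusterConfig.serviceAccount)'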
1 Answer
As suggested, I used the Google Cloud Storage connector, which comes pre-installed on the Dataproc cluster:

https://cloud.google.com/dataproc/docs/concepts/connectors/install-storage-connector

The steps there are precise, but in addition to that I had to grant the appropriate IAM roles on the bucket to my service account:

https://cloud.google.com/storage/docs/access-control/iam-roles

It then worked.
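A sketch of that grant, assuming the service-account placeholder from the comments and the storage.objectAdmin role; the account, role, and bucket name are illustrative, not from the original answer:

    # Grant the cluster's service account access to the other project's bucket.
    # Use roles/storage.objectViewer instead if read-only access is enough.
    gsutil iam ch \
        serviceAccount:name@[YOUR_PROJECT_ID].iam.gserviceaccount.com:roles/storage.objectAdmin \
        gs://other-project-bucket

    # Sanity check from a cluster node: list the bucket through the connector.
    hadoop fs -ls gs://other-project-bucket/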

answered Nov 30 '18 at 6:01 by Sneha K