How to integrate the REST API Source Connector with Kafka Connect?

I have Confluent 5.0 on my local machine and am trying to read data from a REST API using the REST API Source Connector, which is not part of Confluent Platform. Until now I have only used Confluent's built-in connectors. The REST API Source Connector is open source and available on GitHub: https://github.com/llofberg/kafka-connect-rest



I have downloaded this connector from GitHub and got stuck at this point.



Can anybody tell me how to integrate this connector with Confluent, or how I can use it to pull data from a REST API?

apache-kafka apache-kafka-connect

asked Nov 23 '18 at 9:22 by DP0808, edited Nov 23 '18 at 23:52 by cricket_007

  • Can you clarify "got stuck"? That's about the same as saying "it's not working" idownvotedbecau.se/itsnotworking
    – cricket_007
    Nov 24 '18 at 1:02

1 Answer

Disclaimer: There is no single way to add an external Kafka Connect plugin. Confluent provides the Kafka Connect Maven plugin, but that doesn't mean everyone uses it, or even Maven, to package their code.



If it is not on the Confluent Hub, then you'll have to build it by hand.
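
(For connectors that are published on the Hub, the usual route is the confluent-hub CLI that ships with Confluent Platform, in the general form below; this particular connector is not available there, which is why the manual steps follow.)

confluent-hub install <owner>/<component-name>:<version>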





1) Clone the repo, and build it (install Git and Maven first)



git clone https://github.com/llofberg/kafka-connect-rest && cd kafka-connect-rest
mvn clean package


2) Create a directory for it on all Connect workers, alongside the other connectors that ship with Confluent Platform



mkdir $CONFLUENT_HOME/share/java/kafka-connect-rest


3) Find each of the shaded JARs (this connector happens to produce several JARs; I don't know why)



find . -iname "*shaded.jar" -type f

./kafka-connect-transform-from-json/kafka-connect-transform-from-json-plugin/target/kafka-connect-transform-from-json-plugin-1.0-SNAPSHOT-shaded.jar
./kafka-connect-transform-add-headers/target/kafka-connect-transform-add-headers-1.0-SNAPSHOT-shaded.jar
./kafka-connect-transform-velocity-eval/target/kafka-connect-transform-velocity-eval-1.0-SNAPSHOT-shaded.jar
./kafka-connect-rest-plugin/target/kafka-connect-rest-plugin-1.0-SNAPSHOT-shaded.jar


4) Copy each of these files into the $CONFLUENT_HOME/share/java/kafka-connect-rest folder created in step 2, on each Connect worker
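
For example, steps 3 and 4 can be combined into one command, shown here as a sketch that assumes you are still inside the cloned repo and that $CONFLUENT_HOME points at your Confluent 5.0 install:

# copy every shaded JAR from the build into the plugin directory created in step 2
find . -iname "*shaded.jar" -type f -exec cp {} $CONFLUENT_HOME/share/java/kafka-connect-rest/ \;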



5) Make sure the plugin.path in your connect-*.properties file points at the full path of $CONFLUENT_HOME/share/java
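
For example, the worker config (assumed here to be etc/kafka/connect-distributed.properties; use whichever properties file you actually start Connect with) would carry a line along these lines:

plugin.path=/full/path/to/confluent-5.0.0/share/java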



At this point, you've done all the steps listed in the README to build the connector and set up the plugin path, just not in Docker.



6) Start Connect (Distributed)
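
For example, a sketch of starting a distributed worker (the script and properties file locations depend on how Confluent Platform was installed):

$CONFLUENT_HOME/bin/connect-distributed $CONFLUENT_HOME/etc/kafka/connect-distributed.properties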



7) Hit GET /connector-plugins to verify that the plugin loaded.
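
Note that this is an HTTP request against the Connect worker's REST API (port 8083 by default), not a shell command. For example:

curl -s http://localhost:8083/connector-plugins

The REST connector's class should appear in the JSON list that comes back.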



8) Configure the connector and send the JSON payload to POST /connectors
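
A skeleton of that request; the class name is taken from the connector's README and may differ in your build, and the connector-specific settings (source URL, destination topic, poll interval, etc.) are omitted here and must be filled in from the README/examples before the config will validate:

curl -s -X POST -H "Content-Type: application/json" http://localhost:8083/connectors -d '{
  "name": "rest-source",
  "config": {
    "connector.class": "com.tm.kafka.connect.rest.RestSourceConnector",
    "tasks.max": "1"
  }
}'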



I have not used this connector before, so I do not know how to configure it. Maybe see the examples, or follow along with @rmoff's blog post up to the KSQL part.

  • Thank you @cricket_007. I had already done all of this except GET /connector-plugins. I followed the same instructions and got stuck: I couldn't run the GET /connector-plugins or POST /connectors calls. Running ./bin/connect-standalone GET /connector-plugins gives an error like java.io.FileNotFoundException: GET (No such file or directory)
    – DP0808
    Nov 24 '18 at 16:36

  • You're not supposed to run that from the CLI, but from Postman, because it's a REST API. And I mentioned starting connect-distributed, not connect-standalone.
    – cricket_007
    Nov 24 '18 at 17:33

answered Nov 24 '18 at 0:21 by cricket_007, edited Nov 24 '18 at 0:35