ElasticSearch in a runnable jar: lucene problems

I am trying to create a fat executable jar with Maven that includes elasticsearch as a dependency, so I can create a TransportClient to a running elasticsearch node. From Eclipse the client connects to the node just fine, but when I package the whole project into a jar and run it with java -jar bla.jar, the connection fails with:



DEBUG - [Armor] adding address [{#transport#-1}{127.0.0.1}{127.0.0.1:9300}]
DEBUG - [Armor] connected to node [{#transport#-1}{127.0.0.1}{127.0.0.1:9300}]
INFO - [Armor] failed to get node info for {#transport#-1}{127.0.0.1}{127.0.0.1:9300}, disconnecting...
org.elasticsearch.transport.RemoteTransportException: [Failed to deserialize response of type [org.elasticsearch.action.admin.cluster.node.liveness.LivenessResponse]]
Caused by: org.elasticsearch.transport.TransportSerializationException: Failed to deserialize response of type [org.elasticsearch.action.admin.cluster.node.liveness.LivenessResponse]
at org.elasticsearch.transport.netty.MessageChannelHandler.handleResponse(MessageChannelHandler.java:179) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.elasticsearch.transport.netty.MessageChannelHandler.messageReceived(MessageChannelHandler.java:138) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:303) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:337) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_66-internal]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_66-internal]
at java.lang.Thread.run(Thread.java:745) ~[?:1.8.0_66-internal]
...
Caused by: java.lang.IllegalArgumentException: An SPI class of type org.apache.lucene.codecs.PostingsFormat with name 'Lucene50' does not exist. You need to add the corresponding JAR file supporting this SPI to your classpath. The current classpath supports the following names: [es090, completion090, XBloomFilter]
at org.apache.lucene.util.NamedSPILoader.lookup(NamedSPILoader.java:109) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.apache.lucene.codecs.PostingsFormat.forName(PostingsFormat.java:112) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.elasticsearch.common.lucene.Lucene.<clinit>(Lucene.java:68) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.elasticsearch.Version.fromId(Version.java:508) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.elasticsearch.Version.readVersion(Version.java:280) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.elasticsearch.cluster.node.DiscoveryNode.readFrom(DiscoveryNode.java:327) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.elasticsearch.cluster.node.DiscoveryNode.readNode(DiscoveryNode.java:310) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.elasticsearch.action.admin.cluster.node.liveness.LivenessResponse.readFrom(LivenessResponse.java:52) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.elasticsearch.transport.netty.MessageChannelHandler.handleResponse(MessageChannelHandler.java:177) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.elasticsearch.transport.netty.MessageChannelHandler.messageReceived(MessageChannelHandler.java:138) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:303) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:337) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) ~[cmd-0.0.1-SNAPSHOT.jar:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_66-internal]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_66-internal]
at java.lang.Thread.run(Thread.java:745) ~[?:1.8.0_66-internal]


The root cause is: An SPI class of type org.apache.lucene.codecs.PostingsFormat with name 'Lucene50' does not exist. You need to add the corresponding JAR file supporting this SPI to your classpath. The current classpath supports the following names: [es090, completion090, XBloomFilter]. I read up on this; it basically means that my final jar contains same-named service provider files from the various Lucene dependencies, which overwrite each other. I used the maven-shade-plugin with a manifest resource transformer to try to solve the problem, but the error remains:



<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.4.3</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <transformers>
              <transformer
                implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                <mainClass>de.test.cmd.Main</mainClass>
              </transformer>
            </transformers>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
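For context on the overwriting mentioned above: each of those service files (e.g. META-INF/services/org.apache.lucene.codecs.PostingsFormat) is just a plain text list of implementation class names. Assuming Lucene 5.x, the copy shipped in lucene-core registers, among others,

org.apache.lucene.codecs.lucene50.Lucene50PostingsFormat

while the copy in the elasticsearch jar apparently lists only elasticsearch's own formats, so after one file overwrites the other the SPI loader only knows the names es090, completion090 and XBloomFilter shown in the error message.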


Then I looked for elasticsearch jars without the Lucene dependency, since I only want to create a TransportClient for communication. No such luck: there is only the one jar, and it includes Lucene. What can I do to put elasticsearch into a runnable jar? Or should I keep all dependencies in a lib folder next to the jar? I don't know how to tell Maven to do that.



--



UPDATE: When I use "Copy required libraries into a sub-folder next to the generated jar" in the Eclipse export dialog, it works. I let Eclipse generate a build.xml from that, but I would still like to build the jar with Maven, either with the dependencies included or in an extra directory.
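For the "extra directory" variant, one approach that might work (a rough, untested sketch; the lib/ folder name is arbitrary and plugin versions are omitted) is to let the maven-dependency-plugin copy all dependencies next to the jar and have the maven-jar-plugin write a matching Class-Path into the manifest:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-dependencies</id>
      <phase>package</phase>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <!-- copy all runtime dependencies next to the built jar -->
        <outputDirectory>${project.build.directory}/lib</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jar-plugin</artifactId>
  <configuration>
    <archive>
      <manifest>
        <!-- reference the copied dependencies from the jar manifest -->
        <addClasspath>true</addClasspath>
        <classpathPrefix>lib/</classpathPrefix>
        <mainClass>de.test.cmd.Main</mainClass>
      </manifest>
    </archive>
  </configuration>
</plugin>

java -jar cmd-0.0.1-SNAPSHOT.jar should then resolve the libraries from lib/, and since every dependency keeps its own META-INF/services files, the Lucene SPI collision should not occur — which is essentially what the Eclipse export does.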


Tags: java maven elasticsearch jar lucene

asked Jan 28 '16 at 16:20 by Eike Cochu, edited Jan 28 '16 at 16:29

3 Answers

You should add the following transformer tag to the shade plugin:



          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
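Applied to the configuration from the question, the transformers section would then look roughly like this (the ServicesResourceTransformer merges the META-INF/services files from all dependencies instead of letting one copy overwrite the others):

<transformers>
  <transformer
    implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
    <mainClass>de.test.cmd.Main</mainClass>
  </transformer>
  <!-- merge service provider files instead of overwriting them -->
  <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
</transformers>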

answered Jul 28 '16 at 9:48 by Bing (3 votes)

I faced the same issue, and what I found works more cleanly in Maven is to include the lucene-core dependency explicitly, above the elasticsearch dependency, in the project's pom.xml:



<dependency>
  <groupId>org.apache.lucene</groupId>
  <artifactId>lucene-core</artifactId>
  <version>5.4.1</version>
</dependency>
<dependency>
  <groupId>org.elasticsearch</groupId>
  <artifactId>elasticsearch</artifactId>
  <version>2.2.0</version>
</dependency>
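If you go this route, the lucene-core version has to match the one your elasticsearch version was built against (this answer pairs 5.4.1 with elasticsearch 2.2.0). One way to check which version elasticsearch itself pulls in is, for example:

mvn dependency:tree -Dincludes=org.apache.lucene:lucene-core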

answered Jul 13 '16 at 14:07 by bizentass (1 vote)

• Thanks, that worked for me! – igx, Jan 25 '17 at 16:10

In my case, it was caused by the files

META-INF/services/{org.apache.lucene.codecs.Codec,
org.apache.lucene.codecs.PostingsFormat,
org.apache.lucene.codecs.DocValuesFormat}

being overwritten by the wrong library.

I suppose multiple libraries contain files with these same names, and during assembly the wrong one ends up winning.

Just open the correct jar file and copy those files into your resource folder. It works most of the time ;)
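In a Maven project that would mean extracting the three files from the correct lucene-core jar (the version matching your elasticsearch release) and placing them under the resource folder, e.g.:

src/main/resources/META-INF/services/org.apache.lucene.codecs.Codec
src/main/resources/META-INF/services/org.apache.lucene.codecs.PostingsFormat
src/main/resources/META-INF/services/org.apache.lucene.codecs.DocValuesFormat

Whether the project's copies actually win over the ones coming from the dependencies depends on how the fat jar is assembled, so treat this as a workaround; the ServicesResourceTransformer above merges the files instead of relying on overwrite order.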

answered Nov 22 at 22:41 by nemo (0 votes)