UnsatisfiedLinkError on RocksDB DLL when developing with Kafka Streams

I'm writing a Kafka Streams application on a Windows development machine. If I try to use the leftJoin and branch functions of Kafka Streams, I get the following error when executing the jar application:

    Exception in thread "StreamThread-1" java.lang.UnsatisfiedLinkError: C:\Users\user\AppData\Local\Temp\librocksdbjni325337723194862275.dll: Can't find dependent libraries
        at java.lang.ClassLoader$NativeLibrary.load(Native Method)
        at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1941)
        at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
        at java.lang.Runtime.load0(Runtime.java:809)
        at java.lang.System.load(System.java:1086)
        at org.rocksdb.NativeLibraryLoader.loadLibraryFromJar(NativeLibraryLoader.java:78)
        at org.rocksdb.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:56)
        at org.rocksdb.RocksDB.loadLibrary(RocksDB.java:64)
        at org.rocksdb.RocksDB.<clinit>(RocksDB.java:35)
        at org.rocksdb.Options.<clinit>(Options.java:22)
        at org.apache.kafka.streams.state.internals.RocksDBStore.openDB(RocksDBStore.java:115)
        at org.apache.kafka.streams.state.internals.Segment.openDB(Segment.java:38)
        at org.apache.kafka.streams.state.internals.Segments.getOrCreateSegment(Segments.java:75)
        at org.apache.kafka.streams.state.internals.RocksDBSegmentedBytesStore.put(RocksDBSegmentedBytesStore.java:72)
        at org.apache.kafka.streams.state.internals.ChangeLoggingSegmentedBytesStore.put(ChangeLoggingSegmentedBytesStore.java:54)
        at org.apache.kafka.streams.state.internals.MeteredSegmentedBytesStore.put(MeteredSegmentedBytesStore.java:101)
        at org.apache.kafka.streams.state.internals.RocksDBWindowStore.put(RocksDBWindowStore.java:109)
        at org.apache.kafka.streams.state.internals.RocksDBWindowStore.put(RocksDBWindowStore.java:101)
        at org.apache.kafka.streams.kstream.internals.KStreamJoinWindow$KStreamJoinWindowProcessor.process(KStreamJoinWindow.java:65)
        at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:48)
        at org.apache.kafka.streams.processor.internals.StreamsMetricsImpl.measureLatencyNs(StreamsMetricsImpl.java:188)
        at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:134)
        at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:83)
        at org.apache.kafka.streams.kstream.internals.KStreamFlatMapValues$KStreamFlatMapValuesProcessor.process(KStreamFlatMapValues.java:43)
        at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:48)
        at org.apache.kafka.streams.processor.internals.StreamsMetricsImpl.measureLatencyNs(StreamsMetricsImpl.java:188)
        at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:134)
        at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:83)
        at org.apache.kafka.streams.kstream.internals.KStreamFilter$KStreamFilterProcessor.process(KStreamFilter.java:44)
        at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:48)
        at org.apache.kafka.streams.processor.internals.StreamsMetricsImpl.measureLatencyNs(StreamsMetricsImpl.java:188)
        at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:134)
        at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:83)
        at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:70)
        at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:197)
        at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:641)
        at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:368)

It looks like Kafka can't find a DLL, but wait... I'm writing a Java application!

What could the problem be? And why doesn't this error show up if I use simpler stream operations, such as only a filter?

Update:

The problem is raised only when there are messages in the broker. I'm using Kafka Streams version 0.10.2.1.

This is the piece of code that raises the problem:

    public class KafkaStreamsMainClass {

        private KafkaStreamsMainClass() {
        }

        public static void main(final String[] args) throws Exception {
            Properties streamsConfiguration = new Properties();
            streamsConfiguration.put(StreamsConfig.APPLICATION_ID_CONFIG, "kafka-streams");
            streamsConfiguration.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-server:9092");
            streamsConfiguration.put(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "schema-registry:8082");
            streamsConfiguration.put(StreamsConfig.COMMIT_INTERVAL_MS_CONFIG, 10 * 1000);
            streamsConfiguration.put(StreamsConfig.CACHE_MAX_BYTES_BUFFERING_CONFIG, 0);
            streamsConfiguration.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG, GenericAvroSerde.class);
            streamsConfiguration.put(StreamsConfig.VALUE_SERDE_CLASS_CONFIG, GenericAvroSerde.class);

            KStreamBuilder builder = new KStreamBuilder();

            KStream<GenericRecord, GenericRecord> sourceStream = builder.stream(SOURCE_TOPIC);

            KStream<GenericRecord, GenericRecord> finishedFiltered = sourceStream
                    .filter((GenericRecord key, GenericRecord value) -> value.get("endTime") != null);

            KStream<GenericRecord, GenericRecord>[] branchedStreams = sourceStream
                    .filter((GenericRecord key, GenericRecord value) -> value.get("endTime") == null)
                    .branch((GenericRecord key, GenericRecord value) -> value.get("firstField") != null,
                            (GenericRecord key, GenericRecord value) -> value.get("secondField") != null);

            branchedStreams[0] = finishedFiltered.join(branchedStreams[0],
                    (GenericRecord value1, GenericRecord value2) -> {
                        return value1;
                    }, JoinWindows.of(TimeUnit.SECONDS.toMillis(2)));

            branchedStreams[1] = finishedFiltered.join(branchedStreams[1],
                    (GenericRecord value1, GenericRecord value2) -> {
                        return value1;
                    }, JoinWindows.of(TimeUnit.SECONDS.toMillis(2)));

            KafkaStreams streams = new KafkaStreams(builder, streamsConfiguration);
            streams.setUncaughtExceptionHandler((Thread thread, Throwable throwable) -> {
                throwable.printStackTrace();
            });
            streams.start();

            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }
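By contrast, a stripped-down, stateless-only variant (just the filter, as mentioned above) runs without any error: a plain filter creates no local state store, so the RocksDB native library is never extracted or loaded. Below is a minimal sketch of such a variant, with String serdes and placeholder topic names substituted for my real configuration:

    import java.util.Properties;

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KStreamBuilder;

    public class StatelessOnlyExample {

        public static void main(final String[] args) {
            Properties streamsConfiguration = new Properties();
            streamsConfiguration.put(StreamsConfig.APPLICATION_ID_CONFIG, "kafka-streams-stateless");
            streamsConfiguration.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-server:9092");
            streamsConfiguration.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            streamsConfiguration.put(StreamsConfig.VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            KStreamBuilder builder = new KStreamBuilder();

            // A purely stateless chain: filter needs no state store,
            // so the RocksDB DLL is never loaded and the error cannot occur.
            KStream<String, String> sourceStream = builder.stream("source-topic");
            sourceStream
                    .filter((String key, String value) -> value != null && !value.isEmpty())
                    .to("filtered-topic");

            KafkaStreams streams = new KafkaStreams(builder, streamsConfiguration);
            streams.start();

            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }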

I opened the rocksdbjni-5.0.1.jar archive downloaded by Maven and it does contain the librocksdbjni-win64.dll library. It seems to be trying to retrieve the library from outside the RocksDB jar rather than from inside it.

I'm developing on a Windows 7 machine.

Have you ever experienced this issue?

I ran into this problem recently as well. I managed to solve it in two steps:

  1. Delete all the librocksdbjni[...].dll files from the C:\Users\[your_user]\AppData\Local\Temp folder.
  2. Add the Maven dependency for rocksdb (the org.rocksdb:rocksdbjni artifact) to your project; this one worked for me.

Compile your Kafka Streams application and run it. It should work!

I updated my kafka-streams project to the latest released version, 1.0.0.

That version is affected by another bug, but after patching it and uploading the patched build to our internal Artifactory server, we were able to run our kafka-streams agents on both Windows and Linux. The next versions, 1.0.1 and 1.1.0, will have that bug fixed, so as soon as one of them is released we will switch to it instead of the patched version.

To sum up, the Kafka people solved this issue with the 1.0.0 release.

You are missing some native libraries that the rocksdb DLL depends on. See https://github.com/facebook/rocksdb/issues/1302
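If you want to check the native load in isolation, outside of any Kafka Streams topology, a minimal sketch like the one below (class name made up for illustration) simply forces the RocksDB JNI library to load. It will throw the same UnsatisfiedLinkError until the dependent runtime libraries (for example, a missing Visual C++ runtime) are installed:

    import org.rocksdb.RocksDB;

    public class RocksDbNativeLoadCheck {

        public static void main(final String[] args) {
            // Extracts librocksdbjni*.dll into the temp directory and loads it via System.load();
            // if the DLL's dependent libraries are missing, this fails with the same
            // UnsatisfiedLinkError shown in the stack trace above.
            RocksDB.loadLibrary();
            System.out.println("RocksDB native library loaded");
        }
    }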