KafkaAvroDeserializer does not return a SpecificRecord but a GenericRecord

My KafkaProducer successfully uses KafkaAvroSerializer to serialize objects to my topic. However, KafkaConsumer.poll() returns a deserialized GenericRecord instead of my serialized class.

MyKafkaProducer

    KafkaProducer producer;
    try (InputStream props = Resources.getResource("producer.props").openStream()) {
        Properties properties = new Properties();
        properties.load(props);
        properties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
            io.confluent.kafka.serializers.KafkaAvroSerializer.class);
        properties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
            io.confluent.kafka.serializers.KafkaAvroSerializer.class);
        properties.put("schema.registry.url", "http://localhost:8081");

        MyBean bean = new MyBean();
        producer = new KafkaProducer(properties);
        producer.send(new ProducerRecord(topic, bean.getId(), bean));
    }

My KafkaConsumer

    try (InputStream props = Resources.getResource("consumer.props").openStream()) {
        properties.load(props);
        properties.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
            io.confluent.kafka.serializers.KafkaAvroDeserializer.class);
        properties.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
            io.confluent.kafka.serializers.KafkaAvroDeserializer.class);
        properties.put("schema.registry.url", "http://localhost:8081");
        consumer = new KafkaConsumer(properties);
    }

    consumer.subscribe(Arrays.asList(topic));
    try {
        while (true) {
            ConsumerRecords records = consumer.poll(100);
            if (records.isEmpty()) {
                continue;
            }
            for (ConsumerRecord record : records) {
                MyBean bean = record.value(); // <-- throws ClassCastException: GenericRecord cannot be cast to MyBean
                System.out.println("consumer received: " + bean);
            }
        }
    } finally {
        consumer.close();
    }

The line MyBean bean = record.value(); throws a ClassCastException because a GenericRecord cannot be cast to MyBean.
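
The value itself does arrive intact; it is just typed as a GenericRecord, so the only way I can read it right now is field by field. A minimal sketch of that workaround (the "id" field name is an assumption about MyBean's schema, for illustration only):

    // Workaround sketch: read the GenericRecord generically instead of casting.
    for (ConsumerRecord record : records) {
        GenericRecord value = (GenericRecord) record.value(); // this cast succeeds
        Object id = value.get("id");                          // access fields by name
        System.out.println("consumer received id: " + id);
    }

What I actually want is to get MyBean back directly.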

I am using kafka-client-0.9.0.1 and kafka-avro-serializer-3.0.0.

KafkaAvroDeserializer supports SpecificData

It is not enabled by default. To enable it:

 properties.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, true); 
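Applied to the consumer from the question, the change looks like this. This is only a sketch, and it only works if MyBean is a class generated from an Avro schema (a SpecificRecord); for a hand-written bean, see the ReflectData section below:

    // In addition to the question's consumer config, enable the specific reader.
    properties.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, true);

    KafkaConsumer<Object, MyBean> consumer = new KafkaConsumer<>(properties);
    consumer.subscribe(Arrays.asList(topic));
    for (ConsumerRecord<Object, MyBean> record : consumer.poll(100)) {
        MyBean bean = record.value(); // decoded as a SpecificRecord, no ClassCastException
        System.out.println("consumer received: " + bean);
    }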

KafkaAvroDeserializer does not support ReflectData

Confluent's KafkaAvroDeserializer does not know how to deserialize with Avro ReflectData. I had to extend it to support Avro ReflectData:

    /**
     * Extends deserializer to support ReflectData.
     *
     * @param <V> value type
     */
    public abstract class ReflectKafkaAvroDeserializer<V> extends KafkaAvroDeserializer {

        private Schema readerSchema;
        private DecoderFactory decoderFactory = DecoderFactory.get();

        protected ReflectKafkaAvroDeserializer(Class<V> type) {
            readerSchema = ReflectData.get().getSchema(type);
        }

        @Override
        protected Object deserialize(
                boolean includeSchemaAndVersion,
                String topic,
                Boolean isKey,
                byte[] payload,
                Schema readerSchemaIgnored) throws SerializationException {

            if (payload == null) {
                return null;
            }

            int schemaId = -1;
            try {
                ByteBuffer buffer = ByteBuffer.wrap(payload);
                if (buffer.get() != MAGIC_BYTE) {
                    throw new SerializationException("Unknown magic byte!");
                }
                schemaId = buffer.getInt();
                Schema writerSchema = schemaRegistry.getByID(schemaId);

                int start = buffer.position() + buffer.arrayOffset();
                int length = buffer.limit() - 1 - idSize;
                DatumReader<V> reader = new ReflectDatumReader<>(writerSchema, readerSchema);
                BinaryDecoder decoder = decoderFactory.binaryDecoder(buffer.array(), start, length, null);
                return reader.read(null, decoder);
            } catch (IOException e) {
                throw new SerializationException("Error deserializing Avro message for id " + schemaId, e);
            } catch (RestClientException e) {
                throw new SerializationException("Error retrieving Avro schema for id " + schemaId, e);
            }
        }
    }

Define a custom deserializer class that deserializes to MyBean:

    public class MyBeanDeserializer extends ReflectKafkaAvroDeserializer<MyBean> {
        public MyBeanDeserializer() {
            super(MyBean.class);
        }
    }

Configure the KafkaConsumer to use the custom deserializer class:

 properties.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, MyBeanDeserializer.class); 
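
With that configuration, the loop from the question works without the cast; a minimal sketch reusing the question's properties:

    // record.value() is now already a MyBean; no cast is needed.
    KafkaConsumer<Object, MyBean> consumer = new KafkaConsumer<>(properties);
    consumer.subscribe(Arrays.asList(topic));
    for (ConsumerRecord<Object, MyBean> record : consumer.poll(100)) {
        System.out.println("consumer received: " + record.value());
    }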

To add to Chin Huang's answer: for minimal code and better performance, you should implement it this way:

    /**
     * Extends deserializer to support ReflectData.
     *
     * @param <T> value type
     */
    public abstract class SpecificKafkaAvroDeserializer<T>
            extends AbstractKafkaAvroDeserializer implements Deserializer<T> {

        private final Schema schema;
        private Class<T> type;
        private DecoderFactory decoderFactory = DecoderFactory.get();

        protected SpecificKafkaAvroDeserializer(Class<T> type, Map<String, ?> props) {
            this.type = type;
            this.schema = ReflectData.get().getSchema(type);
            this.configure(this.deserializerConfig(props));
        }

        public void configure(Map<String, ?> configs) {
            this.configure(new KafkaAvroDeserializerConfig(configs));
        }

        @Override
        protected T deserialize(
                boolean includeSchemaAndVersion,
                String topic,
                Boolean isKey,
                byte[] payload,
                Schema readerSchemaIgnore) throws SerializationException {

            if (payload == null) {
                return null;
            }

            int schemaId = -1;
            try {
                ByteBuffer buffer = ByteBuffer.wrap(payload);
                if (buffer.get() != MAGIC_BYTE) {
                    throw new SerializationException("Unknown magic byte!");
                }
                schemaId = buffer.getInt();
                Schema writerSchema = schemaRegistry.getByID(schemaId);

                int start = buffer.position() + buffer.arrayOffset();
                int length = buffer.limit() - 1 - idSize;
                // Reuse the reader schema cached in the constructor instead of
                // recomputing it for every record.
                SpecificDatumReader<T> reader = new SpecificDatumReader<>(writerSchema, schema);
                BinaryDecoder decoder = decoderFactory.binaryDecoder(buffer.array(), start, length, null);
                return reader.read(null, decoder);
            } catch (IOException e) {
                throw new SerializationException("Error deserializing Avro message for id " + schemaId, e);
            } catch (RestClientException e) {
                throw new SerializationException("Error retrieving Avro schema for id " + schemaId, e);
            }
        }
    }
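
Since the base class above is abstract and does not implement every Deserializer interface method, a concrete subclass still has to wire them up. A hypothetical sketch for MyBean (the class name and the delegation choices are assumptions, not part of the answer above):

    public class MyBeanSpecificDeserializer extends SpecificKafkaAvroDeserializer<MyBean> {
        public MyBeanSpecificDeserializer(Map<String, ?> props) {
            super(MyBean.class, props);
        }

        @Override
        public void configure(Map<String, ?> configs, boolean isKey) {
            configure(configs); // delegate to the base class's single-argument configure
        }

        @Override
        public MyBean deserialize(String topic, byte[] data) {
            return deserialize(false, topic, false, data, null);
        }

        @Override
        public void close() {
        }
    }

Because the constructor takes the config map (which must include schema.registry.url), the instance is handed to the consumer directly rather than configured via VALUE_DESERIALIZER_CLASS_CONFIG; the StringDeserializer for keys here is likewise just an assumption for illustration:

    Map<String, Object> config = new HashMap<>();
    config.put("schema.registry.url", "http://localhost:8081");
    KafkaConsumer<String, MyBean> consumer = new KafkaConsumer<>(
        properties, new StringDeserializer(), new MyBeanSpecificDeserializer(config));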