Introducing Schema Registry for Upstash Kafka
Today, we are thrilled to announce a significant enhancement to Upstash Kafka – the introduction of a Schema Registry! This powerful addition allows you to manage your data schemas seamlessly, ensuring data consistency and compatibility throughout your Kafka ecosystem.
What is a Schema Registry?
For those who are new to the concept, let's start with the basics. A Schema Registry is a vital component in any Kafka ecosystem. It serves as a central repository for managing schemas, enabling organizations to:
Ensure Data Consistency: With a Schema Registry, you can define the structure of your data in a standardized format. This consistency ensures that all producers and consumers of data adhere to the same schema, preventing data conflicts and errors.
Achieve Compatibility: When your data evolves over time, maintaining backward and forward compatibility is crucial. A Schema Registry helps you version and manage schemas, ensuring that new data remains compatible with older consumers and vice versa (see the example after this list).
Enhance Data Governance: Managing schemas in a centralized registry provides greater control and visibility over your data. It allows you to track changes, access historical versions, and maintain a clear audit trail.
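To make schema compatibility concrete, here is an illustrative Avro example (hypothetical schemas, not tied to any Upstash-specific feature). Version 1 of a record:
{
  "type": "record",
  "name": "User",
  "fields": [
    { "name": "id", "type": "long" },
    { "name": "name", "type": "string" }
  ]
}
Version 2 adds a new field with a default value, so consumers using version 2 can still read data written with version 1:
{
  "type": "record",
  "name": "User",
  "fields": [
    { "name": "id", "type": "long" },
    { "name": "name", "type": "string" },
    { "name": "email", "type": ["null", "string"], "default": null }
  ]
}
A registry enforcing backward compatibility would accept version 2, but would reject, for example, a new required field with no default.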
Upstash Kafka Schema Registry: Your Bridge to Compatibility
Our Schema Registry for Upstash Kafka is designed to seamlessly integrate with your existing Kafka infrastructure.
Key Features:
API Compatibility: We've ensured that the familiar API endpoints you've been using with Confluent Schema Registry are available in Upstash Kafka Schema Registry. You can smoothly transition your applications without significant code changes (see the sketch after this list).
Data Versioning: Easily manage schema versions to support evolving data structures via Upstash Kafka Schema Registry.
High Availability: Upstash Kafka Schema Registry is designed for high availability, ensuring that your schema management remains reliable, even in demanding environments.
Generous Pricing: On our free plan, you can work with up to 100 schemas without any cost. If your needs expand, our pay-as-you-go plan allows up to 1000 schemas.
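To illustrate the API compatibility mentioned above, the sketch below lists registered subjects using Java 11's built-in HTTP client. It assumes the Confluent-standard GET /subjects endpoint under the /schema-registry path, with the credentials described in the setup steps below:
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// UPSTASH_KAFKA_* values come from the Upstash Console (see below).
String auth = Base64.getEncoder().encodeToString(
        (UPSTASH_KAFKA_REST_USERNAME + ":" + UPSTASH_KAFKA_REST_PASSWORD)
                .getBytes(StandardCharsets.UTF_8));

// GET /subjects is a standard Confluent Schema Registry endpoint;
// it returns a JSON array of registered subject names.
HttpRequest request = HttpRequest.newBuilder()
        .uri(URI.create(UPSTASH_KAFKA_REST_URL + "/schema-registry/subjects"))
        .header("Authorization", "Basic " + auth)
        .GET()
        .build();

HttpResponse<String> response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString());
System.out.println(response.body()); // e.g. ["my-topic-value"]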
How To Use
Since it is compatible with the Confluent Schema Registry, you can use it with existing Confluent-compatible tools right away.
Get the required parameters from the related Kafka instance page in the Upstash Console. Scroll down to the REST API section to find the values you need:
UPSTASH_KAFKA_REST_URL
UPSTASH_KAFKA_REST_USERNAME
UPSTASH_KAFKA_REST_PASSWORD
You can use it with the KafkaAvroSerializer/Deserializer in your Producer/Consumer.
Producer example:
Properties props = new Properties();
//....
props.put("value.serializer, "io.confluent.kafka.serializers.KafkaAvroSerializer");
props.put("schema.registry.url", UPSTASH_KAFKA_REST_URL + "/schema-registry");
props.put("basic.auth.credentials.source", "USER_INFO");
props.put("basic.auth.user.info", UPSTASH_KAFKA_REST_USERNAME + ":" + UPSTASH_KAFKA_REST_PASSWORD);
try (var producer = new KafkaProducer<String, org.apache.avro.GenericRecord>(props)) {
// ...
}
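The consumer side is symmetric; here is a minimal sketch mirroring the producer configuration above, using KafkaAvroDeserializer:
Properties props = new Properties();
//....
// Deserialize Avro values by fetching their schemas from the registry.
props.put("value.deserializer", "io.confluent.kafka.serializers.KafkaAvroDeserializer");
props.put("schema.registry.url", UPSTASH_KAFKA_REST_URL + "/schema-registry");
props.put("basic.auth.credentials.source", "USER_INFO");
props.put("basic.auth.user.info", UPSTASH_KAFKA_REST_USERNAME + ":" + UPSTASH_KAFKA_REST_PASSWORD);
try (var consumer = new KafkaConsumer<String, org.apache.avro.GenericRecord>(props)) {
    // ...
}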
Or you can use it with the AvroConverter in your Connectors as follows:
{
  "name": "myConnector",
  "properties": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    // other configurations are skipped.
    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "key.converter.basic.auth.credentials.source": "USER_INFO",
    "key.converter.basic.auth.user.info": "UPSTASH_KAFKA_REST_USERNAME:UPSTASH_KAFKA_REST_PASSWORD",
    "key.converter.schema.registry.url": "UPSTASH_KAFKA_REST_URL/schema-registry",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.basic.auth.credentials.source": "USER_INFO",
    "value.converter.basic.auth.user.info": "UPSTASH_KAFKA_REST_USERNAME:UPSTASH_KAFKA_REST_PASSWORD",
    "value.converter.schema.registry.url": "UPSTASH_KAFKA_REST_URL/schema-registry"
  }
}
You can also use it programmatically via the Java client SchemaRegistryClient, and view or modify its contents via third-party UI tools. Check our documentation here for details on how to configure them.
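For instance, a minimal sketch using Confluent's CachedSchemaRegistryClient could look like this (method signatures may vary slightly across client versions):
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;
import java.util.Map;

// Authenticate the client the same way as the serializers above.
Map<String, String> configs = Map.of(
        "basic.auth.credentials.source", "USER_INFO",
        "basic.auth.user.info",
        UPSTASH_KAFKA_REST_USERNAME + ":" + UPSTASH_KAFKA_REST_PASSWORD);

SchemaRegistryClient client = new CachedSchemaRegistryClient(
        UPSTASH_KAFKA_REST_URL + "/schema-registry",
        100, // capacity of the client-side schema cache
        configs);

// List every registered subject with its latest version.
// (These calls throw IOException / RestClientException.)
for (String subject : client.getAllSubjects()) {
    System.out.println(subject + " -> v"
            + client.getLatestSchemaMetadata(subject).getVersion());
}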
Get Started Today!
The introduction of Schema Registry for Upstash Kafka is a significant milestone in our commitment to providing a comprehensive and user-friendly data streaming experience. Whether you're a seasoned Kafka user or just getting started, our Schema Registry will allow you to manage your data schemas effectively.
For more details and insights into Schema Registry for Upstash Kafka, you can explore our documentation here.
If you have any questions or need assistance, you can reach out to our support team at support@upstash.com or join our community on Discord. Don't forget to stay updated by following us on Twitter, and happy streaming!