
How to Check Avro Schema Compatibility - Complete Guide

Step 1

Paste Your Original Schema

The Original Schema (left panel) is your existing schema — the one currently in production or registered in your Kafka Schema Registry.

Paste directly: Copy your existing Avro schema from your IDE, schema registry, or version control
Upload file: Click the Upload button to select an .avsc or .json file
Try sample: Click the Sample button to load a pre-built example with an old and new schema pair

Example original schema:

{
  "type": "record",
  "name": "User",
  "fields": [
    { "name": "id", "type": "int" },
    { "name": "name", "type": "string" }
  ]
}

Step 2

Paste Your New Schema

The New Schema (right panel) is your updated schema — the one you plan to deploy. The checker compares the two and reports:

Backward compatibility: Can the new schema read data written with the old schema? Critical for consumers upgrading before producers.
Forward compatibility: Can the old schema read data written with the new schema? Critical for producers upgrading before consumers.
Full compatibility: Both directions work — the safest level for rolling upgrades in Kafka pipelines.

Safe change example (adding optional field with default):

{
  "name": "age",
  "type": ["null", "int"],
  "default": null
}
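
Combined with the original record from Step 1, the complete new schema would look like this:

{
  "type": "record",
  "name": "User",
  "fields": [
    { "name": "id", "type": "int" },
    { "name": "name", "type": "string" },
    { "name": "age", "type": ["null", "int"], "default": null }
  ]
}
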
Step 3

Review Compatibility Results

Results update instantly as you type. Each compatibility type is shown with a clear pass/fail status and explanation:

Green badge: Compatible — safe to deploy in the corresponding upgrade scenario
Red badge: Incompatible — schema change will break consumers or producers using the indicated direction
Error message: Specific reason for incompatibility shown inline to help you fix the change

Use the results to decide whether your schema change is safe to deploy. For Kafka Schema Registry, aim for full compatibility for the safest rollouts. Learn more in our Avro with Apache Kafka guide.

What is Avro Schema Compatibility?

Avro schema compatibility defines whether data written with one schema version can be read by another version. It is the foundation of safe schema evolution in Kafka and big data systems.

  • Backward compatible: New schema reads data written by old schema. Consumers can upgrade first.
  • Forward compatible: Old schema reads data written by new schema. Producers can upgrade first.
  • Full compatible: Both directions — the safest for rolling upgrades without coordination.
  • Safe changes: Adding optional fields with defaults is typically backward and forward compatible. Removing fields or changing types usually breaks compatibility.

Frequently Asked Questions

What is backward compatibility in Avro?

Backward compatibility means the new (reader) schema can read data written with the old (writer) schema. This is required when you upgrade consumers before producers. Adding a new optional field with a default value is a backward-compatible change.

What is forward compatibility in Avro?

Forward compatibility means the old schema can read data written with the new schema. This is needed when producers upgrade before consumers. Removing a field with a default value is a forward-compatible change.
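
For example, take the User record with the optional age field from Step 2 as the old schema. A new schema that drops age stays forward compatible, because old consumers fill in the default (null) whenever the field is missing from newly written records:

{
  "type": "record",
  "name": "User",
  "fields": [
    { "name": "id", "type": "int" },
    { "name": "name", "type": "string" }
  ]
}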

What schema changes break compatibility?

Common breaking changes include: removing a required field (no default), adding a required field without a default value, changing a field's type to an incompatible type (e.g., string to int), and renaming a field without an alias. Always check compatibility before deploying schema changes to Kafka.
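
For instance, adding a required field with no default (the field name here is purely illustrative) breaks backward compatibility, because the new schema has no value to fill in when reading records written before the field existed:

{ "name": "email", "type": "string" }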

How does this work with Kafka Schema Registry?

Kafka Schema Registry enforces compatibility rules when you register a new schema version. Use this tool to pre-check compatibility locally before submitting to the registry, so you catch issues before deployment.
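
If you also want the registry itself to run the check, Confluent Schema Registry exposes a compatibility-test REST endpoint (POST /compatibility/subjects/<subject>/versions/latest; the subject name is yours, and the exact path is documented in the Schema Registry API reference). The request body wraps the candidate schema as an escaped JSON string, roughly like this:

{
  "schema": "{\"type\":\"record\",\"name\":\"User\",\"fields\":[{\"name\":\"id\",\"type\":\"int\"},{\"name\":\"name\",\"type\":\"string\"},{\"name\":\"age\",\"type\":[\"null\",\"int\"],\"default\":null}]}"
}

The response reports whether the submitted schema is compatible with the referenced version under the subject's configured compatibility level.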

What does full compatibility mean?

Full compatibility is the safest level — it requires both backward AND forward compatibility. With full compatibility, any consumer or producer can upgrade in any order without coordination. This is achieved by only adding optional fields with defaults or removing optional fields with defaults.

Is this tool free to use?

Yes, completely free with no limitations. No registration required. The checker runs entirely in your browser using the official avsc library — your schemas never leave your machine.