Json Compatibility Checker #155

Open · wants to merge 30 commits into base: master
21 changes: 16 additions & 5 deletions build.gradle
@@ -270,7 +270,7 @@ project('serializers:avro') {
classifier ='all'
mergeServiceFiles()
}

tasks.build.dependsOn tasks.shadowJar
artifacts { archives shadowJar }
javadoc {
@@ -392,6 +392,18 @@ project('serializers') {
testCompile group: 'io.pravega', name: 'pravega-client', version: pravegaVersion
}

shadowJar {
Contributor:
Why do we need this for the JSON compatibility checker?

Author:

I do not understand why this would change for the JSON compatibility checker, which is just a package being added to the project. I think the structure of the build.gradle file remains the same, with shading of dependencies and creation of fat jars.

Contributor:

My question is: why is this being added? You are not creating a new module, and all existing projects that we wanted shaded already are.

//classifier = 'tests'
//from sourceSets.test.output
//configurations = [project.configurations.testRuntime]
zip64 true
relocate 'org.apache.avro', 'io.pravega.schemaregistry.shaded.org.apache.avro'
classifier ='all'
mergeServiceFiles()
}

tasks.build.dependsOn tasks.shadowJar

shadowJar {
zip64 true
relocate 'org.xerial.snappy' , 'io.pravega.schemaregistry.shaded.org.xerial.snappy'
@@ -409,7 +421,7 @@ project('serializers') {
classifier ='all'
mergeServiceFiles()
}

tasks.build.dependsOn tasks.shadowJar
artifacts { archives shadowJar }
javadoc {
@@ -508,7 +520,7 @@ project('test') {
compile project(':common')
compile project(':contract')
compile project(':client')
compile project(':server')
compile project(':server')
compile project(':serializers')
compile project(':serializers:protobuf')
compile project(':serializers:avro')
@@ -733,5 +745,4 @@ class DockerPushTask extends Exec {
return tag
}
}
}

}
@@ -0,0 +1,139 @@
package io.pravega.schemaregistry.rules.jsoncompatibility;

import com.fasterxml.jackson.databind.JsonNode;
import com.google.common.collect.Iterators;

import java.util.Iterator;

import static io.pravega.schemaregistry.rules.jsoncompatibility.BreakingChangesStore.BreakingChanges;

/**
* This class provides methods for comparing array types between two schemas.
* The simple body check is used when an array-type node is encountered,
* not when an array type is being defined.
*/
public class ArrayTypeComparator {
Contributor:

Add some Javadoc to explain what this class does.


JsonCompatibilityChecker jsonCompatibilityChecker;

public void setJsonTypeComparator() {
this.jsonCompatibilityChecker = new JsonCompatibilityChecker();
}

public BreakingChanges compareArrays(JsonNode toCheck, JsonNode toCheckAgainst) {
if (toCheck.isArray() && toCheckAgainst.isArray())
return arraySimpleBodyComparision(toCheck, toCheckAgainst);
Contributor:

Do you need to do other checks after arraySimpleBodyComparision? If so, returning from here when there are no breaking changes from this check may not be the desired behavior, right?

Author:

In JSON array definitions, the simple body check will not be used. I had added this for required and enum arrays, but it seems redundant now since I have other helper methods for them. I am keeping it for the moment to see if it can be used for tuple validation of arrays.

Contributor:

You perhaps misunderstood my point. You are returning after doing the array simple body comparison. I am sure there are other checks that need to be done if this condition is true. If it is false, it is fine to return; but if it is true, shouldn't you move on to the next check and perform that?

else {
BreakingChanges uniqueItemsChanges = checkUniqueItems(toCheck, toCheckAgainst);
if (uniqueItemsChanges != null)
return uniqueItemsChanges;
BreakingChanges minMaxChanges = minMaxItems(toCheck, toCheckAgainst);
if (minMaxChanges != null)
return minMaxChanges;
BreakingChanges additionalItemsChanges = additionalItems(toCheck, toCheckAgainst);
if (additionalItemsChanges != null)
return additionalItemsChanges;
BreakingChanges itemsValidationChanges = itemValidation(toCheck, toCheckAgainst);
if (itemsValidationChanges != null)
return itemsValidationChanges;
//TODO: Add contains and tupleValidation
}
return null;
}

private BreakingChanges arraySimpleBodyComparision(JsonNode toCheck, JsonNode toCheckAgainst) {
Iterator<String> allNodes = Iterators.concat(toCheck.fieldNames(), toCheckAgainst.fieldNames());
while (allNodes.hasNext()) {
String item = allNodes.next();
if (!toCheck.has(item))
return BreakingChanges.ARRAY_SIMPLE_BODY_CHECK_ELEMENT_REMOVED;
else if (!toCheckAgainst.has(item))
return BreakingChanges.ARRAY_SIMPLE_BODY_CHECK_ELEMENT_ADDED;
}
return null;
}

private BreakingChanges checkUniqueItems(JsonNode toCheck, JsonNode toCheckAgainst) {
if (toCheck.has("uniqueItems")) {
// use asBoolean(); comparing asText() with == checks reference identity, not value
if (toCheck.get("uniqueItems").isBoolean() && toCheck.get("uniqueItems").asBoolean()) {
if (toCheckAgainst.get("uniqueItems") == null || (toCheckAgainst.get(
"uniqueItems").isBoolean() && !toCheckAgainst.get(
"uniqueItems").asBoolean()))
return BreakingChanges.ARRAY_UNIQUE_ITEMS_CONDITION_ENABLED;
}
}
return null;
}

private BreakingChanges minMaxItems(JsonNode toCheck, JsonNode toCheckAgainst) {
if (toCheck.get("maxItems") != null && toCheckAgainst.get("maxItems") == null)
return BreakingChanges.ARRAY_MAX_ITEMS_CONDITION_ADDED;
Contributor:

Don't we also need a "max items condition removed" change?

Author:

Removing max items would be a relaxation for the reader schema: it makes the writer schema a subset of the space covered by no max items (as present in the reader schema). So a schema with no max items will be able to read data written by a schema with some max items condition.

Contributor:

fair point! 👍

else if (toCheck.get("maxItems") != null && toCheckAgainst.get("maxItems") != null) {
int originalMaxLimit = toCheckAgainst.get("maxItems").asInt();
int changedMaxLimit = toCheck.get("maxItems").asInt();
if (changedMaxLimit < originalMaxLimit)
return BreakingChanges.ARRAY_MAX_ITEMS_VALUE_DECREASED;
}
if (toCheck.get("minItems") != null && toCheckAgainst.get("minItems") == null)
return BreakingChanges.ARRAY_MIN_ITEMS_CONDITION_ADDED;
else if (toCheck.get("minItems") != null && toCheckAgainst.get("minItems") != null) {
int originalMinLimit = toCheckAgainst.get("minItems").asInt();
int changedMinLimit = toCheck.get("minItems").asInt();
if (changedMinLimit > originalMinLimit)
return BreakingChanges.ARRAY_MIN_ITEMS_VALUE_INCREASED;
}
return null;
}

private BreakingChanges additionalItems(JsonNode toCheck, JsonNode toCheckAgainst) {
if (toCheck.get("additionalItems") != null && toCheckAgainst.get("additionalItems") == null) {
// use asBoolean(); comparing asText() with == checks reference identity, not value
if (toCheck.get("additionalItems").isBoolean() && !toCheck.get("additionalItems").asBoolean())
return BreakingChanges.ARRAY_ADDITIONAL_ITEMS_DISABLED;
else if (toCheck.get("additionalItems").isObject())
return BreakingChanges.ARRAY_ADDITIONAL_ITEMS_SCOPE_DECREASED;
} else if (toCheck.get("additionalItems") != null && toCheckAgainst.get("additionalItems") != null) {
if (toCheck.get("additionalItems").isBoolean() && !toCheck.get("additionalItems").asBoolean()) {
// breaking only if the original schema did not already disable additional items
if (!(toCheckAgainst.get("additionalItems").isBoolean() && !toCheckAgainst.get("additionalItems").asBoolean()))
return BreakingChanges.ARRAY_ADDITIONAL_ITEMS_DISABLED;
} else if (toCheck.get("additionalItems").isObject()) {
if (toCheckAgainst.get("additionalItems").isObject()) {
if (jsonCompatibilityChecker.checkNodeType(toCheck.get("additionalItems"),
toCheckAgainst.get("additionalItems")) != null)
return BreakingChanges.ARRAY_ADDITIONAL_ITEMS_SCOPE_INCOMPATIBLE_CHANGE;
} else if (toCheckAgainst.get("additionalItems").isBoolean() && toCheckAgainst.get(
"additionalItems").asBoolean())
return BreakingChanges.ARRAY_ADDITIONAL_ITEMS_SCOPE_DECREASED;
}
}
return null;
}

private BreakingChanges itemValidation(JsonNode toCheck, JsonNode toCheckAgainst) {
if (!toCheck.has("items") && !toCheckAgainst.has("items"))
return null;
return jsonCompatibilityChecker.checkNodeType(toCheck.get("items"), toCheckAgainst.get("items"));
}

private boolean isDynamicArray(JsonNode node) {
// a missing "additionalItems" keyword defaults to true (open array)
return node.get("additionalItems") == null
|| (node.get("additionalItems").isBoolean() && node.get("additionalItems").asBoolean());
}

private boolean isDynamicArrayWithCondition(JsonNode node) {
return node.get("additionalItems") != null && node.get("additionalItems").isObject();
}

private boolean isStaticArray(JsonNode node) {
return node.get("additionalItems") != null && node.get("additionalItems").isBoolean()
&& !node.get("additionalItems").asBoolean();
}
}
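The min/max items rule the reviewers discuss above can be sketched in isolation. Below is a minimal, dependency-free sketch (the class and method names are illustrative, not part of this PR): introducing or tightening maxItems in the updated schema is breaking, while removing or raising it is a relaxation, matching the author's argument.

```java
// Hypothetical sketch of the maxItems half of minMaxItems above.
// A null value means the schema does not declare "maxItems".
public class ArrayRuleSketch {
    static String checkMaxItems(Integer updatedMax, Integer originalMax) {
        if (updatedMax != null && originalMax == null)
            return "ARRAY_MAX_ITEMS_CONDITION_ADDED";   // constraint introduced: breaking
        if (updatedMax != null && originalMax != null && updatedMax < originalMax)
            return "ARRAY_MAX_ITEMS_VALUE_DECREASED";   // constraint tightened: breaking
        return null; // removed or raised: a relaxation, previously written data still validates
    }

    public static void main(String[] args) {
        System.out.println(checkMaxItems(5, null)); // ARRAY_MAX_ITEMS_CONDITION_ADDED
        System.out.println(checkMaxItems(3, 5));    // ARRAY_MAX_ITEMS_VALUE_DECREASED
        System.out.println(checkMaxItems(null, 5)); // null
    }
}
```

The minItems branch is symmetric, with "increased" in place of "decreased".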
@@ -0,0 +1,92 @@
package io.pravega.schemaregistry.rules.jsoncompatibility;

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class BreakingChangesStore {
//includes only those changes that lead to incompatibility
protected enum BreakingChanges {
// data type mismatch
DATA_TYPE_MISMATCH,
// strings
STRING_TYPE_MAX_LENGTH_ADDED,
STRING_TYPE_MAX_LENGTH_VALUE_DECREASED,
STRING_TYPE_MIN_LENGTH_ADDED,
STRING_TYPE_MIN_LENGTH_VALUE_INCREASED,
STRING_TYPE_PATTERN_ADDED,
STRING_TYPE_PATTERN_MODIFIED,
// NUMBERS
NUMBER_TYPE_MAXIMUM_VALUE_ADDED,
NUMBER_TYPE_MAXIMUM_VALUE_DECREASED,
NUMBER_TYPE_EXCLUSIVE_MAXIMUM_VALUE_ADDED,
NUMBER_TYPE_EXCLUSIVE_MAXIMUM_VALUE_DECREASED,
NUMBER_TYPE_MINIMUM_VALUE_ADDED,
NUMBER_TYPE_MINIMUM_VALUE_INCREASED,
NUMBER_TYPE_EXCLUSIVE_MINIMUM_VALUE_ADDED,
NUMBER_TYPE_EXCLUSIVE_MINIMUM_VALUE_INCREASED,
NUMBER_TYPE_MULTIPLE_OF_ADDED,
NUMBER_TYPE_MULTIPLE_OF_NON_DIVISIBLE_CHANGE,
NUMBER_TYPE_MULTIPLE_OF_INCREASED,
NUMBER_TYPE_CHANGED_FROM_NUMBER_TO_INTEGER,
//ARRAYS
ARRAY_MAX_ITEMS_CONDITION_ADDED,
ARRAY_MAX_ITEMS_VALUE_DECREASED,
ARRAY_MIN_ITEMS_CONDITION_ADDED,
ARRAY_MIN_ITEMS_VALUE_INCREASED,
ARRAY_UNIQUE_ITEMS_CONDITION_ENABLED,
ARRAY_ADDITIONAL_ITEMS_DISABLED,
ARRAY_ADDITIONAL_ITEMS_SCOPE_DECREASED,
ARRAY_ADDITIONAL_ITEMS_SCOPE_INCOMPATIBLE_CHANGE,
ARRAY_SIMPLE_BODY_CHECK_ELEMENT_REMOVED,
ARRAY_SIMPLE_BODY_CHECK_ELEMENT_ADDED,
ITEM_REMOVED_NOT_COVERED_BY_PARTIALLY_OPEN_CONTENT_MODEL,
ITEM_REMOVED_FROM_CLOSED_CONTENT_MODEL,
ITEM_ADDED_TO_OPEN_CONTENT_MODEL,
ITEM_ADDED_NOT_COVERED_BY_PARTIALLY_OPEN_CONTENT_MODEL,
// PROPERTIES
PROPERTIES_SECTION_ADDED, // this would be taken care of by checking for required and additional properties since that is when incompatibility arises
PROPERTY_REMOVED_NOT_PART_OF_DYNAMIC_PROPERTY_SET_WITH_CONDITION, // check on updated schema
PROPERTY_REMOVED_FROM_STATIC_PROPERTY_SET, // check on updated schema
PROPERTY_ADDED_TO_DYNAMIC_PROPERTY_SET, // check on original schema
PROPERTY_ADDED_NOT_PART_OF_DYNAMIC_PROPERTY_SET_WITH_CONDITION, // check on original schema
REQUIRED_PROPERTY_ADDED_WITHOUT_DEFAULT,
REQUIRED_ATTRIBUTE_ADDED, // may not be needed
MAX_PROPERTIES_ADDED,
MAX_PROPERTIES_LIMIT_DECREASED,
MIN_PROPERTIES_ADDED,
MIN_PROPERTIES_LIMIT_INCREASED,
ADDITIONAL_PROPERTIES_REMOVED,
ADDITIONAL_PROPERTIES_NON_COMPATIBLE_CHANGE,
// DEPENDENCIES
DEPENDENCY_SECTION_ADDED,
DEPENDENCY_ADDED_IN_ARRAY_FORM,
DEPENDENCY_ARRAY_ELEMENTS_NON_REMOVAL,
DEPENDENCY_ADDED_IN_SCHEMA_FORM,
DEPENDENCY_IN_SCHEMA_FORM_MODIFIED,
// ENUM
ENUM_TYPE_ADDED,
ENUM_TYPE_ARRAY_CONTENTS_NON_ADDITION_OF_ELEMENTS,
// NOT TYPE
NOT_TYPE_EXTENDED,
// COMBINED
SUBSCHEMA_TYPE_ADDED,
SUBSCHEMA_TYPE_CHANGED,
SUBSCHEMA_TYPE_ALL_OF_SCHEMAS_INCREASED,
SUBSCHEMA_TYPE_ALL_OF_SCHEMAS_CHANGED,
SUBSCHEMA_TYPE_ONE_OF_SCHEMAS_DECREASED,
SUBSCHEMA_TYPE_ONE_OF_SCHEMAS_CHANGED,
SUBSCHEMA_TYPE_ANY_OF_SCHEMAS_DECREASED,
SUBSCHEMA_TYPE_ANYOF_SCHEMAS_CHANGED
}

// initialize eagerly: the previous computeBreakingChanges() was private and never
// invoked, so getBreakingChangesList() always returned an empty list
private final List<BreakingChanges> breakingChangesList = new ArrayList<>(Arrays.asList(BreakingChanges.values()));

public List<BreakingChanges> getBreakingChangesList() {
return breakingChangesList;
}
}
@@ -0,0 +1,59 @@
package io.pravega.schemaregistry.rules.jsoncompatibility;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.node.ArrayNode;
import com.google.common.collect.Iterators;

import java.util.Iterator;

import static io.pravega.schemaregistry.rules.jsoncompatibility.BreakingChangesStore.*;

public class DependenciesComparator {
JsonCompatibilityCheckerUtils jsonCompatibilityCheckerUtils = new JsonCompatibilityCheckerUtils();
PropertiesComparator propertiesComparator = new PropertiesComparator();
public BreakingChanges checkDependencies(JsonNode toCheck, JsonNode toCheckAgainst) {
propertiesComparator.setJsonCompatibilityChecker();
if(toCheck.get("dependencies") != null && toCheckAgainst.get("dependencies") == null)
return BreakingChanges.DEPENDENCY_SECTION_ADDED;
else if(toCheck.get("dependencies") == null && toCheckAgainst.get("dependencies") == null)
return null;
else if(toCheck.get("dependencies") == null && toCheckAgainst.get("dependencies") != null)
return null;
Iterator<String> dependencyFields = Iterators.concat(toCheck.get("dependencies").fieldNames(), toCheckAgainst.get("dependencies").fieldNames());
boolean dependencyTypeIsArray = checkDependencyTypeIsArray(toCheck.get("dependencies"), toCheckAgainst.get("dependencies"));
while(dependencyFields.hasNext()) {
String field = dependencyFields.next();
if(toCheck.get("dependencies").get(field) != null && toCheckAgainst.get("dependencies").get(field)==null) {
if(dependencyTypeIsArray)
return BreakingChanges.DEPENDENCY_ADDED_IN_ARRAY_FORM;
else
return BreakingChanges.DEPENDENCY_ADDED_IN_SCHEMA_FORM;
}
else if (toCheck.get("dependencies").get(field) != null && toCheckAgainst.get("dependencies").get(field) != null) {
if(dependencyTypeIsArray) {
// check the value returned by the array comparator
if(!jsonCompatibilityCheckerUtils.arrayComparisionOnlyRemoval((ArrayNode) toCheck.get("dependencies").get(field),
(ArrayNode) toCheckAgainst.get("dependencies").get(field)))
return BreakingChanges.DEPENDENCY_ARRAY_ELEMENTS_NON_REMOVAL;
}
else {
if(propertiesComparator.checkProperties(toCheck.get("dependencies").get(field),
toCheckAgainst.get("dependencies").get(field)) != null)
return BreakingChanges.DEPENDENCY_IN_SCHEMA_FORM_MODIFIED;
}
}
}
return null;
}

private boolean checkDependencyTypeIsArray(JsonNode toCheck, JsonNode toCheckAgainst) {
// it is assumed that the dependency type does not change from array to schema form or vice versa.
Iterator<String> toCheckFields = toCheck.fieldNames();
Iterator<String> toCheckAgainstFields = toCheckAgainst.fieldNames();
// guard against empty "dependencies" objects, where next() would throw
if (!toCheckFields.hasNext() || !toCheckAgainstFields.hasNext())
return false;
return toCheck.get(toCheckFields.next()).isArray()
&& toCheckAgainst.get(toCheckAgainstFields.next()).isArray();
}
}
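The array-form dependency rule above (delegated to `arrayComparisionOnlyRemoval`) boils down to a subset check: the updated dependency array may only drop entries relative to the original. A minimal sketch under that assumption (the class and method names are illustrative):

```java
import java.util.List;

// Hypothetical sketch of the array-form dependency rule used above:
// the updated dependency array may only remove elements; any addition is breaking,
// because old writers never supplied the newly required property.
public class DependencyRuleSketch {
    static boolean onlyRemovals(List<String> updated, List<String> original) {
        // every entry that remains must already have been required before
        return original.containsAll(updated);
    }

    public static void main(String[] args) {
        System.out.println(onlyRemovals(List.of("billing_address"),
                List.of("billing_address", "cvv")));  // true: a dependency was dropped
        System.out.println(onlyRemovals(List.of("billing_address", "cvv"),
                List.of("billing_address")));         // false: "cvv" was added
    }
}
```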
@@ -0,0 +1,19 @@
package io.pravega.schemaregistry.rules.jsoncompatibility;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.node.ArrayNode;

import static io.pravega.schemaregistry.rules.jsoncompatibility.BreakingChangesStore.*;

public class EnumComparator {
JsonCompatibilityCheckerUtils jsonCompatibilityCheckerUtils = new JsonCompatibilityCheckerUtils();
public BreakingChanges enumComparator(JsonNode toCheck, JsonNode toCheckAgainst) {
if(toCheck.has("enum") && !toCheckAgainst.has("enum"))
return BreakingChanges.ENUM_TYPE_ADDED;
else if(toCheck.has("enum") && toCheckAgainst.has("enum")) { // was toCheck.has("enum") twice
if(!jsonCompatibilityCheckerUtils.arrayComparisionOnlyAddition((ArrayNode)toCheck.get("enum"), (ArrayNode)toCheckAgainst.get("enum")))
return BreakingChanges.ENUM_TYPE_ARRAY_CONTENTS_NON_ADDITION_OF_ELEMENTS;
}
return null;
}
}
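The enum rule above (delegated to `arrayComparisionOnlyAddition`) is the mirror image of the dependency rule: the updated enum must contain every value the original allowed, since old writers may still produce any of them. A minimal sketch (the class and method names are illustrative):

```java
import java.util.Set;

// Hypothetical sketch of the enum rule used above: widening an enum is safe,
// while removing a value an old writer may have produced is breaking.
public class EnumRuleSketch {
    static boolean onlyAdditions(Set<String> updated, Set<String> original) {
        // the new reader must accept every value the old schema permitted
        return updated.containsAll(original);
    }

    public static void main(String[] args) {
        System.out.println(onlyAdditions(Set.of("A", "B", "C"), Set.of("A", "B"))); // true
        System.out.println(onlyAdditions(Set.of("A"), Set.of("A", "B")));           // false
    }
}
```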