Assessing the type of compatibility based on schemata differences

Schema

flowchart LR
    subgraph inputs
        input_desc_0["schemata differences"]
    end
    subgraph output
        output_desc["the type of compatibility"]
    end
    inputs --> output

Context

Backward compatibility - the ability of a system to understand input intended for previous versions of itself

Forward compatibility - the ability of a system to understand input intended for future versions of itself

Full compatibility - backward and forward compatibility combined

No compatibility - neither backward nor forward compatibility

Maintaining backward and forward compatibility is important for minimizing disruption
and ensuring smooth transitions when updating JSON schemata.
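
These four levels can be derived mechanically from two questions: does the new schema still accept every value the old one accepted, and does the old schema accept every value the new one allows? A minimal sketch of that decision table (a hypothetical helper, not part of any tool described here):

```python
def classify(old_subset_of_new: bool, new_subset_of_old: bool) -> str:
    """Derive the compatibility level from how the sets of accepted values relate.

    old_subset_of_new: every value valid under the old schema is still valid
    under the new one, so an updated reader understands old data (backward).
    new_subset_of_old: every value valid under the new schema is also valid
    under the old one, so an old reader understands new data (forward).
    """
    if old_subset_of_new and new_subset_of_old:
        return "full compatibility"
    if old_subset_of_new:
        return "backward compatibility"
    if new_subset_of_old:
        return "forward compatibility"
    return "no compatibility"
```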

flowchart LR
    subgraph data writers
        current_writer("writer")
        next_writer("writer<sub>+1</sub>")
    end
    subgraph data readers
        current_reader("reader")
        next_reader("reader<sub>+1</sub>")
    end
    current_writer --> current_reader
    current_writer -->|backward compatibility| next_reader
    next_writer -->|forward compatibility| current_reader

Examples


assessing the type of compatibility based on reduction of accepted types by a number

In this situation, every value accepted by the new schema is also accepted by the old one, but not vice versa: a number which is not an integer is valid only under the old schema. Therefore, such a change is forward compatible.

Input:

reduction of accepted types by a number:

  • JSON schema path: #

    change of accepted JSON value types from

    • integer

    • number

    to

    • integer

Output:

forward compatibility:

Reasons for breaking the backward compatibility:

    • schema path: #

      the set of allowed JSON value types has been reduced by number
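
All of the integer/number examples in this section reduce to subset checks once 'integer' is treated as a sub-type of 'number'. A sketch of that idea (the CLASSES mapping is an illustrative assumption, not taken from the tool):

```python
# Map each JSON type name onto the primitive value classes it covers;
# "number" covers integral as well as non-integral numbers, which is why
# "integer" -> "number" widens and "number" -> "integer" narrows.
CLASSES = {
    "null": {"null"},
    "boolean": {"boolean"},
    "string": {"string"},
    "integer": {"integral"},
    "number": {"integral", "fractional"},
    "array": {"array"},
    "object": {"object"},
}

def type_compatibility(old_types, new_types):
    old = set().union(*(CLASSES[t] for t in old_types))
    new = set().union(*(CLASSES[t] for t in new_types))
    if old == new:
        return "full compatibility"
    if old <= new:
        return "backward compatibility"
    if new <= old:
        return "forward compatibility"
    return "no compatibility"
```

For the example above, type_compatibility(["integer", "number"], ["integer"]) yields "forward compatibility".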


assessing the type of compatibility based on extension of accepted types by a number

In this situation, every value accepted by the old schema is also accepted by the new one, but not vice versa: a number which is not an integer is valid only under the new schema. Therefore, such a change is backward compatible.

Input:

extension of accepted types by a number:

  • JSON schema path: #

    change of accepted JSON value types from

    • integer

    to

    • integer

    • number

Output:

backward compatibility:

Reasons for breaking the forward compatibility:

    • schema path: #

      the set of allowed JSON value types has been extended by number


assessing the type of compatibility based on change of accepted type from integer to number

In this situation, every integer is a number, but not vice versa. Therefore, such a change is backward compatible.

Input:

change of accepted type from integer to number:

  • JSON schema path: #

    change of accepted JSON value types from

    • integer

    to

    • number

Output:

backward compatibility:

Reasons for breaking the forward compatibility:

    • schema path: #

      the set of allowed JSON value types has been extended by number


assessing the type of compatibility based on an accepted type change from null to boolean

In this situation, no boolean value can satisfy the null JSON type constraint, and vice versa. Therefore, such a change is not compatible.

Input:

an accepted type change from null to boolean:

  • JSON schema path: #

    change of accepted JSON value types from

    • null

    to

    • boolean

Output:

no compatibility:

Reasons for breaking the forward compatibility:

    • schema path: #

      the set of allowed JSON value types has been extended by boolean

Reasons for breaking the backward compatibility:

    • schema path: #

      the set of allowed JSON value types has been reduced by null


assessing the type of compatibility based on extending set of accepted value types from numbers to numbers and integers

In this situation, every integer is also a number, so this kind of difference does not have any impact. Therefore, such a change is fully compatible.

Input:

extending set of accepted value types from numbers to numbers and integers:

  • JSON schema path: #

    change of accepted JSON value types from

    • number

    to

    • integer

    • number

Output:

full compatibility:


assessing the type of compatibility based on change of accepted type from number to integer

In this situation, every integer is a number, but not vice versa. Therefore, such a change is forward compatible.

Input:

change of accepted type from number to integer:

  • JSON schema path: #

    change of accepted JSON value types from

    • number

    to

    • integer

Output:

forward compatibility:

Reasons for breaking the backward compatibility:

    • schema path: #

      the set of allowed JSON value types has been reduced by number


assessing the type of compatibility based on no differences

In this situation, identical schemata cannot be incompatible with each other. Therefore, such a change is fully compatible.

Input:

no differences:

Output:

full compatibility:


assessing the type of compatibility based on extension of accepted types by an additional type

In this situation, more value types than before are accepted, so all previously valid values remain valid. Therefore, such a change is backward compatible.

Input:

extension of accepted types by an additional type:

  • JSON schema path: #

    change of accepted JSON value types from

    • null

    to

    • boolean

    • null

Output:

backward compatibility:

Reasons for breaking the forward compatibility:

    • schema path: #

      the set of allowed JSON value types has been extended by boolean


assessing the type of compatibility based on reduction of accepted types

In this situation, fewer value types than before are accepted, so previously valid values may now be rejected. Therefore, such a change is forward compatible.

Input:

reduction of accepted types:

  • JSON schema path: #

    change of accepted JSON value types from

    • boolean

    • null

    to

    • null

Output:

forward compatibility:

Reasons for breaking the backward compatibility:

    • schema path: #

      the set of allowed JSON value types has been reduced by boolean


assessing the type of compatibility based on old and new value of multipleOf being not each other's factors

In this situation, some numbers are multiples of one of the multipleOf values but not of the other. Therefore, such a change is not compatible.

Input:

old and new value of multipleOf being not each other's factors:

  • JSON schema path: #

    change of multipleOf from 2.0 to 5.0

Output:

no compatibility:

Reasons for breaking the forward compatibility:

    • schema path: #

      the new multiple constraint of 5.0 is not a factor of the old multiple constraint of 2.0

Reasons for breaking the backward compatibility:

    • schema path: #

      the old multiple constraint of 2.0 is not a factor of the new multiple constraint of 5.0


assessing the type of compatibility based on the range of allowed number values being extended

In this situation, all numbers from the old, narrower range fall into the new, wider range. Therefore, such a change is backward compatible.

Input:

the range of allowed number values being extended:

  • JSON schema path: #

    change of maximum from 15.0 to 20.0

  • JSON schema path: #

    change of minimum from 10.0 to 5.0

Output:

backward compatibility:

Reasons for breaking the forward compatibility:

    • schema path: #

      the range of allowed values has been extended by [5.0,10.0) and (15.0,20.0]
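
For numeric bounds the same subset reasoning applies to intervals. A sketch classifying a change of minimum/maximum bounds, where None stands for "unspecified"; inclusive and exclusive bounds are deliberately not distinguished, which is a simplification of this sketch:

```python
def range_compatibility(old_min, old_max, new_min, new_max):
    """Classify a change of numeric bounds; None means unconstrained."""
    inf = float("inf")
    old = (old_min if old_min is not None else -inf,
           old_max if old_max is not None else inf)
    new = (new_min if new_min is not None else -inf,
           new_max if new_max is not None else inf)
    old_in_new = new[0] <= old[0] and old[1] <= new[1]   # range extended (or equal)
    new_in_old = old[0] <= new[0] and new[1] <= old[1]   # range reduced (or equal)
    if old_in_new and new_in_old:
        return "full compatibility"
    if old_in_new:
        return "backward compatibility"
    if new_in_old:
        return "forward compatibility"
    return "no compatibility"
```

For the example above, range_compatibility(10.0, 15.0, 5.0, 20.0) returns "backward compatibility"; moving the bounds in opposite directions, as in the "extended and reduced at the same time" examples later on, yields "no compatibility".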


assessing the type of compatibility based on the range getting constrained

In this situation, all numbers from the new range fall into the old, unconstrained one. Therefore, such a change is forward compatible.

Input:

the range getting constrained:

  • JSON schema path: #

    change of maximum from unspecified to 20.0

  • JSON schema path: #

    change of minimum from unspecified to 5.0

Output:

forward compatibility:

Reasons for breaking the backward compatibility:

    • schema path: #

      the range of allowed values has been reduced by (-Infinity,5.0) and (20.0,Infinity)


assessing the type of compatibility based on the new value of multipleOf being divisible by the old one

In this situation, every multiple of the new value is also a multiple of the old value. Therefore, such a change is backward compatible.

Input:

the new value of multipleOf being divisible by the old one:

  • JSON schema path: #

    change of multipleOf from 2.0 to 4.0

Output:

backward compatibility:

Reasons for breaking the forward compatibility:

    • schema path: #

      the new multiple constraint of 4.0 is not a factor of the old multiple constraint of 2.0


assessing the type of compatibility based on the range gets unconstrained

In this situation, all numbers from the old range fall into the new, unconstrained one. Therefore, such a change is backward compatible.

Input:

the range gets unconstrained:

  • JSON schema path: #

    change of maximum from 20.0 to unspecified

  • JSON schema path: #

    change of minimum from 5.0 to unspecified

Output:

backward compatibility:

Reasons for breaking the forward compatibility:

    • schema path: #

      the range of allowed values has been extended by (-Infinity,5.0] and (20.0,Infinity)


assessing the type of compatibility based on the range of allowed values being reduced using the exclusive version of constraints

In this situation, all numbers from the new, narrower range fall into the old, wider range. Therefore, such a change is forward compatible.

Input:

the range of allowed values being reduced using the exclusive version of constraints:

  • JSON schema path: #

    change of exclusiveMaximum from 20.0 to 15.0

  • JSON schema path: #

    change of exclusiveMinimum from 5.0 to 10.0

Output:

forward compatibility:

Reasons for breaking the backward compatibility:

    • schema path: #

      the range of allowed values has been reduced by (5.0,10.0] and [15.0,20.0)


assessing the type of compatibility based on the range of allowed number values being extended and reduced at the same time using exclusive versions of constraints

In this situation, some numbers fall into the old range but not the new one, and some fall into the new range but not the old one. Therefore, such a change is not compatible.

Input:

the range of allowed number values being extended and reduced at the same time using exclusive versions of constraints:

  • JSON schema path: #

    change of exclusiveMaximum from 15.0 to 20.0

  • JSON schema path: #

    change of exclusiveMinimum from 5.0 to 10.0

Output:

no compatibility:

Reasons for breaking the forward compatibility:

    • schema path: #

      the range of allowed values has been extended by [15.0,20.0)

Reasons for breaking the backward compatibility:

    • schema path: #

      the range of allowed values has been reduced by (5.0,10.0]


assessing the type of compatibility based on the range of allowed number values being extended and reduced at the same time

In this situation, some numbers fall into the old range but not the new one, and some fall into the new range but not the old one. Therefore, such a change is not compatible.

Input:

the range of allowed number values being extended and reduced at the same time:

  • JSON schema path: #

    change of maximum from 15.0 to 20.0

  • JSON schema path: #

    change of minimum from 5.0 to 10.0

Output:

no compatibility:

Reasons for breaking the forward compatibility:

    • schema path: #

      the range of allowed values has been extended by (15.0,20.0]

Reasons for breaking the backward compatibility:

    • schema path: #

      the range of allowed values has been reduced by [5.0,10.0)


assessing the type of compatibility based on the range of allowed values being reduced

In this situation, all numbers from the new, narrower range fall into the old, wider range. Therefore, such a change is forward compatible.

Input:

the range of allowed values being reduced:

  • JSON schema path: #

    change of maximum from 20.0 to 15.0

  • JSON schema path: #

    change of minimum from 5.0 to 10.0

Output:

forward compatibility:

Reasons for breaking the backward compatibility:

    • schema path: #

      the range of allowed values has been reduced by [5.0,10.0) and (15.0,20.0]


assessing the type of compatibility based on the range of allowed values being shifted

In this situation, all numbers from the old, narrower range fall into the new, wider range. Therefore, such a change is backward compatible.

Input:

the range of allowed values being shifted:

  • JSON schema path: #

    change of exclusiveMaximum from 15.0 to 20.0

  • JSON schema path: #

    change of exclusiveMinimum from 10.0 to 5.0

Output:

backward compatibility:

Reasons for breaking the forward compatibility:

    • schema path: #

      the range of allowed values has been extended by (5.0,10.0] and [15.0,20.0)

Calculating differences between schemata based on old JSON schema and new JSON schema

Schema

flowchart LR
    subgraph inputs
        input_desc_0["new JSON schema"]
        input_desc_1["old JSON schema"]
    end
    subgraph output
        output_desc["differences between schemata"]
    end
    inputs --> output

Context

Calculating JSON Schema Difference is a process used to identify the changes between two JSON schemata.
It is used to see what has been added, removed, or changed.
This is useful for tracking changes over time, understanding the impact of changes, and managing versions of a schema.
It can also be used to generate a diff report or to automate the process of updating dependent systems or documentation when a schema changes.

Properties

  • comparing identical schemata yields no differences

  • comparing different schemata yields differences
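
For the flat, single-keyword schemata used in these examples, a keyword-by-keyword comparison is enough to produce the differences. A hypothetical sketch (not the described tool's implementation):

```python
def schema_diff(old: dict, new: dict) -> list:
    """Compare two already-parsed flat schema objects keyword by keyword
    and report every changed keyword together with its schema path."""
    diffs = []
    for keyword in sorted(set(old) | set(new)):
        before, after = old.get(keyword), new.get(keyword)
        if before != after:
            diffs.append({"path": "#/" + keyword,
                          "from": before,   # None stands for "unspecified"
                          "to": after})
    return diffs
```

Comparing identical schemata yields an empty list, which corresponds to the "no differences" output.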

Examples


calculating differences between schemata based on JSON schema accepting only number less than or equal to some number and JSON schema accepting only number less than or equal to other number

Any change of maximum inclusive keyword value is considered a difference.

Input:

JSON schema accepting only number less than or equal to other number:

{
  "maximum": 4
}

JSON schema accepting only number less than or equal to some number:

{
  "maximum": 2
}

Output:

a change in inclusive maximum value:

  • JSON schema path: #/maximum

    change of maximum from 2.0 to 4.0


calculating differences between schemata based on JSON schema accepting only number greater than or equal to some number and JSON schema accepting only number greater than or equal to other number

Any change of minimum inclusive keyword value is considered a difference.

Input:

JSON schema accepting only number greater than or equal to other number:

{
  "minimum": 4
}

JSON schema accepting only number greater than or equal to some number:

{
  "minimum": 2
}

Output:

a change in inclusive minimum value:

  • JSON schema path: #/minimum

    change of minimum from 2.0 to 4.0


calculating differences between schemata based on JSON schema accepting only multiples of some number and JSON schema accepting only multiples of other number

Any change of multipleOf keyword is considered a difference.

Input:

JSON schema accepting only multiples of other number:

{
  "multipleOf": 4
}

JSON schema accepting only multiples of some number:

{
  "multipleOf": 2
}

Output:

a change in accepted value factor:

  • JSON schema path: #/multipleOf

    change of multipleOf from 2.0 to 4.0


calculating differences between schemata based on JSON schema accepting only number greater than some number and JSON schema accepting only number greater than other number

Any change of minimum exclusive keyword value is considered a difference.

Input:

JSON schema accepting only number greater than other number:

{
  "exclusiveMinimum": 4
}

JSON schema accepting only number greater than some number:

{
  "exclusiveMinimum": 2
}

Output:

a change in exclusive minimum value:

  • JSON schema path: #/exclusiveMinimum

    change of exclusiveMinimum from 2.0 to 4.0


calculating differences between schemata based on JSON schema accepting only some type and JSON schema accepting only other type

Any change of expected JSON value type is considered a difference.

Input:

JSON schema accepting only other type:

{
  "type": [
    "boolean"
  ]
}

JSON schema accepting only some type:

{
  "type": [
    "null"
  ]
}

Output:

a change in accepted value type:

  • JSON schema path: #/type

    change of accepted JSON value types from

    • null

    to

    • boolean

calculating differences between schemata based on JSON schema accepting only number less than some number and JSON schema accepting only number less than other number

Any change of maximum exclusive keyword value is considered a difference.

Input:

JSON schema accepting only number less than other number:

{
  "exclusiveMaximum": 4
}

JSON schema accepting only number less than some number:

{
  "exclusiveMaximum": 2
}

Output:

a change in exclusive maximum value:

  • JSON schema path: #/exclusiveMaximum

    change of exclusiveMaximum from 2.0 to 4.0


calculating differences between schemata based on same schema and some schema

Comparison of two identical schemata yields no differences.

Input:

some schema:

false

same schema:

false

Output:

no differences:

Finding JSON schema violations as a result of validating JSON value against JSON schema

Schema

flowchart LR
    subgraph inputs
        input_desc_0["JSON value"]
        input_desc_1["JSON schema"]
    end
    subgraph output
        output_desc["JSON schema violations"]
    end
    inputs --> output

Context

JSON Schema is a specification for validating the structure and data types of JSON values.
It allows you to specify the required properties, the types of values, the format of the data, and other constraints for a JSON object.
This is useful for ensuring that the data received or sent in a JSON format is as expected and can be processed correctly.
It helps to catch errors early, improve data quality, and reduce the amount of code needed for data validation.

Properties

  • 'true' JSON schema does not impose any constraints

  • 'false' JSON schema rejects anything

  • any JSON value passes validation against 'empty object' JSON schema
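
A toy validator covering just the keywords exercised in these examples (type, the four range keywords, multipleOf) might look like the sketch below; it is illustrative only, and its messages are simplified relative to the outputs shown here:

```python
import math

def json_type(value):
    """Name the JSON type of a parsed Python value.  bool is tested
    before int because bool is a subclass of int in Python."""
    if value is None:
        return "null"
    if isinstance(value, bool):
        return "boolean"
    if isinstance(value, int):
        return "integer"
    if isinstance(value, float):
        return "number"
    if isinstance(value, str):
        return "string"
    if isinstance(value, list):
        return "array"
    return "object"

def violations(value, schema):
    """Collect violation messages; an empty list means the value is valid."""
    if schema is True or schema == {}:
        return []                                # no constraints at all
    if schema is False:
        return ["nothing is valid against the 'false' schema"]
    found = []
    if "type" in schema:
        accepted = set(schema["type"])
        actual = json_type(value)
        if actual == "integer" and "number" in accepted:
            actual = "number"                    # every integer is a number
        if actual == "number" and "integer" in accepted and float(value).is_integer():
            actual = "integer"                   # a whole float counts as an integer
        if actual not in accepted:
            found.append("Invalid type. Expected %s but got %s."
                         % (" or ".join(sorted(accepted)), json_type(value)))
    if isinstance(value, (int, float)) and not isinstance(value, bool):
        if "minimum" in schema and value < schema["minimum"]:
            found.append("%s is below the inclusive minimum %s" % (value, schema["minimum"]))
        if "maximum" in schema and value > schema["maximum"]:
            found.append("%s is above the inclusive maximum %s" % (value, schema["maximum"]))
        if "exclusiveMinimum" in schema and value <= schema["exclusiveMinimum"]:
            found.append("%s is not above the exclusive minimum %s" % (value, schema["exclusiveMinimum"]))
        if "exclusiveMaximum" in schema and value >= schema["exclusiveMaximum"]:
            found.append("%s is not below the exclusive maximum %s" % (value, schema["exclusiveMaximum"]))
        if "multipleOf" in schema and math.fmod(value, schema["multipleOf"]) != 0:
            found.append("%s is not a multiple of %s" % (value, schema["multipleOf"]))
    return found
```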

Examples


finding JSON schema violations as a result of validating a multiple of x against schema accepting only numbers which are multiples of x

Because a JSON number value is a multiple of the factor desired by the schema, such a value is valid.

Input:

a multiple of x:

7.5

schema accepting only numbers which are multiples of x:

{
  "multipleOf": 2.5,
  "type": [
    "number"
  ]
}

Output:

no violations:


finding JSON schema violations as a result of validating a boolean value against schema accepting only nulls or strings

Because the value is neither null nor a string, such a value is invalid.

Input:

a boolean value:

true

schema accepting only nulls or strings:

{
  "type": [
    "null",
    "string"
  ]
}

Output:

a type mismatch violation:

  • JSON value path: $
    JSON schema path: #/type

    Invalid type. Expected null or string but got boolean.


finding JSON schema violations as a result of validating JSON number value against JSON schema accepting only numbers

Because a JSON value directly matches schema's only 'type' keyword item, such a value is valid.

Input:

JSON number value:

2.5

JSON schema accepting only numbers:

{
  "type": [
    "number"
  ]
}

Output:

no violations:


finding JSON schema violations as a result of validating JSON null value against JSON schema accepting booleans, nulls and strings

Because a JSON value directly matches one of schema's 'type' keyword items, such a value is valid.

Input:

JSON null value:

null

JSON schema accepting booleans, nulls and strings:

{
  "type": [
    "boolean",
    "null",
    "string"
  ]
}

Output:

no violations:


finding JSON schema violations as a result of validating JSON number value which happens to be an integer against JSON schema accepting any numbers

Because a JSON value indirectly matches schema's only 'type' keyword item, such a value is valid.

Input:

JSON number value which happens to be an integer:

1

JSON schema accepting any numbers:

{
  "type": [
    "number"
  ]
}

Output:

no violations:


finding JSON schema violations as a result of validating a fractional number against schema accepting only whole numbers

Because the schema accepts only whole numbers, such a value is invalid.

Input:

a fractional number:

1.5

schema accepting only whole numbers:

{
  "type": [
    "integer"
  ]
}

Output:

a type mismatch violation:

  • JSON value path: $
    JSON schema path: #/type

    Invalid type. Expected integer but got number.


finding JSON schema violations as a result of validating a multiple of 2.5 against a schema accepting only multiples of 2.5

Because the schema accepts any multiples of 2.5, such a value is valid.

Input:

a multiple of 2.5:

7.5

a schema accepting only multiples of 2.5:

{
  "multipleOf": 2.5,
  "type": [
    "number"
  ]
}

Output:

no violations:


finding JSON schema violations as a result of validating number at the schema's maximum allowed values boundary against a schema with a maximum exclusive allowed value set

Because the maximum value constraint is exclusive, such a value is invalid.

Input:

number at the schema's maximum allowed values boundary:

4

a schema with a maximum exclusive allowed value set:

{
  "exclusiveMaximum": 4
}

Output:

an invalid range violation:

  • JSON value path: $
    JSON schema path: #/exclusiveMaximum

    4.0 is outside of the valid range of (-Infinity,4.0)


finding JSON schema violations as a result of validating an array containing some duplicated strings against schema not accepting duplicates

Because the schema requires items to be unique, and the value contains duplicate occurrences, such a value is invalid.

Input:

an array containing some duplicated strings:

[
  "a",
  "b",
  "b",
  "c",
  "d",
  "d",
  "e"
]

schema not accepting duplicates:

{
  "uniqueItems": true
}

Output:

an invalid array violation:

  • JSON value path: $
    JSON schema path: #

    Invalid array:

    • JSON value path: $[1]
      JSON schema path: #/uniqueItems

      Non-unique array item.

    • JSON value path: $[2]
      JSON schema path: #/uniqueItems

      Non-unique array item.

    • JSON value path: $[4]
      JSON schema path: #/uniqueItems

      Non-unique array item.

    • JSON value path: $[5]
      JSON schema path: #/uniqueItems

      Non-unique array item.
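
The reported indices ($[1], $[2], $[4] and $[5]) are every occurrence of a duplicated value. An occurrence count reproduces them (an illustrative sketch):

```python
import json
from collections import Counter

def non_unique_indices(items):
    """Indices of every element whose value occurs more than once.
    Items are serialised to JSON to obtain hashable keys, so unhashable
    values such as nested arrays can be counted too."""
    keys = [json.dumps(item, sort_keys=True) for item in items]
    counts = Counter(keys)
    return [i for i, key in enumerate(keys) if counts[key] > 1]
```

non_unique_indices(["a", "b", "b", "c", "d", "d", "e"]) returns [1, 2, 4, 5], matching the violations above.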


finding JSON schema violations as a result of validating an array containing a mixture of null and boolean values against schema accepting only arrays of nulls

Because the schema requires items to conform to a certain schema and that is not the case here, such a value is invalid.

Input:

an array containing a mixture of null and boolean values:

[
  null,
  false,
  null,
  true,
  null
]

schema accepting only arrays of nulls:

{
  "items": {
    "type": [
      "null"
    ]
  },
  "type": [
    "array"
  ]
}

Output:

an invalid array violation:

  • JSON value path: $
    JSON schema path: #

    Invalid array:

    • JSON value path: $[1]
      JSON schema path: #/items/type

      Invalid type. Expected null but got boolean.

    • JSON value path: $[3]
      JSON schema path: #/items/type

      Invalid type. Expected null but got boolean.


finding JSON schema violations as a result of validating not a multiple of 2.5 against a schema accepting only multiples of 2.5

Because the schema accepts only multiples of 2.5, such a value is invalid.

Input:

not a multiple of 2.5:

7

a schema accepting only multiples of 2.5:

{
  "multipleOf": 2.5,
  "type": [
    "number"
  ]
}

Output:

an invalid multiple violation:

  • JSON value path: $
    JSON schema path: #/multipleOf

    7.0 is not a multiple of 2.5
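
Deciding "is a multiple of" with binary floating point can misfire for decimal factors (0.3 % 0.1 is not 0.0 in floats), so one defensible approach is an exact check over rationals; this is a sketch, not necessarily the arithmetic the tool uses:

```python
from fractions import Fraction

def is_multiple_of(value, factor):
    # Going through str() keeps decimal literals such as 2.5 exact
    # instead of inheriting binary floating-point rounding.
    return Fraction(str(value)) % Fraction(str(factor)) == 0
```

is_multiple_of(7.5, 2.5) is True, while is_multiple_of(7, 2.5) is False since 7 = 2 * 2.5 + 2.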


finding JSON schema violations as a result of validating number below the schema's maximum allowed values boundary against a schema with a maximum exclusive allowed value set

Because the value is less than the maximum value constraint, such a value is valid.

Input:

number below the schema's maximum allowed values boundary:

3

a schema with a maximum exclusive allowed value set:

{
  "exclusiveMaximum": 4
}

Output:

no violations:


finding JSON schema violations as a result of validating number at the schema's minimum allowed values boundary against a schema with a minimum exclusive allowed value set

Because the minimum value constraint is exclusive, such a value is invalid.

Input:

number at the schema's minimum allowed values boundary:

4

a schema with a minimum exclusive allowed value set:

{
  "exclusiveMinimum": 4
}

Output:

an invalid range violation:

  • JSON value path: $
    JSON schema path: #/exclusiveMinimum

    4.0 is outside of the valid range of (4.0,Infinity)


finding JSON schema violations as a result of validating number at the schema's maximum allowed values boundary against a schema with a maximum inclusive allowed value set

Because the maximum value constraint is inclusive, such a value is valid.

Input:

number at the schema's maximum allowed values boundary:

4

a schema with a maximum inclusive allowed value set:

{
  "maximum": 4
}

Output:

no violations:


finding JSON schema violations as a result of validating number at the schema's minimum allowed values boundary against a schema with a minimum inclusive allowed value set

Because the minimum value constraint is inclusive, such a value is valid.

Input:

number at the schema's minimum allowed values boundary:

4

a schema with a minimum inclusive allowed value set:

{
  "minimum": 4
}

Output:

no violations:


finding JSON schema violations as a result of validating number exceeding the schema's maximum allowed values boundary against a schema with a maximum inclusive allowed value set

Because the value is greater than the inclusive maximum value constraint, such a value is invalid.

Input:

number exceeding the schema's maximum allowed values boundary:

5

a schema with a maximum inclusive allowed value set:

{
  "maximum": 4
}

Output:

an invalid range violation:

  • JSON value path: $
    JSON schema path: #/maximum

    5.0 is outside of the valid range of (-Infinity,4.0]


finding JSON schema violations as a result of validating number exceeding the schema's minimum allowed values boundary against a schema with a minimum exclusive allowed value set

Because the value is greater than the exclusive minimum value constraint, such a value is valid.

Input:

number exceeding the schema's minimum allowed values boundary:

5

a schema with a minimum exclusive allowed value set:

{
  "exclusiveMinimum": 4
}

Output:

no violations:


finding JSON schema violations as a result of validating number below the schema's minimum allowed values boundary against a schema with a minimum inclusive allowed value set

Because the value is less than the inclusive minimum value constraint, such a value is invalid.

Input:

number below the schema's minimum allowed values boundary:

3

a schema with a minimum inclusive allowed value set:

{
  "minimum": 4
}

Output:

an invalid range violation:

  • JSON value path: $
    JSON schema path: #/minimum

    3.0 is outside of the valid range of [4.0,Infinity)

Printing a JSON value representing a JSON schema

Schema

flowchart LR
    subgraph inputs
        input_desc_0["JSON schema"]
    end
    subgraph output
        output_desc["a JSON value"]
    end
    inputs --> output

Context

Properties

  • always prints a well-formatted schema JSON

Producing a JSON schema or an error as a result of parsing JSON value

Schema

flowchart LR
    subgraph inputs
        input_desc_0["JSON value"]
    end
    subgraph output
        output_desc["a JSON schema or an error"]
    end
    inputs --> output

Context

JSON schema is commonly expressed in a JSON format. However, not every JSON is a valid JSON schema.
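
The acceptance rule these examples exercise — only booleans and objects represent schemata, and a single-string 'type' is normalised to a one-element array — can be sketched as follows (a hypothetical function; the error message is the one shown in the non-boolean, non-object example of this section):

```python
def parse_schema(value):
    """Return (schema, error); exactly one of the two is None."""
    if isinstance(value, bool):
        return value, None
    if isinstance(value, dict):
        schema = dict(value)
        # a single-string 'type' is normalised to a one-element array,
        # as in the {"type": "null"} parsing example
        if isinstance(schema.get("type"), str):
            schema["type"] = [schema["type"]]
        return schema, None
    return None, "the JSON value is neither a boolean nor an object"
```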

Examples


producing a JSON schema or an error as a result of parsing a boolean JSON value

Because a boolean value of false is a valid schema (one which rejects any JSON value), such a value represents a JSON schema.

Input:

a boolean JSON value:

false

Output:

a successfully parsed JSON schema:

false

producing a JSON schema or an error as a result of parsing A JSON object with 'type' property set to an array with a single 'null' string inside it

Because the 'type' keyword defines acceptable JSON types and can take the form of an array of strings (here, with only one type defined), such a value represents a JSON schema.

Input:

A JSON object with 'type' property set to an array with a single 'null' string inside it:

{
  "type": [
    "null"
  ]
}

Output:

a successfully parsed JSON schema:

{
  "type": [
    "null"
  ]
}

producing a JSON schema or an error as a result of parsing A JSON object with 'type' property defined

Because the 'type' keyword defines acceptable JSON types and can take the form of a single string, such a value represents a JSON schema.

Input:

A JSON object with 'type' property defined:

{
  "type": "null"
}

Output:

a successfully parsed JSON schema:

{
  "type": [
    "null"
  ]
}

producing a JSON schema or an error as a result of parsing A JSON object with 'required' property being array of strings

Because the 'required' constraint rejects any JSON object not containing the properties it lists, such a value represents a JSON schema.

Input:

A JSON object with 'required' property being array of strings:

{
  "required": [
    "prop1",
    "prop2"
  ]
}

Output:

a successfully parsed JSON schema:

{
  "required": [
    "prop1",
    "prop2"
  ]
}

producing a JSON schema or an error as a result of parsing A JSON object with 'type' property set to an array with 'array', 'null' and 'string' strings inside it

Because the 'type' keyword defines acceptable JSON types and can take the form of an array of strings (here, with three types defined), such a value represents a JSON schema.

Input:

A JSON object with 'type' property set to an array with 'array', 'null' and 'string' strings inside it:

{
  "type": [
    "array",
    "null",
    "string"
  ]
}

Output:

a successfully parsed JSON schema:

{
  "type": [
    "array",
    "null",
    "string"
  ]
}

producing a JSON schema or an error as a result of parsing A JSON object with 'uniqueItems' property set

Because the 'uniqueItems' keyword makes sure that if a JSON value is an array, its items do not contain any duplicates, such a value represents a JSON schema.

Input:

A JSON object with 'uniqueItems' property set:

{
  "uniqueItems": true
}

Output:

a successfully parsed JSON schema:

{
  "uniqueItems": true
}

producing a JSON schema or an error as a result of parsing A JSON object with 'type' property set to an empty array

Because the 'type' keyword defines acceptable JSON types and can take the form of an array of strings (here, with no types defined), such a value represents a JSON schema.

Input:

A JSON object with 'type' property set to an empty array:

{
  "type": []
}

Output:

a successfully parsed JSON schema:

{
  "type": []
}

producing a JSON schema or an error as a result of parsing a JSON value not being a boolean or object

Because booleans and objects are the only acceptable forms, such a value does not represent a JSON schema.

Input:

a JSON value not being a boolean or object:

0

Output:

a parsing error:

an error:

"the JSON value is neither a boolean nor an object"


producing a JSON schema or an error as a result of parsing a JSON object with the 'not' property defined

Because the 'not' constraint rejects any JSON value which conforms to the schema it defines, such a value represents a JSON schema.

Input:

a JSON object with the 'not' property defined:

{
  "not": true
}

Output:

a successfully parsed JSON schema:

{
  "not": true
}

producing a JSON schema or an error as a result of parsing a JSON object with the 'items' property defined

Because the 'items' constraint makes sure that if a JSON value is an array, every item of that array conforms to the schema it defines, such a value represents a JSON schema.

Input:

a JSON object with the 'items' property defined:

{
  "items": true
}

Output:

a successfully parsed JSON schema:

{
  "items": true
}

producing a JSON schema or an error as a result of parsing an empty JSON object

Because an empty JSON object is a valid schema which passes validation of any JSON value, such a value represents a JSON schema.

Input:

an empty JSON object:

{}

Output:

a successfully parsed JSON schema:

{}