cerbero / json-parser
Zero-dependencies pull parser to read large JSON from any source in a memory-efficient way.
Requires
- php: ^8.1
Requires (Dev)
- guzzlehttp/guzzle: ^7.2
- illuminate/http: >=6.20
- mockery/mockery: ^1.5
- pestphp/pest: ^2.0
- phpstan/phpstan: ^1.9
- scrutinizer/ocular: ^1.8
- squizlabs/php_codesniffer: ^3.0
Suggests
- guzzlehttp/guzzle: Required to load JSON from endpoints (^7.2).
README
📦 Install
Via Composer:
composer require cerbero/json-parser
🔮 Usage
👣 Basics
JSON Parser provides a minimal API to read large JSON from any source:
// a source is anything that can provide a JSON, in this case an endpoint
$source = 'https://randomuser.me/api/1.4?seed=json-parser&results=5';

foreach (new JsonParser($source) as $key => $value) {
    // instead of loading the whole JSON, we keep in memory only one key and value at a time
}
Depending on our code style, we can instantiate the parser in 3 different ways:
use Cerbero\JsonParser\JsonParser;
use function Cerbero\JsonParser\parseJson;

// classic object instantiation
new JsonParser($source);

// static instantiation
JsonParser::parse($source);

// namespaced function
parseJson($source);
If we don't want to use foreach() to loop through each key and value, we can chain the traverse() method:
JsonParser::parse($source)->traverse(function (mixed $value, string|int $key, JsonParser $parser) {
    // lazily load one key and value at a time, we can also access the parser if needed
});
// no foreach needed
⚠️ Please note the parameter order of the callback: the value is passed before the key.
💧 Sources
A JSON source is any data point that provides a JSON. A wide range of sources is supported by default; a brief example follows the list:
- strings, e.g. {"foo":"bar"}
- iterables, i.e. arrays or instances of Traversable
- file paths, e.g. /path/to/large.json
- resources, e.g. streams
- API endpoint URLs, e.g. https://endpoint.json or any instance of Psr\Http\Message\UriInterface
- PSR-7 requests, i.e. any instance of Psr\Http\Message\RequestInterface
- PSR-7 messages, i.e. any instance of Psr\Http\Message\MessageInterface
- PSR-7 streams, i.e. any instance of Psr\Http\Message\StreamInterface
- Laravel HTTP client requests, i.e. any instance of Illuminate\Http\Client\Request
- Laravel HTTP client responses, i.e. any instance of Illuminate\Http\Client\Response
- user-defined sources, i.e. any instance of Cerbero\JsonParser\Sources\Source
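For example, any of the following values can be passed straight to the parser (the endpoint is the sample one used above; the file path is purely illustrative):

use Cerbero\JsonParser\JsonParser;

// a JSON string
JsonParser::parse('{"foo":"bar"}');

// a file path (illustrative)
JsonParser::parse('/path/to/large.json');

// a stream resource
JsonParser::parse(fopen('/path/to/large.json', 'rb'));

// an API endpoint URL
JsonParser::parse('https://randomuser.me/api/1.4?seed=json-parser&results=5');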
If the source we need to parse is not supported by default, we can implement our own custom source.
Click here to see how to implement a custom source.
To implement a custom source, we need to extend Source and implement 3 methods:
use Cerbero\JsonParser\Sources\Source;
use Traversable;

class CustomSource extends Source
{
    public function getIterator(): Traversable
    {
        // return a Traversable holding the JSON source, e.g. a Generator yielding chunks of JSON
    }

    public function matches(): bool
    {
        // return TRUE if this class can handle the JSON source
    }

    protected function calculateSize(): ?int
    {
        // return the size of the JSON in bytes or NULL if it can't be calculated
    }
}
The parent class Source gives us access to 2 properties:
- $source: the JSON source we pass to the parser, i.e. new JsonParser($source)
- $config: the configuration we set by chaining methods like $parser->pointer('/foo')

The method getIterator() defines the logic to read the JSON source in a memory-efficient way. It feeds the parser with small pieces of JSON. Please refer to the already existing sources to see some implementations.

The method matches() determines whether the JSON source passed to the parser can be handled by our custom implementation. In other words, we are telling the parser if it should use our class for the JSON to parse.

Finally, calculateSize() computes the whole size of the JSON source. It's used to track the parsing progress; however, it's not always possible to know the size of a JSON source. In that case, or if we don't need to track the progress, we can return null.
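As a concrete illustration (a hypothetical source, not shipped with the package), a custom source that streams a gzip-compressed JSON file might look like the following sketch, assuming the zlib extension is available; the class name and chunk size are arbitrary:

use Cerbero\JsonParser\Sources\Source;
use Traversable;

// hypothetical custom source: stream a gzip-compressed JSON file in chunks
class GzipFileSource extends Source
{
    public function getIterator(): Traversable
    {
        $handle = gzopen($this->source, 'rb');

        try {
            while (!gzeof($handle)) {
                // feed the parser with small pieces of JSON
                yield gzread($handle, 1024 * 8);
            }
        } finally {
            gzclose($handle);
        }
    }

    public function matches(): bool
    {
        return is_string($this->source) && str_ends_with($this->source, '.json.gz');
    }

    protected function calculateSize(): ?int
    {
        // the uncompressed size is not known upfront
        return null;
    }
}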
Now that we have implemented our custom source, we can pass it to the parser:
$json = JsonParser::parse(new CustomSource($source));

foreach ($json as $key => $value) {
    // process one key and value of $source at a time
}
If you find yourself implementing the same custom source in different projects, feel free to send a PR and we will consider supporting your custom source by default. Thank you in advance for any contribution!
🎯 Pointers
A JSON pointer is a standard used to point to nodes within a JSON. This package leverages JSON pointers to extract only some sub-trees from large JSONs.
Consider this JSON for example. To extract only the first gender and avoid parsing the rest of the JSON, we can set the /results/0/gender pointer:
$json = JsonParser::parse($source)->pointer('/results/0/gender');

foreach ($json as $key => $value) {
    // 1st and only iteration: $key === 'gender', $value === 'female'
}
JSON Parser takes advantage of the - wildcard to point to any array index, so we can extract all the genders with the /results/-/gender pointer:
$json = JsonParser::parse($source)->pointer('/results/-/gender');

foreach ($json as $key => $value) {
    // 1st iteration: $key === 'gender', $value === 'female'
    // 2nd iteration: $key === 'gender', $value === 'female'
    // 3rd iteration: $key === 'gender', $value === 'male'
    // and so on for all the objects in the array...
}
If we want to extract more sub-trees, we can set multiple pointers. Let's extract all genders and countries:
$json = JsonParser::parse($source)->pointers(['/results/-/gender', '/results/-/location/country']);

foreach ($json as $key => $value) {
    // 1st iteration: $key === 'gender', $value === 'female'
    // 2nd iteration: $key === 'country', $value === 'Germany'
    // 3rd iteration: $key === 'gender', $value === 'female'
    // 4th iteration: $key === 'country', $value === 'Mexico'
    // and so on for all the objects in the array...
}
⚠️ Intersecting pointers like /foo and /foo/bar are not allowed, but intersecting wildcards like foo/-/bar and foo/0/bar are possible.
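For example, under the rule above these two pointers can coexist because they only intersect through the - wildcard, whereas setting both /results and /results/0/gender would not be allowed:

// allowed: these pointers only intersect through the - wildcard
$json = JsonParser::parse($source)->pointers(['/results/-/gender', '/results/0/gender']);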
We can also specify a callback to execute when JSON pointers are found. This is handy when we have different pointers and we need to run custom logic for each of them:
$json = JsonParser::parse($source)->pointers([
    '/results/-/gender' => fn (string $gender, string $key) => new Gender($gender),
    '/results/-/location/country' => fn (string $country, string $key) => new Country($country),
]);

foreach ($json as $key => $value) {
    // 1st iteration: $key === 'gender', $value instanceof Gender
    // 2nd iteration: $key === 'country', $value instanceof Country
    // and so on for all the objects in the array...
}
⚠️ Please note the parameter order of the callbacks: the value is passed before the key.
The same can also be achieved by chaining the method pointer() multiple times:
$json = JsonParser::parse($source)
    ->pointer('/results/-/gender', fn (string $gender, string $key) => new Gender($gender))
    ->pointer('/results/-/location/country', fn (string $country, string $key) => new Country($country));

foreach ($json as $key => $value) {
    // 1st iteration: $key === 'gender', $value instanceof Gender
    // 2nd iteration: $key === 'country', $value instanceof Country
    // and so on for all the objects in the array...
}
Pointer callbacks can also be used to customize a key. We can achieve that by updating the key reference:
$json = JsonParser::parse($source)->pointer('/results/-/name/first', function (string $name, string &$key) {
    $key = 'first_name';
});

foreach ($json as $key => $value) {
    // 1st iteration: $key === 'first_name', $value === 'Sara'
    // 2nd iteration: $key === 'first_name', $value === 'Andrea'
    // and so on for all the objects in the array...
}
If the callbacks are enough to handle the pointers and we don't need to run any common logic for all pointers, we can avoid manually calling foreach() by chaining the traverse() method:
JsonParser::parse($source)
    ->pointer('/-/gender', $this->handleGender(...))
    ->pointer('/-/location/country', $this->handleCountry(...))
    ->traverse();
// no foreach needed
Otherwise, if some common logic is needed for all pointers but we prefer method chaining to manual loops, we can pass a callback to the traverse() method:
JsonParser::parse($source)
    ->pointer('/results/-/gender', fn (string $gender, string $key) => new Gender($gender))
    ->pointer('/results/-/location/country', fn (string $country, string $key) => new Country($country))
    ->traverse(function (Gender|Country $value, string $key, JsonParser $parser) {
        // 1st iteration: $key === 'gender', $value instanceof Gender
        // 2nd iteration: $key === 'country', $value instanceof Country
        // and so on for all the objects in the array...
    });
// no foreach needed
⚠️ Please note the parameter order of the callbacks: the value is passed before the key.
Sometimes the sub-trees extracted by pointers are small enough to be kept entirely in memory. In such cases, we can chain toArray() to eager load the extracted sub-trees into an array:
// ['gender' => 'female', 'country' => 'Germany']
$array = JsonParser::parse($source)
    ->pointers(['/results/0/gender', '/results/0/location/country'])
    ->toArray();
🐼 Lazy pointers
JSON Parser only keeps one key and one value in memory at a time. However, if the value is a large array or object, it may be inefficient or even impossible to keep it all in memory.
To solve this problem, we can use lazy pointers. These pointers recursively keep in memory only one key and one value at a time for any nested array or object.
$json = JsonParser::parse($source)->lazyPointer('/results/0/name');

foreach ($json as $key => $value) {
    // 1st iteration: $key === 'name', $value instanceof Parser
}
Lazy pointers return a lightweight instance of Cerbero\JsonParser\Tokens\Parser instead of the actual large value. To lazy load nested keys and values, we can then loop through the parser:
$json = JsonParser::parse($source)->lazyPointer('/results/0/name');

foreach ($json as $key => $value) {
    // 1st iteration: $key === 'name', $value instanceof Parser
    foreach ($value as $nestedKey => $nestedValue) {
        // 1st iteration: $nestedKey === 'title', $nestedValue === 'Mrs'
        // 2nd iteration: $nestedKey === 'first', $nestedValue === 'Sara'
        // 3rd iteration: $nestedKey === 'last', $nestedValue === 'Meder'
    }
}
As mentioned above, lazy pointers are recursive. This means that no nested objects or arrays will ever be kept in memory:
$json = JsonParser::parse($source)->lazyPointer('/results/0/location');

foreach ($json as $key => $value) {
    // 1st iteration: $key === 'location', $value instanceof Parser
    foreach ($value as $nestedKey => $nestedValue) {
        // 1st iteration: $nestedKey === 'street', $nestedValue instanceof Parser
        // 2nd iteration: $nestedKey === 'city', $nestedValue === 'Sontra'
        // ...
        // 6th iteration: $nestedKey === 'coordinates', $nestedValue instanceof Parser
        // 7th iteration: $nestedKey === 'timezone', $nestedValue instanceof Parser
    }
}
To lazily parse the entire JSON, we can simply chain the lazy() method:
foreach (JsonParser::parse($source)->lazy() as $key => $value) {
    // 1st iteration: $key === 'results', $value instanceof Parser
    // 2nd iteration: $key === 'info', $value instanceof Parser
}
We can recursively wrap any instance of Cerbero\JsonParser\Tokens\Parser by chaining wrap(). This lets us wrap lazy loaded JSON arrays and objects into classes with advanced functionalities, like mapping or filtering:
$json = JsonParser::parse($source)
    ->wrap(fn (Parser $parser) => new MyWrapper(fn () => yield from $parser))
    ->lazy();

foreach ($json as $key => $value) {
    // 1st iteration: $key === 'results', $value instanceof MyWrapper
    foreach ($value as $nestedKey => $nestedValue) {
        // 1st iteration: $nestedKey === 0, $nestedValue instanceof MyWrapper
        // 2nd iteration: $nestedKey === 1, $nestedValue instanceof MyWrapper
        // ...
    }
}
ℹ️ If your wrapper class implements the method toArray(), that method will be called when eager loading sub-trees into an array.
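MyWrapper above is not part of the package; a minimal sketch of such a hypothetical wrapper, which simply iterates the lazily parsed values and optionally supports toArray(), might look like this:

use Closure;
use IteratorAggregate;
use Traversable;

// hypothetical wrapper, not shipped with the package
class MyWrapper implements IteratorAggregate
{
    public function __construct(private Closure $generator) {}

    public function getIterator(): Traversable
    {
        // lazily iterate the wrapped keys and values
        yield from ($this->generator)();
    }

    // optional: called when eager loading sub-trees into an array
    // (simplistic: nested wrappers are not unwrapped recursively here)
    public function toArray(): array
    {
        return iterator_to_array($this->getIterator());
    }
}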
Lazy pointers also have all the other functionalities of normal pointers: they accept callbacks, can be set one by one or all together, can be eager loaded into an array and can be mixed with normal pointers as well:
// set a custom callback to run only when names are found
$json = JsonParser::parse($source)->lazyPointer('/results/-/name', fn (Parser $name) => $this->handleName($name));

// set multiple lazy pointers one by one
$json = JsonParser::parse($source)
    ->lazyPointer('/results/-/name', fn (Parser $name) => $this->handleName($name))
    ->lazyPointer('/results/-/location', fn (Parser $location) => $this->handleLocation($location));

// set multiple lazy pointers all together
$json = JsonParser::parse($source)->lazyPointers([
    '/results/-/name' => fn (Parser $name) => $this->handleName($name),
    '/results/-/location' => fn (Parser $location) => $this->handleLocation($location),
]);

// eager load lazy pointers into an array
// ['name' => ['title' => 'Mrs', 'first' => 'Sara', 'last' => 'Meder'], 'street' => ['number' => 46, 'name' => 'Römerstraße']]
$array = JsonParser::parse($source)->lazyPointers(['/results/0/name', '/results/0/location/street'])->toArray();

// mix pointers and lazy pointers
$json = JsonParser::parse($source)
    ->pointer('/results/-/gender', fn (string $gender) => $this->handleGender($gender))
    ->lazyPointer('/results/-/name', fn (Parser $name) => $this->handleName($name));
⚙️ Decoders
By default JSON Parser uses the built-in PHP function json_decode() to decode one key and value at a time.
Normally it decodes values to associative arrays, but if we prefer to decode values to objects, we can set a custom decoder:
use Cerbero\JsonParser\Decoders\JsonDecoder;

JsonParser::parse($source)->decoder(new JsonDecoder(decodesToArray: false));
The simdjson extension offers a decoder faster than json_decode() and can be installed via pecl install simdjson if your server satisfies the requirements. JSON Parser leverages the simdjson decoder by default if the extension is loaded.
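If we prefer not to rely on the automatic choice, we can also set the simdjson-based decoder explicitly; assuming the class is named Cerbero\JsonParser\Decoders\SimdjsonDecoder (please check the package's Decoders namespace), that would look like:

use Cerbero\JsonParser\Decoders\SimdjsonDecoder;

// assumed class name: force the simdjson-based decoder instead of the default pick
JsonParser::parse($source)->decoder(new SimdjsonDecoder());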
If we need a decoder that is not supported by default, we can implement our custom one.
Click here to see how to implement a custom decoder.
To create a custom decoder, we need to implement the Decoder interface and its single method:
use Cerbero\JsonParser\Decoders\Decoder;
use Cerbero\JsonParser\Decoders\DecodedValue;

class CustomDecoder implements Decoder
{
    public function decode(string $json): DecodedValue
    {
        // return an instance of DecodedValue both in case of success and failure
    }
}
The method decode() defines the logic to decode the given JSON value, and it needs to return an instance of DecodedValue in both success and failure cases.
To make custom decoder implementations even easier, JSON Parser provides an abstract decoder that hydrates DecodedValue for us, so that we just need to define how a JSON value should be decoded:
use Cerbero\JsonParser\Decoders\AbstractDecoder;

class CustomDecoder extends AbstractDecoder
{
    protected function decodeJson(string $json): mixed
    {
        // decode the given JSON or throw an exception on failure
        return json_decode($json, flags: JSON_THROW_ON_ERROR);
    }
}
⚠️ Please make sure to throw an exception in decodeJson() if the decoding process fails.
Now that we have implemented our custom decoder, we can set it like this:
JsonParser::parse($source)->decoder(new CustomDecoder());
To see some implementation examples, please refer to the already existing decoders.
If you find yourself implementing the same custom decoder in different projects, feel free to send a PR and we will consider supporting your custom decoder by default. Thank you in advance for any contribution!
💢 Error handling
Not all JSONs are valid: some may present syntax errors due to an incorrect structure (e.g. [}) or decoding errors when values can't be decoded properly (e.g. [1a]). JSON Parser allows us to intervene and define the logic to run when these issues occur:
use Cerbero\JsonParser\Decoders\DecodedValue;
use Cerbero\JsonParser\Exceptions\SyntaxException;

$json = JsonParser::parse($source)
    ->onSyntaxError(fn (SyntaxException $e) => $this->handleSyntaxError($e))
    ->onDecodingError(fn (DecodedValue $decoded) => $this->handleDecodingError($decoded));
We can even replace invalid values with placeholders so that they don't make the entire JSON parsing fail:
// instead of failing, replace invalid values with NULL
$json = JsonParser::parse($source)->patchDecodingError();

// instead of failing, replace invalid values with '<invalid>'
$json = JsonParser::parse($source)->patchDecodingError('<invalid>');
For more advanced patching of decoding errors, we can pass a closure that has access to the DecodedValue instance:
use Cerbero\JsonParser\Decoders\DecodedValue;

$patches = ['1a' => 1, '2b' => 2];

$json = JsonParser::parse($source)
    ->patchDecodingError(fn (DecodedValue $decoded) => $patches[$decoded->json] ?? null);
Any exception thrown by this package implements the JsonParserException interface. This makes it easy to handle all exceptions in a single catch block:
use Cerbero\JsonParser\Exceptions\JsonParserException;

try {
    JsonParser::parse($source)->traverse();
} catch (JsonParserException) {
    // handle any exception thrown by JSON Parser
}
For reference, here is a comprehensive table of all the exceptions thrown by this package:
⏳ Progress
When processing large JSONs, it can be helpful to track the parsing progress. JSON Parser provides convenient methods for accessing all the progress details:
$json = new JsonParser($source);

$json->progress(); // <Cerbero\JsonParser\ValueObjects\Progress>
$json->progress()->current(); // the already parsed bytes e.g. 86759341
$json->progress()->total(); // the total bytes to parse e.g. 182332642
$json->progress()->fraction(); // the completed fraction e.g. 0.47583
$json->progress()->percentage(); // the completed percentage e.g. 47.583
$json->progress()->format(); // the formatted progress e.g. 47.5%
The total size of a JSON is calculated differently depending on the source. In some cases, it may not be possible to determine the size of a JSON and only the current progress is known:
$json->progress()->current(); // 86759341
$json->progress()->total(); // null
$json->progress()->fraction(); // null
$json->progress()->percentage(); // null
$json->progress()->format(); // null
🛠 Settings
JSON Parser also provides other settings to fine-tune the parsing process. For example, we can set the number of bytes to read when parsing JSON strings or streams:
$json = JsonParser::parse($source)->bytes(1024 * 16); // read JSON chunks of 16KB
📆 Change log
Please see CHANGELOG for more information on what has changed recently.
🧪 Testing
composer test
💞 Contributing
Please see CONTRIBUTING and CODE_OF_CONDUCT for details.
🧯 Security
If you discover any security related issues, please email andrea.marco.sartori@gmail.com instead of using the issue tracker.
🏅 Credits
⚖️ License
The MIT License (MIT). Please see License File for more information.