xp-forge / json
Reads and writes JSON to and from various input sources
Requires
- php: >=7.4.0
- xp-framework/core: ^12.0 | ^11.0 | ^10.0
Requires (Dev)
- xp-framework/test: ^2.0 | ^1.0
README
Reads and writes JSON to and from various input sources.
Examples
Reading can be done from a string, file or stream:
```php
use text\json\Json;
use io\File;
use peer\SocketInputStream;

// Strings
$value= Json::read('"Test"');

// Input
$in= '{"Hello": "World"}';
$in= new File('input.json');
$in= new SocketInputStream(/* ... */);
$value= Json::read($in);
```
Writing can be done to a string, file or stream:
```php
use text\json\Json;
use io\File;
use peer\SocketOutputStream;

// Strings
$json= Json::of('Test');

// Output
$out= new File('output.json');
$out= new SocketOutputStream(/* ... */);
Json::write($value, $out);
```
Formatted output
To change the output format, use one of the Output implementations and pass a Format instance to the output's constructor. The formats available are:
- DenseFormat($options): Best for network I/O, no insignificant whitespace. This is the default if nothing is given and is accessible via Format::dense($options= ~Format::ESCAPE_SLASHES).
- WrappedFormat($indent, $options): Wraps first-level arrays and all objects, uses whitespace after commas and colons. An instance of this format using 4 spaces for indentation and per default leaving forward slashes unescaped is available via Format::wrapped($indent= "    ", $options= ~Format::ESCAPE_SLASHES).
The available options that can be or'ed together are:
- Format::ESCAPE_SLASHES: Escape forward slashes with "\/" - default behavior.
- Format::ESCAPE_UNICODE: Escape unicode with "\uXXXX" - default behavior.
- Format::ESCAPE_ENTITIES: Escape the XML entities &, ", < and >. Per default, these are represented in their literal form.
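Since the options can be or'ed together, several escaping behaviors can be combined in one format. The following is a minimal sketch, assuming a writable target file (entities.json is a hypothetical name); the option constants and the Format::dense() and FileOutput APIs are those documented above:

```php
use text\json\{FileOutput, Format};

// Combine options by or'ing the flags together: in addition to the default
// slash escaping, also escape unicode and the XML entities &, ", < and >.
$options= Format::ESCAPE_SLASHES | Format::ESCAPE_UNICODE | Format::ESCAPE_ENTITIES;

$out= new FileOutput('entities.json', Format::dense($options));
$out->write(['link' => '<a href="http://example.com/">Käse</a>']);
```

A complete example using the wrapped format to write a composer-style file: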
```php
use text\json\{FileOutput, Format};

$out= new FileOutput('glue.json', Format::wrapped());
$out->write([
  'name'    => 'example/package',
  'version' => '1.0.0',
  'require' => [
    'xp-forge/json'     => '^3.0',
    'xp-framework/core' => '^10.0'
  ]
]);
```
The above code will yield the following output:
{ "name": "example/package", "version": "1.0.0'", "require": { "xp-forge/json": "^3.0", "xp-framework/core": "^10.0" } }
Sequential processing
Processing elements sequentially can save memory and yield better performance in certain situations.
Reading
You can use the elements() method to receive an iterator over a JSON array. Instead of loading the entire source into memory and then returning the parsed array, it will parse one array element at a time, yielding them as it goes.
```php
use peer\http\HttpConnection;
use text\json\StreamInput;

$conn= new HttpConnection(...);
$in= new StreamInput($conn->get('/search?q=example&limit=1000')->in());
foreach ($in->elements() as $element) {
  // Process
}
```
If you get a huge object, you can also process it sequentially using the pairs() method. This will parse a single key/value pair at a time.
```php
use peer\http\HttpConnection;
use text\json\StreamInput;

$conn= new HttpConnection(...);
$in= new StreamInput($conn->get('/resources/4711?expand=*')->in());
foreach ($in->pairs() as $key => $value) {
  // Process
}
```
To detect the type of the data on the stream (again, without reading it completely), you can use the type() method.
```php
use peer\http\HttpConnection;
use text\json\StreamInput;

$conn= new HttpConnection(...);
$in= new StreamInput($conn->get($resource)->in());

$type= $in->type();
if ($type->isArray()) {
  // Handle arrays
} else if ($type->isObject()) {
  // Handle objects
} else {
  // Handle primitives
}
```
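Putting the pieces together, the following is a minimal sketch that dispatches on type() to the sequential readers shown above. It mirrors the examples in this section and assumes the stream can still be iterated after type() has peeked at it, in line with the "without reading it completely" note:

```php
use peer\http\HttpConnection;
use text\json\StreamInput;

$conn= new HttpConnection(...);
$in= new StreamInput($conn->get($resource)->in());

$type= $in->type();
if ($type->isArray()) {
  foreach ($in->elements() as $element) {
    // Process one array element at a time
  }
} else if ($type->isObject()) {
  foreach ($in->pairs() as $key => $value) {
    // Process one key/value pair at a time
  }
} else {
  // Handle primitives
}
```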
Writing
To write data sequentially, you can use the begin() method and the stream it returns. This makes sense when the source offers a way to read data sequentially; if you already have the entire data in memory, using write() has the same effect.
```php
use text\json\{StreamOutput, Types};

$query= $conn->query('select * from person');

$stream= (new StreamOutput(...))->begin(Types::$ARRAY);
while ($record= $query->next()) {
  $stream->element($record);
}
$stream->close();
```
As the Stream class implements the Closeable interface, it can be used in the with statement:
```php
use text\json\{StreamOutput, Types};

$query= $conn->query('select * from person');

with ((new StreamOutput(...))->begin(Types::$ARRAY), function($stream) use($query) {
  while ($record= $query->next()) {
    $stream->element($record);
  }
});
```
Further reading
- Performance figures. TL;DR: While slower than the native functionality, the overhead is in the low millisecond range. With sequential processing, this library has an advantage both performance- and memory-wise.
- Parsing JSON is a Minefield. This library runs this test suite next to its own.