Specialized Node.js library for memory-efficient operations on JSON arrays. Stream individual elements from large JSON arrays (files, network responses, etc.) and append elements to array files without loading the entire array into memory. Ideal for processing large JSON array datasets without hitting memory limits.
- 🎯 Specialized: Purpose-built for JSON arrays
- 💾 Memory Efficient: Process arrays of any size without loading them entirely
- ⚡ High Performance: Optimized streaming and batch operations
- ✍️ Direct Updates: Append elements without rewriting the entire file
- 🔄 Format Agnostic: Works with any valid JSON array structure
npm install jsonarrayfs
Process a large JSON array file (e.g., application logs) without loading it into memory:
import { JsonArrayStream } from "jsonarrayfs";
import { createReadStream } from "node:fs";
// Analyze logs: Count errors and slow responses
const fileStream = createReadStream("app.log.json");
const arrayStream = new JsonArrayStream("utf8");
let errorCount = 0;
let slowResponses = 0;
for await (const log of fileStream.pipe(arrayStream)) {
  if (log !== JsonArrayStream.NULL) {
    if (log.level === "error") errorCount++;
    if (log.responseTime > 1000) slowResponses++;
  }
}

console.log(
  `Analysis complete: Found ${errorCount} errors, ${slowResponses} slow responses`,
);
Process a JSON array from an API response:
import { JsonArrayStream } from "jsonarrayfs";
import { get } from "node:https";
get("https://api.example.com/json-array-data", (res) => {
  const arrayStream = new JsonArrayStream("utf8");
  res.pipe(arrayStream).on("data", (item) => {
    console.log("Got item:", item === JsonArrayStream.NULL ? null : item);
  });
});
Append new elements to an existing JSON array file:
import { appendToJsonArrayFile } from "jsonarrayfs";
// Append new log entries
const newLogs = [
  {
    timestamp: Date.now(),
    level: "info",
    message: "User login successful",
    responseTime: 245,
  },
  {
    timestamp: Date.now(),
    level: "info",
    message: "User login successful",
    responseTime: 1245,
  },
  null,
  {
    timestamp: Date.now(),
    level: "error",
    message: "Database connection timeout",
    responseTime: 1532,
  },
];
await appendToJsonArrayFile("app.log.json", "utf8", ...newLogs);
A transform stream that parses JSON array elements one by one for efficient processing. When processing arrays containing `null` values, it uses a special sentinel value (`JsonArrayStream.NULL`) to distinguish between JSON `null` and stream EOF.
new JsonArrayStream(encoding?: string)

- `encoding` (string, optional): Content encoding (default: `'utf8'`)
- `JsonArrayStream.NULL`: Special sentinel value to distinguish between JSON `null` and stream EOF

Events:

- `data`: Emitted for each array element
- `error`: Emitted when parsing fails or input is invalid
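The sentinel idea itself can be sketched without the library: a unique module-level value (a `Symbol` here, standing in for the real `JsonArrayStream.NULL`) marks parsed JSON `null`s so that a plain `null` stays free to mean end-of-stream:

```javascript
// Stand-in sentinel; the real library exposes JsonArrayStream.NULL for this.
const NULL_SENTINEL = Symbol("json-null");

// Map parsed elements the way the stream would emit them:
// JSON null becomes the sentinel, everything else passes through.
function toEmitted(parsed) {
  return parsed === null ? NULL_SENTINEL : parsed;
}

const emitted = [1, null, "a"].map(toEmitted);

// Consumers compare against the sentinel to recover the original nulls.
const restored = emitted.map((e) => (e === NULL_SENTINEL ? null : e));
console.log(restored); // [ 1, null, 'a' ]
```

A `Symbol` works well for such a sentinel because no JSON value can ever compare equal to it.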
Appends elements to a JSON array file efficiently without loading the entire file into memory.
async function appendToJsonArrayFile(
  filePath: string,
  encoding?: string,
  ...elements: any[]
): Promise<void>;
- `filePath` (string): Path to the JSON array file
- `encoding` (string, optional): File encoding (default: `'utf8'`)
- `...elements` (any[]): Elements to append to the array
Returns a Promise that resolves when the append operation is complete.
The library can throw these errors:
- Invalid JSON array format
- Malformed array elements
- Unexpected end of input
- File system errors
- Permission issues
- Invalid input elements
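In practice these surface either as `error` events on the stream or as rejected Promises from `appendToJsonArrayFile`. A minimal sketch of the Promise side, using `node:fs/promises` as a stand-in so it runs without the library installed:

```javascript
import { appendFile } from "node:fs/promises";

// appendToJsonArrayFile rejects its Promise on failure (file system
// errors, permission issues, invalid input). The same try/catch shape
// applies; node:fs/promises stands in here as a runnable placeholder.
async function safeAppend(path, data) {
  try {
    await appendFile(path, data);
    return "ok";
  } catch (err) {
    console.error(`Append failed: ${err.code}`);
    return err.code;
  }
}

// Writing into a directory that does not exist fails with ENOENT.
const result = await safeAppend("/nonexistent-dir/app.log.json", "{}");
```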
- Node.js >= 16.0.0
- Input file must be a valid JSON array
MIT License - see the LICENSE file for details