JSON in Modern JavaScript: ES6+ Features and Best Practices (2026)
JSON is everywhere in modern JavaScript development — API responses, configuration files, localStorage, Web Workers, service workers, and inter-process communication. While JSON.parse() and JSON.stringify() are the entry points, the ES6+ language features that have shipped over the last decade fundamentally change how you work with parsed JSON objects. Optional chaining alone has eliminated entire classes of runtime errors. Understanding how these features compose with JSON patterns separates code that merely works from code that is readable, safe, and maintainable under production conditions. This guide covers every modern JavaScript feature that touches JSON, with concrete examples drawn from real-world API integration patterns.
Need to format or validate your JSON?
Use the free online JSON formatter and validator — paste any JSON and get instant feedback.
Open JSON Formatter →
JSON.stringify() and JSON.parse() Revisited
Every JavaScript developer knows the two-function core of the JSON API, but both functions accept powerful second and third arguments that most developers rarely use. Understanding the full signature unlocks capabilities that otherwise require post-processing loops.
JSON.stringify() with a Replacer
The second argument to JSON.stringify() is a replacer. It can be either an array of key names (a whitelist) or a function called for each key-value pair. The function form is especially powerful for transformation during serialization:
// Replacer as an array — only include these keys
const user = { id: 1, name: "Alice", password: "s3cr3t", role: "admin" };
JSON.stringify(user, ["id", "name", "role"]);
// '{"id":1,"name":"Alice","role":"admin"}' — password excluded
// Replacer as a function — transform values on the way out
const data = {
  createdAt: new Date("2026-04-05"),
  amount: 99.999,
  internal: undefined,
  name: "Widget"
};
JSON.stringify(data, function (key, value) {
  // Gotcha: Date.prototype.toJSON runs BEFORE the replacer, so `value` is
  // already an ISO string here. Check the original object via `this[key]`.
  if (this[key] instanceof Date) return value; // already an ISO string — pass through
  if (typeof value === "number") return Math.round(value * 100) / 100;
  if (value === undefined) return null; // preserve the key
  return value;
}, 2);
// {
// "createdAt": "2026-04-05T00:00:00.000Z",
// "amount": 100,
// "internal": null,
// "name": "Widget"
// }
The third argument controls indentation. Pass a number (that many spaces per level, capped at 10) or a string (used as the indentation unit, truncated to 10 characters). This is how you choose between human-readable pretty-printed output and minified output for network transmission.
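Objects can also define their own serialization via a toJSON() method, which JSON.stringify() invokes before the replacer ever sees the value. A minimal sketch with a hypothetical Money class (not from the examples above):

```javascript
class Money {
  constructor(cents) { this.cents = cents; }
  // JSON.stringify calls toJSON() first; any replacer then receives this string
  toJSON() { return (this.cents / 100).toFixed(2); }
}

const out = JSON.stringify({ price: new Money(1999) });
// out === '{"price":"19.99"}'
```

This is exactly why the Date check in the replacer above must go through `this[key]` — Date ships its own toJSON().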
JSON.parse() with a Reviver
The reviver function mirrors the replacer — it is called for each key-value pair as the JSON is parsed. The most common use case is restoring Date objects, which JSON serializes to ISO strings:
const ISO_DATE_RE = /^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}/;
function dateReviver(key, value) {
  if (typeof value === "string" && ISO_DATE_RE.test(value)) {
    return new Date(value);
  }
  return value;
}
const json = '{"name":"Order","createdAt":"2026-04-05T10:30:00.000Z","qty":5}';
const order = JSON.parse(json, dateReviver);
console.log(order.createdAt instanceof Date); // true
console.log(order.createdAt.getFullYear()); // 2026
Tip: The reviver is called bottom-up — leaf values first, then parent objects. The root object gets called last with an empty string key. Use this ordering when revivers depend on child values being transformed first.
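A quick way to see this ordering is to record every key the reviver visits (a small self-contained demo):

```javascript
const visited = [];
JSON.parse('{"items":[10,20],"total":30}', function (key, value) {
  visited.push(key); // leaf keys first, the root "" key last
  return value;
});
// visited → ["0", "1", "items", "total", ""]
```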
Destructuring JSON Responses
Object and array destructuring, introduced in ES6, are the cleanest way to extract data from API responses. They replace verbose const name = data.user.name chains with concise, self-documenting variable declarations.
Object Destructuring
// Basic destructuring from a parsed API response
const response = {
  status: "success",
  data: {
    user: { id: 42, name: "Alice", email: "alice@example.com" },
    permissions: ["read", "write"]
  },
  meta: { page: 1, total: 150 }
};
// Extract nested values in one statement
const {
  status,
  data: { user: { id, name, email }, permissions },
  meta: { page, total }
} = response;
console.log(name); // "Alice"
console.log(permissions); // ["read", "write"]
console.log(total); // 150
Renaming and Default Values
// Rename a key and provide a fallback for missing fields
const apiUser = { user_id: 99, display_name: "Bob" };
const {
  user_id: userId,                          // rename user_id → userId
  display_name: displayName,                // rename display_name → displayName
  avatar: avatarUrl = "/default-avatar.png" // default if missing
} = apiUser;
console.log(userId); // 99
console.log(avatarUrl); // "/default-avatar.png"
Array Destructuring
// Destructure paginated results
const { data: [firstItem, secondItem, ...rest] } = await fetchPage(1);
// Skip elements with commas
const [, second, , fourth] = [10, 20, 30, 40];
// Swap variables without a temp
let a = 1, b = 2;
[a, b] = [b, a];
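Destructuring also works directly in function parameters, which pairs naturally with JSON-shaped options objects. A sketch (formatUser is an illustrative name):

```javascript
// Rename, per-field defaults, and a whole-parameter default:
// the trailing `= {}` guards against calling with no argument at all
function formatUser({ display_name: name = "Guest", role = "user" } = {}) {
  return `${name} (${role})`;
}

formatUser({ display_name: "Bob" }); // "Bob (user)"
formatUser();                        // "Guest (user)"
```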
Optional Chaining for Safe JSON Access
The optional chaining operator (?.), standardized in ES2020, is arguably the single most impactful language feature for JSON handling. Before it existed, safely navigating a deeply nested API response required verbose guard expressions or libraries like Lodash's _.get().
// Before optional chaining — defensive chains are unreadable
const city = data
&& data.user
&& data.user.address
&& data.user.address.city;
// With optional chaining — clean and safe
const city = data?.user?.address?.city;
// Works with method calls too
const upper = data?.user?.name?.toUpperCase();
// Works with bracket notation for dynamic keys
const fieldName = "email";
const email = data?.user?.[fieldName];
// Works with function calls
const result = config?.transform?.(rawValue);
Optional chaining short-circuits at the first null or undefined and returns undefined — it never throws a TypeError. This matters enormously with external API data where any field could be absent due to API version differences, partial responses, or access control.
Common mistake: Optional chaining only short-circuits on null and undefined — not on other falsy values, and not on later unguarded links. If data.user is the number 0 or the string "", then data.user?.name does not throw; property access on a primitive simply yields undefined. What does throw is a partially guarded chain like data?.user.name when user is absent: only the link immediately after each ?. is protected.
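A short demonstration of both behaviors (self-contained sketch):

```javascript
const data = { user: 0 };

// 0 is not nullish, so ?. does not short-circuit; property access on the
// primitive 0 yields undefined — no throw
const userName = data.user?.name; // undefined

// The real hazard is a partially guarded chain: .name here is unprotected
let threw = false;
try {
  const n = ({}).user.name; // user is undefined → TypeError
} catch (e) {
  threw = e instanceof TypeError;
}
// threw === true
```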
Nullish Coalescing for Defaults
The nullish coalescing operator (??) provides a default value when an expression evaluates to null or undefined. It is the natural companion to optional chaining.
// ?? vs || — a critical difference
const config = { timeout: 0, retries: 0, label: "" };
// || uses falsy check — breaks for legitimate zero/empty values
config.timeout || 5000 // → 5000 WRONG: 0 is valid!
config.retries || 3 // → 3 WRONG: 0 retries is valid!
config.label || "none" // → "none" WRONG: empty string is valid!
// ?? uses nullish check — only replaces null/undefined
config.timeout ?? 5000 // → 0 CORRECT
config.retries ?? 3 // → 0 CORRECT
config.label ?? "none" // → "" CORRECT
// Combining with optional chaining
const timeout = response?.config?.timeout ?? 5000;
const userName = user?.profile?.displayName ?? user?.name ?? "Anonymous";
As a rule of thumb: use ?? when the field could legitimately be 0, false, or an empty string. Use || only when you actually want to treat all falsy values as missing — for example, when a non-empty string is always required.
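ES2021's logical nullish assignment (??=) applies the same rule in place, which is handy for filling gaps in a parsed config object:

```javascript
const config = JSON.parse('{"timeout":0,"label":""}');

config.retries ??= 3;    // retries was missing → assigned
config.timeout ??= 5000; // 0 is not nullish → kept as 0
config.label ??= "none"; // "" is not nullish → kept as ""
// config → { timeout: 0, label: "", retries: 3 }
```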
Fetching JSON with the Fetch API
The Fetch API is the standard way to retrieve JSON from HTTP endpoints in modern JavaScript. Despite its clean interface, there are several non-obvious behaviors that cause bugs in production code.
The Basic Pattern
// fetch() only rejects on network failure, not HTTP errors
// You MUST check response.ok
async function fetchJson(url) {
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`HTTP error ${response.status}: ${response.statusText}`);
  }
  return response.json(); // Returns a Promise — await it or return it
}
const users = await fetchJson("https://api.example.com/users");
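One more guard worth considering: a misconfigured server can return an HTML error page with a 200 status, in which case response.json() throws a SyntaxError that has nothing to do with your data. A sketch of a content-type check (looksLikeJson and fetchJsonStrict are illustrative names, not from this article):

```javascript
function looksLikeJson(response) {
  const contentType = response.headers.get("content-type") ?? "";
  return contentType.includes("application/json");
}

async function fetchJsonStrict(url) {
  const response = await fetch(url);
  if (!response.ok) throw new Error(`HTTP ${response.status}`);
  if (!looksLikeJson(response)) {
    throw new Error(`Expected JSON, got: ${response.headers.get("content-type")}`);
  }
  return response.json();
}
```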
AbortController for Timeouts
async function fetchWithTimeout(url, timeoutMs = 5000) {
  const controller = new AbortController();
  const timerId = setTimeout(() => controller.abort(), timeoutMs);
  try {
    const response = await fetch(url, { signal: controller.signal });
    if (!response.ok) throw new Error(`HTTP ${response.status}`);
    return await response.json();
  } catch (err) {
    if (err.name === "AbortError") {
      throw new Error(`Request timed out after ${timeoutMs}ms`);
    }
    throw err;
  } finally {
    clearTimeout(timerId);
  }
}
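Newer runtimes (roughly Node 17.3+ and current browsers — verify against your targets) ship AbortSignal.timeout(), which collapses the controller-plus-timer boilerplate into one line. Note that the rejection carries the name "TimeoutError" rather than "AbortError":

```javascript
async function fetchJsonQuick(url, timeoutMs = 5000) {
  // AbortSignal.timeout() returns a signal that aborts itself after timeoutMs
  const response = await fetch(url, { signal: AbortSignal.timeout(timeoutMs) });
  if (!response.ok) throw new Error(`HTTP ${response.status}`);
  return response.json();
}
```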
Sending JSON in a POST Request
async function postJson(url, payload) {
  const response = await fetch(url, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Accept": "application/json"
    },
    body: JSON.stringify(payload)
  });
  if (!response.ok) {
    const errorBody = await response.json().catch(() => ({}));
    throw Object.assign(new Error(`HTTP ${response.status}`), { body: errorBody });
  }
  // Handle 204 No Content — response.json() would throw
  if (response.status === 204) return null;
  return response.json();
}
Async/Await Patterns for JSON APIs
Async/await makes asynchronous JSON fetching as readable as synchronous code. But there are patterns that scale well and anti-patterns that introduce subtle bugs under concurrent usage.
Sequential vs. Parallel Requests
// SLOW: sequential — each awaits the previous
async function loadDashboardSlow(userId) {
  const user = await fetchJson(`/api/users/${userId}`);
  const orders = await fetchJson(`/api/orders?userId=${userId}`);
  const prefs = await fetchJson(`/api/prefs/${userId}`);
  return { user, orders, prefs };
}
// FAST: parallel — all requests in flight simultaneously
async function loadDashboardFast(userId) {
  const [user, orders, prefs] = await Promise.all([
    fetchJson(`/api/users/${userId}`),
    fetchJson(`/api/orders?userId=${userId}`),
    fetchJson(`/api/prefs/${userId}`)
  ]);
  return { user, orders, prefs };
}
// RESILIENT: parallel with individual error handling
async function loadDashboardResilient(userId) {
  const results = await Promise.allSettled([
    fetchJson(`/api/users/${userId}`),
    fetchJson(`/api/orders?userId=${userId}`),
    fetchJson(`/api/prefs/${userId}`)
  ]);
  const [userResult, ordersResult, prefsResult] = results;
  return {
    user: userResult.status === "fulfilled" ? userResult.value : null,
    orders: ordersResult.status === "fulfilled" ? ordersResult.value : [],
    prefs: prefsResult.status === "fulfilled" ? prefsResult.value : {}
  };
}
Performance tip: Use Promise.all() when all requests must succeed together, and Promise.allSettled() when partial data is acceptable. The difference in wall-clock time for 3 independent 200ms API calls is 200ms (parallel) vs. 600ms (sequential).
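When you have many independent requests — say, one per record — firing them all at once with Promise.all() can overwhelm the server or hit browser connection limits. A minimal concurrency limiter (a sketch; mapWithConcurrency is an illustrative name):

```javascript
async function mapWithConcurrency(items, limit, fn) {
  const results = new Array(items.length);
  let next = 0;
  // Each "worker" pulls the next index until the list is exhausted.
  // JS is single-threaded, so the read-and-increment of `next` is atomic.
  async function workerLoop() {
    while (next < items.length) {
      const i = next++;
      results[i] = await fn(items[i], i);
    }
  }
  await Promise.all(
    Array.from({ length: Math.min(limit, items.length) }, workerLoop)
  );
  return results;
}

// Usage: at most 4 fetches in flight at once
// const pages = await mapWithConcurrency(urls, 4, fetchJson);
```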
Spread and Rest for JSON Manipulation
The spread operator (...) enables immutable JSON object manipulation — producing new objects instead of mutating existing ones — a requirement in React state management, Redux reducers, and any context where object identity matters.
// Shallow clone — safe for flat objects
const original = { id: 1, name: "Alice", score: 100 };
const clone = { ...original };
// Merge two objects (rightmost wins on conflict)
const defaults = { timeout: 5000, retries: 3, verbose: false };
const overrides = { retries: 5, verbose: true };
const config = { ...defaults, ...overrides };
// { timeout: 5000, retries: 5, verbose: true }
// Update a single field immutably (React state pattern)
const updatedUser = { ...user, score: user.score + 10 };
// Remove a field using rest (without mutation)
const { password, ...publicUser } = fullUser;
// publicUser has all fields except password
// Pick specific fields
const { id, name, email } = fullUser;
const summary = { id, name, email };
Warning: Spread is a shallow clone. Nested objects are still shared by reference. If original.address is an object, clone.address === original.address is true. Mutating clone.address.city also changes original.address.city.
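Updating a nested field immutably therefore requires spreading every level along the path — each changed level gets a new object, while untouched branches are copied by reference:

```javascript
const state = {
  user: { name: "Alice", address: { city: "Portland", zip: "97201" } },
  count: 3
};

// Spread each level on the path to the change
const next = {
  ...state,
  user: {
    ...state.user,
    address: { ...state.user.address, city: "LA" }
  }
};
// state.user.address.city is still "Portland"; next.user.address.city is "LA"
```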
Array Methods for JSON Arrays
JSON arrays of objects — the most common API response format — are best processed with the functional array methods. These are composable, non-mutating, and readable.
const products = [
  { id: 1, name: "Widget A", price: 29.99, category: "tools", inStock: true },
  { id: 2, name: "Widget B", price: 9.99, category: "parts", inStock: false },
  { id: 3, name: "Gadget X", price: 49.99, category: "tools", inStock: true },
  { id: 4, name: "Gadget Y", price: 19.99, category: "parts", inStock: true }
];
// filter — get only in-stock items
const available = products.filter(p => p.inStock);
// map — transform to display-ready format
const displayItems = products.map(p => ({
  label: `${p.name} ($${p.price.toFixed(2)})`,
  value: p.id
}));
// find — get first matching item
const firstTool = products.find(p => p.category === "tools");
// reduce — aggregate (total price of in-stock items)
const totalValue = products
  .filter(p => p.inStock)
  .reduce((sum, p) => sum + p.price, 0);
// flatMap — expand nested arrays (e.g., tags per product)
const tagged = [
  { name: "A", tags: ["new", "sale"] },
  { name: "B", tags: ["featured"] }
];
const allTags = tagged.flatMap(p => p.tags);
// ["new", "sale", "featured"]
// Build a lookup map from an array (O(n) → O(1) access)
const productById = Object.fromEntries(
  products.map(p => [p.id, p])
);
productById[3]; // { id: 3, name: "Gadget X", ... }
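Grouping an array of objects by a key is another everyday task. A reduce with ??= covers it in all modern runtimes (ES2024 adds Object.groupBy for the same job):

```javascript
const products = [
  { id: 1, category: "tools" },
  { id: 2, category: "parts" },
  { id: 3, category: "tools" }
];

// ??= creates the bucket the first time a category appears
const byCategory = products.reduce((acc, p) => {
  (acc[p.category] ??= []).push(p);
  return acc;
}, {});
// byCategory.tools has 2 items, byCategory.parts has 1
```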
JSON and the Structured Clone Algorithm
The JSON.parse(JSON.stringify(obj)) pattern is the most widely used deep-clone idiom in JavaScript. It works reliably for plain data objects but has important limitations that developers frequently discover at the worst time.
// What JSON clone handles well
const data = {
  name: "Alice",
  scores: [100, 95, 87],
  nested: { city: "Portland", zip: "97201" }
};
const deep = JSON.parse(JSON.stringify(data));
deep.scores.push(75); // does NOT affect data.scores ✓
deep.nested.city = "LA"; // does NOT affect data.nested.city ✓
// What JSON clone CANNOT handle
const problematic = {
  date: new Date(), // → becomes string "2026-04-05T..."
  fn: () => 42, // → dropped (key disappears)
  undef: undefined, // → dropped (key disappears)
  map: new Map(), // → becomes {}
  set: new Set([1, 2, 3]), // → becomes {}
  regex: /foo/gi, // → becomes {}
};
// Circular reference → throws TypeError
const circular = { a: 1 };
circular.self = circular;
JSON.parse(JSON.stringify(circular)); // TypeError: Converting circular structure to JSON
// Modern alternative: structuredClone() (Node 17+, all modern browsers)
const better = structuredClone(data); // handles Date, Map, Set, RegExp, ArrayBuffer
// Still cannot clone functions or DOM nodes
For production applications, structuredClone() is the correct deep clone for most use cases. It handles a much wider range of types and has native performance. Reserve JSON.parse(JSON.stringify()) for cases where you also need JSON serialization as part of the process — not as a standalone cloning mechanism.
WeakRef and JSON Caching
Parsing large JSON is expensive — a 1 MB JSON response can take 5–15 ms to parse depending on the device. If the same response is needed multiple times (navigation, refetch, search), caching the parsed object avoids redundant work. But a simple Map cache holds strong references and prevents garbage collection of objects the rest of your app no longer needs. WeakRef (ES2021) solves this:
const cache = new Map(); // key → WeakRef
async function fetchJsonCached(url) {
  const ref = cache.get(url);
  if (ref) {
    const cached = ref.deref(); // deref() returns undefined if GC'd
    if (cached !== undefined) {
      console.log("Cache hit:", url);
      return cached;
    }
  }
  const data = await fetchJson(url);
  cache.set(url, new WeakRef(data)); // note: WeakRef targets must be objects, not primitives
  return data;
}
// The parsed data can be GC'd when no other code holds a reference.
// A FinalizationRegistry can clean up the cache entry afterward:
const registry = new FinalizationRegistry(key => {
  cache.delete(key);
  console.log("Cache entry cleaned up:", key);
});
// Register when inserting
function setCached(url, data) {
  const ref = new WeakRef(data);
  cache.set(url, ref);
  registry.register(data, url);
}
JSON in Web Workers
Parsing a 10 MB JSON file on the main thread freezes the browser UI for hundreds of milliseconds. Offloading to a Web Worker keeps the interface responsive. The postMessage API automatically serializes data using the Structured Clone algorithm, so you can send plain objects between threads without manual JSON serialization:
// worker.js
self.onmessage = function(event) {
  const { jsonString, requestId } = event.data;
  try {
    const parsed = JSON.parse(jsonString); // runs off main thread
    // Do heavy processing here too
    const result = processData(parsed);
    self.postMessage({ requestId, result });
  } catch (err) {
    self.postMessage({ requestId, error: err.message });
  }
};
// main.js
const worker = new Worker("worker.js");
function parseJsonInWorker(jsonString) {
  return new Promise((resolve, reject) => {
    const requestId = crypto.randomUUID();
    worker.postMessage({ jsonString, requestId });
    worker.addEventListener("message", function handler(e) {
      if (e.data.requestId !== requestId) return;
      worker.removeEventListener("message", handler);
      if (e.data.error) reject(new Error(e.data.error));
      else resolve(e.data.result);
    });
  });
}
// Usage
const data = await parseJsonInWorker(hugeJsonString);
Tip: Transferring large ArrayBuffers between worker and main thread via the transfer list in postMessage is zero-copy and takes microseconds regardless of size. Strings cannot be transferred, only structured-cloned — so when possible, fetch the JSON inside the worker itself and post back only the processed result (or a transferred ArrayBuffer) rather than the raw text.
Template Literals and JSON
Embedding JSON in HTML strings is a common server-side rendering pattern, but it introduces an XSS vulnerability if not handled carefully. Template literals make the code readable, but they do not add safety automatically.
// DANGEROUS: if userData.name contains "</script>", the page is broken (and XSS follows)
const html = `<script>const user = ${JSON.stringify(userData)};</script>`;
// SAFE: replace characters that close script blocks
function htmlSafeJson(value) {
  return JSON.stringify(value)
    .replace(/</g, "\\u003c")
    .replace(/>/g, "\\u003e")
    .replace(/&/g, "\\u0026")
    .replace(/\u2028/g, "\\u2028") // Line separator — breaks JS
    .replace(/\u2029/g, "\\u2029"); // Paragraph separator
}
// Now safe to embed in a script tag
const safeHtml = `<script>const user = ${htmlSafeJson(userData)};</script>`;
In React with Next.js, use JSON.stringify inside a <script> tag with dangerouslySetInnerHTML only after applying the HTML-safe encoding. Better still, use framework-provided mechanisms like Next.js's getServerSideProps props serialization which handles this automatically.
Common JavaScript JSON Mistakes
These are the bugs that show up in code review and production incidents, not in tutorials.
Mutating a Shared Parsed Object
// BAD: multiple call sites share the same object
const config = JSON.parse(configJson);
function applyUserSettings(settings) {
  config.theme = settings.theme; // mutates shared state — side effects!
}
// GOOD: return a fresh object instead of mutating
function applyUserSettingsSafe(settings) {
  return { ...config, theme: settings.theme };
}
Assuming JSON Numbers Are Safe Integers
// JSON has no integer type — all numbers are IEEE 754 doubles
// Integers above 2^53 lose precision
const json = '{"id": 9007199254740993}'; // 2^53 + 1
const obj = JSON.parse(json);
console.log(obj.id); // 9007199254740992 — WRONG!
// Solution: use BigInt reviver for known large-integer fields
JSON.parse(json, (key, value, ctx) => {
  if (key === "id") return BigInt(ctx.source); // requires Node 22+ / Chrome 114+
  return value;
});
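On runtimes without reviver source access, one workaround is to quote the large integer before parsing so it never passes through a double. The regex below assumes the field is literally named "id" — adapt it to your payload:

```javascript
const raw = '{"id": 9007199254740993, "name": "big"}';

// Pre-quote 16+ digit values of "id" so JSON.parse keeps them as strings
const quoted = raw.replace(/"id":\s*(\d{16,})/g, '"id":"$1"');
const obj = JSON.parse(quoted, (key, value) =>
  key === "id" ? BigInt(value) : value
);
// obj.id === 9007199254740993n — full precision preserved
```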
Not Handling null in JSON
// JSON null becomes JavaScript null — not undefined
const data = JSON.parse('{"name": null}');
data.name.toUpperCase(); // TypeError: Cannot read properties of null
// Always guard null separately from undefined
const name = data.name ?? "Unknown";
// Or test explicitly
if (data.name !== null && data.name !== undefined) { ... }
Performance: When JSON Gets Slow
JSON parsing is fast for typical payloads, but it becomes a bottleneck at scale. Responses over 1 MB parsed on a low-end mobile CPU can take 50–300 ms. Understanding the performance landscape helps you apply the right solution.
Streaming JSON with NDJSON
Newline-Delimited JSON (NDJSON) formats a response as one JSON object per line. This lets you start rendering data before the full response arrives — critical for large list responses:
async function* streamNdjson(url) {
  const response = await fetch(url);
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop(); // keep incomplete last line
    for (const line of lines) {
      if (line.trim()) yield JSON.parse(line);
    }
  }
  buffer += decoder.decode(); // flush any bytes the decoder is still holding
  if (buffer.trim()) yield JSON.parse(buffer); // final record may lack a trailing newline
}
// Render items as they arrive
for await (const item of streamNdjson("/api/export")) {
appendToTable(item); // UI updates progressively
}
Large Payload Strategies
- Pagination: Fetch 50–200 records per page instead of the full dataset.
- Field projection: Only request the fields you render (GraphQL or REST sparse fieldsets).
- Server-side filtering: Push filtering to the API — never send 10,000 rows to filter 50 in the browser.
- Binary formats: For very high-frequency data (telemetry, market data), consider MessagePack or Protobuf over JSON.
- Worker offloading: Keep parsing off the main thread for responses over 500 KB.
Frequently Asked Questions
What is the difference between JSON.parse() and JSON.stringify()?
JSON.parse() converts a JSON string into a JavaScript value (object, array, string, number, boolean, or null). JSON.stringify() does the opposite — it converts a JavaScript value into a JSON-formatted string. Use parse to read JSON data received from an API or file, and stringify to produce JSON output for storage or network transmission.
How do I safely access nested properties in parsed JSON?
Use optional chaining (?.) from ES2020. Instead of data && data.user && data.user.address && data.user.address.city, write data?.user?.address?.city. If any part of the chain is null or undefined, the expression returns undefined rather than throwing a TypeError. Combine with ?? to provide a default: data?.user?.address?.city ?? "Unknown".
What is the best way to deep clone a parsed JSON object?
In modern environments, use structuredClone(obj) — it handles Date, Map, Set, RegExp, and ArrayBuffer without the limitations of the JSON.parse(JSON.stringify()) idiom. Use Lodash's cloneDeep as a fallback for older environments.
How should I handle errors when fetching JSON?
fetch() only rejects on network failures, not on HTTP error status codes like 404 or 500. Always check response.ok after awaiting the fetch: if (!response.ok) throw new Error(`HTTP ${response.status}`). Wrap everything in try/catch and handle AbortError separately if you use AbortController for timeouts.
How do I parse very large JSON without freezing the UI?
Parse it in a Web Worker: send the raw text via postMessage and receive the parsed result back. For truly huge datasets (50 MB+), use NDJSON streaming with the Fetch ReadableStream API to process records incrementally rather than parsing the entire document at once.
Related Tools & Guides
JSON Formatter | JSON Validator | JSON Minifier | JSON Stringify / Parse Tool | NDJSON Converter