Write JSON to File in JavaScript (Node.js)

Node.js provides three ways to write files: fs.writeFileSync (synchronous, blocks), fs.writeFile (callback-based async), and fs.promises.writeFile (Promise-based async). All three work with JSON.stringify to serialize your data before writing. This guide covers each approach, pretty printing, safe atomic writes, and streaming large datasets.

Want to validate your JSON before writing it to disk? Paste it into Jsonic's formatter to catch errors instantly.


The quickest way — fs.writeFileSync

fs.writeFileSync writes a file synchronously and returns only when the write is complete. Combine it with JSON.stringify to serialize any JavaScript object or array to disk in one step.

const fs = require('fs');

const data = {
  name: "Alice",
  scores: [95, 87, 92],
  active: true,
};

// Write compact JSON
fs.writeFileSync('data.json', JSON.stringify(data), 'utf8');

// Write pretty-printed JSON (2-space indent)
fs.writeFileSync('data.json', JSON.stringify(data, null, 2), 'utf8');

When to use it: scripts, CLI tools, and startup code where blocking the event loop is acceptable. Avoid it inside HTTP request handlers or any code path that must stay responsive under concurrent load — a slow disk write will stall every other request while it completes.

Passing 'utf8' as the third argument makes the intent explicit, though Node.js already defaults to UTF-8 when the data is a string. The encoding argument matters when you want a different text encoding such as 'latin1'; it is ignored entirely when you pass a Buffer, which is written byte-for-byte.
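A quick sketch to make the encoding behavior concrete (the file names are illustrative):

// Strings default to UTF-8 — these two calls produce identical files
fs.writeFileSync('a.json', '{"ok":true}');
fs.writeFileSync('b.json', '{"ok":true}', 'utf8');

// A Buffer is written byte-for-byte; the encoding argument is ignored
fs.writeFileSync('c.json', Buffer.from('{"ok":true}'));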

fs.promises.writeFile — async/await

For server applications and any code that needs to stay non-blocking, use the Promise-based API. Import fs/promises and await the write.

const fs = require('fs/promises');

async function writeConfig(path, config) {
  const json = JSON.stringify(config, null, 2);
  await fs.writeFile(path, json, 'utf8');
  console.log('Config saved.');
}

const config = {
  version: "2.0",
  debug: false,
  features: { darkMode: true, analytics: false },
};

// Usage
writeConfig('config.json', config).catch(console.error);

fs/promises is available from Node.js 14 onward. If you're on an older version, use require('fs').promises instead — it's the same API.
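If you do need that fallback, it's a one-line change — a minimal sketch using a then/catch chain:

// On older Node.js versions, the same Promise API lives at require('fs').promises
const fs = require('fs').promises;

fs.writeFile('config.json', JSON.stringify({ version: '2.0' }, null, 2), 'utf8')
  .then(() => console.log('Config saved.'))
  .catch(console.error);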

fs.writeFile with callback

The original async form uses an error-first callback. It's non-blocking like the Promise version but requires nesting callbacks rather than await. You'll find this pattern in older Node.js codebases.

const fs = require('fs');

const data = { key: "value", count: 42 };

fs.writeFile('output.json', JSON.stringify(data, null, 2), 'utf8', (err) => {
  if (err) {
    console.error('Write failed:', err.message);
    return;
  }
  console.log('File written successfully.');
});

For new code, prefer fs.promises.writeFile — it composes cleanly with async/await and avoids callback nesting when you need to chain multiple file operations.
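Here is a minimal sketch of that kind of chaining; the saveAll name and file layout are illustrative:

const fs = require('fs/promises');

// Dependent file operations read top-to-bottom with await — no nesting
async function saveAll(dir, users, settings) {
  await fs.mkdir(dir, { recursive: true });
  await fs.writeFile(`${dir}/users.json`, JSON.stringify(users, null, 2), 'utf8');
  await fs.writeFile(`${dir}/settings.json`, JSON.stringify(settings, null, 2), 'utf8');
}

saveAll('./data', [{ id: 1, name: 'Alice' }], { debug: false }).catch(console.error);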

JSON.stringify options for formatting

JSON.stringify(value, replacer, space) accepts three arguments. The replacer filters or transforms keys; space controls indentation. Here are the most useful combinations:

const data = {
  z: "last",
  a: "first",
  nested: { b: 2, a: 1 },
};

// Compact (no whitespace) — smallest file size
JSON.stringify(data);
// {"z":"last","a":"first","nested":{"b":2,"a":1}}

// 2-space indent — most common
JSON.stringify(data, null, 2);

// 4-space indent
JSON.stringify(data, null, 4);

// Tab indent
JSON.stringify(data, null, '\t');

// Sort keys alphabetically — JSON.stringify has no built-in option,
// but a replacer that rebuilds each object with sorted keys works:
const sortKeys = (key, value) =>
  value && typeof value === 'object' && !Array.isArray(value)
    ? Object.keys(value).sort().reduce((acc, k) => { acc[k] = value[k]; return acc; }, {})
    : value;
JSON.stringify(data, sortKeys, 2);
// keys emitted in order: a, nested (a, b), z

// Replacer array: include only the listed keys (applies at every nesting level)
JSON.stringify(data, ['a', 'nested'], 2);
// {"a":"first","nested":{"a":1}} — 'b' is dropped inside nested too

Use compact JSON (JSON.stringify(data)) for machine-readable files where size matters, such as API responses cached to disk. Use 2-space indented JSON for config files and anything humans will read or diff in version control.
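If you're unsure which to use, measure the difference on your own data — a quick sketch comparing serialized byte lengths:

const compact = JSON.stringify(data);
const pretty = JSON.stringify(data, null, 2);
console.log(Buffer.byteLength(compact, 'utf8'), Buffer.byteLength(pretty, 'utf8'));
// Pretty-printed output is larger; the gap grows with nesting depth and key count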

Read, modify, write back

A common pattern is reading an existing JSON file, updating one or more fields, then writing the modified data back to the same path. Use fs/promises for both the read and the write to keep the operation async.

const fs = require('fs/promises');

async function updateUser(filePath, userId, updates) {
  // Read and parse
  const raw = await fs.readFile(filePath, 'utf8');
  const users = JSON.parse(raw);

  // Modify
  const index = users.findIndex(u => u.id === userId);
  if (index === -1) throw new Error(`User ${userId} not found`);
  users[index] = { ...users[index], ...updates };

  // Write back
  await fs.writeFile(filePath, JSON.stringify(users, null, 2), 'utf8');
}

// Usage
updateUser('users.json', 42, { active: false, updatedAt: new Date().toISOString() })
  .catch(console.error);

For files that multiple processes might update concurrently, consider an atomic write (covered below) to avoid writing a half-updated file if two updates race.

Append records with NDJSON

When you need to append records to a log-style file without reading the whole file first, use NDJSON (newline-delimited JSON): one JSON object per line. Appends are fast with fs.appendFileSync, and the file can be streamed or processed line by line without loading everything into memory.

const fs = require('fs');

// NDJSON: one JSON object per line — fast appends
function appendRecord(filePath, record) {
  const line = JSON.stringify(record) + '\n';
  fs.appendFileSync(filePath, line, 'utf8');
}

appendRecord('events.ndjson', { event: 'login',  user: 'alice', ts: Date.now() });
appendRecord('events.ndjson', { event: 'logout', user: 'alice', ts: Date.now() });

// Read back
const lines = fs.readFileSync('events.ndjson', 'utf8')
  .trim().split('\n')
  .map(line => JSON.parse(line));

NDJSON is the right format for event logs, audit trails, and any append-heavy workload. For more on the format and its tradeoffs, see the JSON vs NDJSON guide.
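To process a large NDJSON file without loading it all at once, stream it line by line. A sketch using Node's readline module (the forEachRecord helper is illustrative):

const fs = require('fs');
const readline = require('readline');

async function forEachRecord(filePath, fn) {
  const rl = readline.createInterface({
    input: fs.createReadStream(filePath, { encoding: 'utf8' }),
    crlfDelay: Infinity, // treat \r\n as a single line break
  });
  for await (const line of rl) {
    if (line.trim()) fn(JSON.parse(line)); // skip blank lines
  }
}

// Usage
forEachRecord('events.ndjson', record => console.log(record.event)).catch(console.error);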

Atomic write to prevent corruption

If your process crashes partway through a writeFileSync, the file can be left empty or partially written. To guarantee the file is either fully updated or completely unchanged, write to a temp file first and then rename it over the target. Renaming is atomic on the same filesystem.

const fs = require('fs');
const path = require('path');

function writeJsonAtomic(filePath, data, indent = 2) {
  const json = JSON.stringify(data, null, indent);
  const dir = path.dirname(path.resolve(filePath));
  const tmpPath = path.join(dir, `.tmp-${Date.now()}-${Math.random().toString(36).slice(2)}`);

  try {
    fs.writeFileSync(tmpPath, json, 'utf8');
    fs.renameSync(tmpPath, filePath); // atomic on same filesystem
  } catch (err) {
    // Clean up temp file on failure
    try { fs.unlinkSync(tmpPath); } catch {}
    throw err;
  }
}

writeJsonAtomic('config.json', { version: '1.0', debug: false });

Keep the temp file in the same directory as the target so the rename stays on the same filesystem. Renaming across filesystems (e.g., from /tmp to your app directory) is not atomic — it becomes a copy-then-delete, which re-introduces the corruption window.
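The same pattern carries over to async code with fs/promises — a sketch under the same same-directory assumption (writeJsonAtomicAsync is an illustrative name):

const fsp = require('fs/promises');
const path = require('path');

async function writeJsonAtomicAsync(filePath, data, indent = 2) {
  const json = JSON.stringify(data, null, indent);
  const dir = path.dirname(path.resolve(filePath));
  const tmpPath = path.join(dir, `.tmp-${process.pid}-${Date.now()}`);
  try {
    await fsp.writeFile(tmpPath, json, 'utf8');
    await fsp.rename(tmpPath, filePath); // atomic on the same filesystem
  } catch (err) {
    await fsp.unlink(tmpPath).catch(() => {}); // best-effort cleanup
    throw err;
  }
}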

Streaming large JSON files

When writing very large JSON arrays (hundreds of megabytes or more), JSON.stringify must hold the entire serialized string in memory before writing. For large datasets, stream the output record by record instead.

const fs = require('fs');
const { once } = require('events');

async function writeJsonArrayStream(filePath, records) {
  const stream = fs.createWriteStream(filePath, { encoding: 'utf8' });

  stream.write('[\n');

  for (let i = 0; i < records.length; i++) {
    const comma = i < records.length - 1 ? ',' : '';
    // Respect backpressure: when write() returns false, the internal buffer
    // is full — wait for 'drain' so serialized output never piles up in memory
    if (!stream.write('  ' + JSON.stringify(records[i]) + comma + '\n')) {
      await once(stream, 'drain');
    }
  }

  stream.end(']\n');
  await once(stream, 'finish'); // resolves once all data is flushed to disk
}

// Usage — streams records to disk without building one giant JSON string
const bigArray = Array.from({ length: 100_000 }, (_, i) => ({ id: i, value: Math.random() }));
writeJsonArrayStream('large.json', bigArray).catch(console.error);

For maximum throughput with truly massive datasets, prefer NDJSON — you write each line independently and never need to hold more than one record in memory at a time.
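A sketch of that NDJSON variant, accepting any iterable or async iterable so records can come from a generator instead of an in-memory array (writeNdjsonStream is an illustrative name):

const fs = require('fs');
const { once } = require('events');

async function writeNdjsonStream(filePath, records) {
  const stream = fs.createWriteStream(filePath, { encoding: 'utf8' });
  for await (const record of records) {
    // One JSON object per line; pause on backpressure
    if (!stream.write(JSON.stringify(record) + '\n')) {
      await once(stream, 'drain');
    }
  }
  stream.end();
  await once(stream, 'finish');
}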

Browser: trigger a JSON download

Browsers have no filesystem write access. To let users "save" JSON from a web app, create a Blob from the serialized string and trigger a download via a temporary anchor element.

function downloadJson(data, filename = 'data.json') {
  const json = JSON.stringify(data, null, 2);
  const blob = new Blob([json], { type: 'application/json' });
  const url = URL.createObjectURL(blob);

  const a = document.createElement('a');
  a.href = url;
  a.download = filename;
  a.click();

  URL.revokeObjectURL(url); // free memory
}

// Usage
downloadJson({ user: 'Alice', scores: [95, 87] }, 'results.json');

Call URL.revokeObjectURL immediately after the click — the browser queues the download before the URL is freed, so the file still downloads correctly. Holding onto the object URL longer than needed wastes memory.

Comparison: which method to use?

| Method | Blocking | Error handling | Use when |
|---|---|---|---|
| fs.writeFileSync | Yes | try/catch | Scripts, CLI tools, startup code |
| fs.writeFile (callback) | No | Error-first callback | Older Node.js codebases |
| fs.promises.writeFile | No | async/await try/catch | Modern async functions, servers |
| stream.Writable | No | Stream events | Very large files (>100 MB) |

Frequently asked questions

How do I write a JavaScript object to a JSON file?

Use JSON.stringify to convert the object to a JSON string, then write it with fs.writeFileSync or fs.promises.writeFile: fs.writeFileSync('data.json', JSON.stringify(obj, null, 2), 'utf8'). The second argument to JSON.stringify (null) means no replacer, and 2 adds 2-space indentation. Strings are written as UTF-8 by default; passing 'utf8' explicitly documents that intent.

What is the difference between fs.writeFileSync and fs.promises.writeFile?

fs.writeFileSync blocks the Node.js event loop until the write completes — safe for scripts and startup code but wrong for servers handling concurrent requests. fs.promises.writeFile (or the callback-based fs.writeFile) is asynchronous and does not block the event loop. In async functions, await fs.promises.writeFile(path, data) is the idiomatic modern choice.

How do I append a new item to an existing JSON array file?

Read the file, parse the JSON, push the new item, then write it back: const arr = JSON.parse(fs.readFileSync('items.json', 'utf8')); arr.push(newItem); fs.writeFileSync('items.json', JSON.stringify(arr, null, 2)). For high-frequency appends where reading the whole file is too slow, switch to NDJSON: append one JSON object per line with fs.appendFileSync.

How do I write JSON to a file safely to avoid corruption?

Use an atomic write: write to a temp file in the same directory, then rename it over the target with fs.renameSync. Renaming is atomic on the same filesystem — the target is either fully replaced or unchanged, so a crash mid-write cannot leave a half-written file. Keep the temp file in the same directory as the target to stay on the same filesystem.

How do I write a large JSON array to a file without running out of memory?

Stream the output: open a writable stream and write each record one at a time. Start with stream.write('['), write each JSON.stringify(item) separated by commas, then close with ']'. For NDJSON, use stream.write(JSON.stringify(item) + '\n') per record — no brackets needed.

Can I write JSON in the browser (not Node.js)?

Browsers don't have filesystem access for writing local files. To trigger a download, create a Blob from the JSON string and use a temporary anchor element: const blob = new Blob([JSON.stringify(data)], { type: 'application/json' }); then click a link with download='data.json'. The user's download folder receives the file.

Validate your JSON before writing to disk

Catch syntax errors before they reach your filesystem. Paste your JSON into Jsonic's online formatter to validate and pretty-print it instantly.
