JSON Number Precision: Handle Large Integers, Floats, and BigInt

JSON numbers are stored as IEEE 754 double-precision floating-point values (64-bit doubles) by most parsers. This means integers larger than 2^53 - 1 (9,007,199,254,740,991 — Number.MAX_SAFE_INTEGER in JavaScript) cannot be represented exactly. In JavaScript, JSON.parse('{"id":9007199254740993}') yields an id of 9007199254740992 — silently wrong by 1. This is a real problem for APIs that return database IDs, snowflake IDs, or 64-bit integer identifiers. The three safe approaches are: (1) encode large integers as JSON strings; (2) use JavaScript's BigInt type with a custom parser; (3) use the json-bigint or lossless-json libraries. Float precision issues (e.g., 0.1 + 0.2 !== 0.3) affect JSON the same way they affect any IEEE 754 system. Use Jsonic's JSON Formatter to inspect your JSON payloads and spot suspicious large numbers.

Need to inspect a JSON payload for suspicious large numbers? Jsonic's JSON Formatter highlights them instantly.

Open JSON Formatter

The problem: IEEE 754 and safe integer range

JavaScript uses 64-bit doubles for all numbers. A 64-bit double has 53 bits of mantissa, which gives exact integer representation up to 2^53 - 1 = 9,007,199,254,740,991. Any integer beyond this range may be rounded to the nearest representable value. The rounding is silent — no error is thrown, no warning is logged. You simply get the wrong number.

console.log(Number.MAX_SAFE_INTEGER)  // 9007199254740991
console.log(9007199254740992 === 9007199254740993)  // true — they're the same!

const parsed = JSON.parse('{"id": 9007199254740993}')
console.log(parsed.id)  // 9007199254740992 — WRONG
console.log(parsed.id === 9007199254740993)  // false

This is not a JSON bug — JSON's spec says numbers can have arbitrary precision. It is a JavaScript parser limitation: JSON.parse() converts JSON numbers to JavaScript's number type, which loses precision above MAX_SAFE_INTEGER. Other parsers (Python's json, Java's Jackson) handle large integers correctly because they use arbitrary-precision integer types. The issue is exclusive to environments that map all JSON numbers to IEEE 754 doubles — primarily JavaScript. See the full guide on JSON.parse() in JavaScript for more on how the parser works.

Check if a number is safe

Before storing or sending large integers, check whether they fall within the safe range using Number.isSafeInteger(). This is especially useful when reading IDs from a database and serializing them to JSON — you want to catch unsafe integers before they silently corrupt data on the client. The helper below walks any parsed object and warns whenever it finds an unsafe integer value.

Number.isSafeInteger(9007199254740991)  // true
Number.isSafeInteger(9007199254740992)  // false (exactly 2^53 — edge case)
Number.isSafeInteger(9007199254740993)  // false

// Check after parsing:
function parseSafe(json) {
  const obj = JSON.parse(json)
  // Check all number values
  function checkNumbers(val, path = '') {
    if (typeof val === 'number' && !Number.isSafeInteger(val) && val % 1 === 0) {
      console.warn(`Unsafe integer at ${path}: ${val}`)
    }
    if (typeof val === 'object' && val !== null) {
      for (const [k, v] of Object.entries(val)) checkNumbers(v, `${path}.${k}`)
    }
  }
  checkNumbers(obj)
  return obj
}

Note that Number.isSafeInteger(9007199254740992) returns false — the value 2^53 is exactly representable as a double but is considered "unsafe" because the next integer (2^53 + 1) is not. The safe range is symmetric: Number.MIN_SAFE_INTEGER is -9007199254740991. Floating-point values (non-integers) should always be treated as approximate — use val % 1 === 0 to distinguish integers from floats before checking safety. You can also validate your JSON with Jsonic to spot structural issues before parsing.
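These checks can be folded into one small classifier. A minimal sketch (the helper name classifyNumber is ours, not a standard API):

```javascript
// Sketch: classify a parsed value before trusting it (helper name is illustrative)
function classifyNumber(val) {
  if (typeof val !== 'number') return 'not-a-number'
  if (!Number.isFinite(val)) return 'non-finite'   // NaN, Infinity
  if (val % 1 !== 0) return 'float'                // treat as approximate
  return Number.isSafeInteger(val) ? 'safe-int' : 'unsafe-int'
}

classifyNumber(42)                 // 'safe-int'
classifyNumber(9.99)               // 'float'
classifyNumber(9007199254740992)   // 'unsafe-int' (exactly 2^53)
```

Only the 'unsafe-int' case signals possible silent corruption; floats are flagged separately because they are approximate by design, not wrong.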

Solution 1: Encode large integers as strings

The most portable approach is to serialize large integers as JSON strings: "id": "9007199254740993" instead of "id": 9007199254740993. Every JSON parser in every language handles strings without precision loss — there is no conversion to a numeric type involved. The receiving JavaScript code must call BigInt("9007199254740993") or use a library to get the full integer value when arithmetic is needed.

On the API server (Node.js, Python, Java), serialize 64-bit IDs as strings. Many major APIs already do this — Twitter's API v1 returned "id_str" alongside "id" for exactly this reason. Discord's API returns Snowflake IDs as strings by default. Tradeoff: you cannot do arithmetic directly on the string — you must convert to BigInt first. But this is the safest cross-platform approach since every JSON parser handles strings without precision loss. See the JSON.stringify() replacer guide for how to automatically convert BigInt values to strings during serialization.

// On the server: serialize large IDs as strings
const response = JSON.stringify({
  id: String(user.id),   // "9007199254740993" — safe for all clients
  name: user.name,
})

// On the client: convert back to BigInt when needed
const data = JSON.parse(response)
const id = BigInt(data.id)  // 9007199254740993n
console.log(id === 9007199254740993n)  // true
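If the server-side values are already BigInt, a JSON.stringify() replacer can convert them to strings automatically. A minimal sketch:

```javascript
// Minimal replacer sketch: JSON.stringify throws a TypeError on raw BigInt,
// so convert BigInt values to strings during serialization
const bigintReplacer = (key, value) =>
  typeof value === 'bigint' ? value.toString() : value

const json = JSON.stringify({ id: 9007199254740993n, name: 'alice' }, bigintReplacer)
// '{"id":"9007199254740993","name":"alice"}'
```

The replacer runs before JSON.stringify() rejects the BigInt, so the value is converted without any precision loss.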

Solution 2: json-bigint library

json-bigint is a drop-in replacement for JSON.parse() and JSON.stringify() that parses large JSON integers as JavaScript BigInt values instead of losing precision. Install it with npm install json-bigint. Pass { useNativeBigInt: true } to get native BigInt values rather than the library's custom type.

import JSONbig from 'json-bigint'

// Construct a parser that returns native BigInt values (without this option,
// json-bigint returns its own BigNumber objects instead)
const JSONBigNative = JSONbig({ useNativeBigInt: true })

const parsed = JSONBigNative.parse('{"id": 9007199254740993}')
console.log(parsed.id)           // 9007199254740993n (BigInt)
console.log(typeof parsed.id)    // "bigint"
console.log(parsed.id === 9007199254740993n)  // true

// Serialize back — BigInt becomes a JSON number (not quoted)
JSONBigNative.stringify(parsed)
// '{"id":9007199254740993}'

Note that BigInt values cannot be used in arithmetic with regular number — you must use BigInt() throughout your code or convert explicitly. You also cannot pass a BigInt to functions that expect a number (like Math.max()). Use this library when you consume a third-party API that returns large integers as JSON numbers and you have no control over the serialization format.
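A quick illustration of the mixing rules:

```javascript
const big = 9007199254740993n

// Mixing BigInt with number in arithmetic throws a TypeError:
let threw = false
try {
  big + 1   // TypeError: Cannot mix BigInt and other types
} catch (e) {
  threw = e instanceof TypeError   // true
}

// Use BigInt operands throughout instead:
const next = big + 1n              // 9007199254740994n

// Converting back to number is lossy above 2^53:
Number(big)                        // 9007199254740992 (rounded)
```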

Solution 3: lossless-json library

lossless-json preserves all numbers as strings internally and lets you choose the type at access time. This is the safest approach when you have a mix of safe and unsafe numbers in the same JSON object — small numbers stay as number and large integers can be accessed as BigInt without any guessing. Install with npm install lossless-json.

import { parse, stringify, LosslessNumber } from 'lossless-json'

const parsed = parse('{"id": 9007199254740993, "price": 9.99}')
// parsed.id and parsed.price are LosslessNumber objects, not native numbers

console.log(parsed.id.toString())          // "9007199254740993" (string, exact)
console.log(BigInt(parsed.id.toString()))  // 9007199254740993n
console.log(Number(parsed.price))          // 9.99 (safe to convert)

lossless-json never silently loses data — if you try to convert a LosslessNumber that is outside the safe integer range to a number, the library throws an error rather than silently rounding. This makes it the right choice for financial or cryptographic applications where any silent data loss is unacceptable.

Float precision: the 0.1 + 0.2 problem

IEEE 754 also affects floating-point arithmetic. JSON floats deserialize with the same IEEE 754 rounding that affects all doubles — the problem is not specific to JSON, but JSON parsing does not add any additional precision either. The classic example:

JSON.parse('0.1')   // 0.1 (appears correct)
JSON.parse('0.2')   // 0.2
JSON.parse('0.1') + JSON.parse('0.2')  // 0.30000000000000004 (NOT 0.3)

For monetary values, never use floats in JSON — use strings or integer cents: {"price": "9.99"} or {"price_cents": 999}. Convert a string price to a float (e.g. with parseFloat()) only for display, never for arithmetic. For arithmetic, use a decimal library like decimal.js (npm install decimal.js):

import Decimal from 'decimal.js'

new Decimal('0.1').plus('0.2').equals('0.3')  // true
new Decimal('0.1').plus('0.2').toString()      // "0.3"

// Never do this for money:
const price = JSON.parse('{"total": 9.99}').total  // 9.99 (IEEE 754 — not exact)
price * 3  // 29.97 — may have floating-point error

// Do this instead:
const priceCents = JSON.parse('{"total_cents": 999}').total_cents  // 999 (exact integer)
priceCents * 3  // 2997 — exact

Python handles large integers correctly

Python's json module uses arbitrary-precision integers, so large JSON integers are parsed without any rounding. This makes Python a safe choice for server-side processing of large integer IDs. The precision problem is not a JSON format issue — it is exclusive to environments that map all JSON numbers to IEEE 754 doubles.

import json
data = json.loads('{"id": 9007199254740993}')
print(data['id'])           # 9007199254740993 (correct!)
print(type(data['id']))     # <class 'int'>

Python does NOT lose precision for large integers. However, if your Python API generates large integer IDs and returns them as JSON numbers, JavaScript clients will still lose precision on their end. The fix: serialize as strings in Python — json.dumps({"id": str(user_id)}) — so both Python and JavaScript receive a string value. For float precision in Python, use the decimal standard library: from decimal import Decimal; Decimal('0.1') + Decimal('0.2') returns Decimal('0.3'). See the full parse JSON in Python guide for more details on Python's JSON handling.

Snowflake IDs and Twitter IDs

Twitter's old API returned tweet IDs as 64-bit integers exceeding Number.MAX_SAFE_INTEGER. JavaScript clients silently received wrong IDs. Twitter's fix: add an id_str field with the string version alongside the numeric id. Snowflake IDs (Discord, Instagram, Twitter, Mastodon) are 64-bit integers that typically exceed safe integer range.

A Discord user ID like 123456789012345678 is a Snowflake ID. Most Snowflake IDs are in the range 10^17 to 10^18 — well above Number.MAX_SAFE_INTEGER (9 × 10^15). When a JavaScript client receives a Snowflake ID as a JSON number, JSON.parse() rounds it, producing the wrong value. Comparing two IDs or using one as a lookup key then returns incorrect results. When working with Snowflake IDs in JavaScript: always request string format from the API if available; or use json-bigint to parse the response; or compare IDs as strings, not numbers. You can inspect API responses in Jsonic's JSON Formatter to identify which fields contain Snowflake IDs before writing your parser.
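Keeping IDs as strings end to end avoids the problem entirely (field names here are illustrative):

```javascript
// Snowflake IDs handled as strings; no number conversion anywhere
const response = '{"user_id": "123456789012345678"}'   // string-format ID from the API
const { user_id } = JSON.parse(response)

user_id === '123456789012345678'        // true: exact string comparison
const users = new Map([[user_id, 'alice']])
users.get('123456789012345678')         // 'alice': string keys look up exactly
```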

Frequently asked questions

Why do large numbers lose precision in JSON?

JSON numbers are abstract — the spec allows arbitrary precision. The precision loss happens in the JavaScript parser: JSON.parse() converts every JSON number to JavaScript's number type, which is an IEEE 754 64-bit double-precision float. A 64-bit double has 53 bits of mantissa (significand), giving exact representation for integers up to 2^53 − 1 = 9,007,199,254,740,991. Any integer larger than this may be rounded to the nearest representable double value. For example, both 9,007,199,254,740,992 and 9,007,199,254,740,993 are represented as the same double (9,007,199,254,740,992.0), so JSON.parse('9007199254740993') silently returns 9007199254740992. This is not a JSON specification bug — it is a JavaScript type system limitation. Python, Java, and C# avoid this by parsing JSON integers into arbitrary-precision or 64-bit integer types; Go does too when decoding into an int64 field or using json.Number, though decoding into interface{} falls back to float64 and has the same problem. Read more in the JSON.parse() in JavaScript guide.

How do I handle large integers in JSON safely?

Three approaches: (1) String encoding — the most portable: serialize large integers as JSON strings ("id": "9007199254740993"). Every language's parser handles strings without precision loss. The receiver must convert with BigInt() or parseInt(). Many major APIs (Twitter, Discord) use this approach. (2) json-bigint library — parses large JSON integers as JavaScript BigInt values automatically. Install with npm install json-bigint and use JSONbig.parse() instead of JSON.parse(). (3) lossless-json library — stores all numbers internally as strings and lets you choose the type at access time, so safe integers become number and unsafe integers can be accessed as BigInt. The right choice depends on whether you control the API (use string encoding) or consume a third-party API (use a library).

What is Number.MAX_SAFE_INTEGER?

Number.MAX_SAFE_INTEGER is 9007199254740991 (exactly 2^53 − 1). It is the largest integer that can be represented exactly as an IEEE 754 64-bit double-precision float. Integers at or below this value are guaranteed to be represented exactly — Number.isSafeInteger(n) returns true. Integers above this value may be rounded, and two distinct integers may map to the same double value. Number.MIN_SAFE_INTEGER is -9007199254740991. The constant was added to JavaScript in ES2015 (ES6). To check at runtime: Number.isSafeInteger(value) returns true if the integer can be safely represented without rounding. For unsigned 64-bit integers (common in database IDs): the full range is 0 to 18,446,744,073,709,551,615 — far beyond MAX_SAFE_INTEGER. For signed 64-bit integers: -9,223,372,036,854,775,808 to 9,223,372,036,854,775,807 — also beyond MAX_SAFE_INTEGER.
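The gap between the safe range and the 64-bit ranges is easy to see with exact BigInt arithmetic:

```javascript
// 64-bit integer bounds computed exactly with BigInt
const U64_MAX = 2n ** 64n - 1n                 // 18446744073709551615n
const I64_MAX = 2n ** 63n - 1n                 // 9223372036854775807n
const SAFE = BigInt(Number.MAX_SAFE_INTEGER)   // 9007199254740991n

I64_MAX > SAFE   // true: even signed 64-bit IDs can exceed the safe range
U64_MAX / SAFE   // 2048n: the unsigned 64-bit range is ~2000x larger
```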

How do I handle JSON floats and money?

IEEE 754 floating-point arithmetic is inherently imprecise for decimal values: 0.1 + 0.2 evaluates to 0.30000000000000004, not 0.3. This affects any number stored as a JSON float and parsed by JavaScript (or any IEEE 754-based language). For monetary amounts: (1) store as integer cents — {"price_cents": 999} for $9.99; (2) store as a decimal string — {"price": "9.99"} and use a decimal library for arithmetic; (3) use the decimal.js library (npm install decimal.js) for precise decimal arithmetic: new Decimal('0.1').plus('0.2').toString() returns "0.3". Never store money as a JSON float like 9.99 if you need to do arithmetic — rounding errors compound with additions, subtractions, and multiplications. Always apply rounding as the final step, not intermediate steps.

Does Python have JSON precision issues?

Python's built-in json module does not lose precision for large integers. Python integers are arbitrary-precision (no fixed size), so json.loads('{"id": 9007199254740993}') correctly returns 9007199254740993 as a Python int. For floating-point, Python uses the same IEEE 754 doubles as JavaScript, so float precision issues exist in Python too: json.loads('0.1') + json.loads('0.2') returns 0.30000000000000004. For decimal precision in Python, use the decimal standard library: from decimal import Decimal; Decimal('0.1') + Decimal('0.2') returns Decimal('0.3'). Python does encode large integers correctly with json.dumps(): json.dumps({"id": 9007199254740993}) produces '{"id": 9007199254740993}' — but a JavaScript client consuming this JSON will lose precision. If your API has JavaScript clients, encode large IDs as strings. See the parse JSON in Python guide.

What are Snowflake IDs and why do they cause precision problems?

Snowflake is a distributed unique ID generation algorithm created by Twitter in 2010 and now used by Discord, Instagram, Mastodon, Bluesky, and many other platforms. A Snowflake ID is a 64-bit integer encoding a timestamp, worker ID, and sequence number. For example, a Discord user ID like 123456789012345678 is a Snowflake ID. Most Snowflake IDs are in the range 10^17 to 10^18 — well above Number.MAX_SAFE_INTEGER (9 × 10^15). When a JavaScript client receives a Snowflake ID as a JSON number, JSON.parse() rounds it, producing the wrong value. Comparing two IDs or using one as a lookup key then returns incorrect results. Solutions: (1) request string-format IDs from the API (?id_type=string or similar); (2) use json-bigint to parse the API response; (3) treat IDs as strings throughout your code without converting to numbers.

Ready to handle JSON numbers safely?

Use Jsonic to format and inspect your JSON and validate your JSON before parsing. Large integer IDs are easy to spot in a formatted view — catch the problem before it reaches production.

Open JSON Formatter