JSON vs MessagePack: Size, Speed, and When to Use Each

MessagePack is a binary serialization format that encodes the same data as JSON but in 20–50% fewer bytes and parses 2–4× faster — making it a near drop-in JSON replacement for high-throughput internal APIs and WebSocket messages. A JSON object {"name":"Alice","age":30} encodes to 25 bytes as JSON and 17 bytes as MessagePack (32% smaller); the tradeoff is that MessagePack is not human-readable and requires a library on both client and server. This guide covers MessagePack encoding rules, byte-level size comparison against JSON, JavaScript (msgpackr, @msgpack/msgpack) and Python (msgpack) usage, when to use each format, and benchmarks for serialization speed. For a foundational understanding of the format you are optimizing, see JSON.stringify in depth. To compare MessagePack against other binary alternatives, see JSON vs Protobuf and JSON vs BSON.

Need to validate or minify a JSON payload before benchmarking? Jsonic's JSON formatter handles it instantly — no install required.

Open JSON Formatter

What is MessagePack — binary serialization in 5 minutes

MessagePack (msgpack) is an open binary serialization format created by Sadayuki Furuhashi in 2008. It encodes the same logical data model as JSON — maps (objects), arrays, strings, integers, floats, booleans, and null — but uses a compact binary encoding instead of Unicode text. Every value is preceded by a 1-byte type tag that encodes both the data type and, for small values, the value itself. This eliminates the overhead of JSON's text delimiters: no double-quote characters around keys, no colons, no commas, no whitespace — just type-prefixed bytes.

Three encoding families illustrate the density. A positive integer 0–127 fits in a single byte (positive fixint). A string up to 31 bytes uses 1 type byte + the raw UTF-8 bytes (fixstr) — JSON uses 2 extra bytes for the enclosing quotes plus more for any escaped characters. A map with up to 15 keys uses 1 type byte (fixmap) before the key-value pairs — JSON uses at least 2 bytes for the enclosing braces plus 4 bytes per key-value separator (": " and ", "). These small savings add up: for a typical 10-key object with short string keys and mixed value types, MessagePack is 30–40% smaller than compact JSON.
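These per-value costs are easy to verify by hand. The sketch below builds the fixint and fixstr encodings directly from the rules above (pure Python, no MessagePack library needed) and compares each against json.dumps:

```python
import json

def fixint(n: int) -> bytes:
    """Positive fixint: integers 0-127 are their own single byte."""
    assert 0 <= n <= 127
    return bytes([n])

def fixstr(s: str) -> bytes:
    """fixstr: 1 type byte (0xa0 | length) + raw UTF-8, for strings up to 31 bytes."""
    raw = s.encode("utf-8")
    assert len(raw) <= 31
    return bytes([0xA0 | len(raw)]) + raw

# Integer 30: 1 byte in MessagePack, 2 characters in JSON text.
assert len(fixint(30)) == 1 and len(json.dumps(30)) == 2

# String "name": 5 bytes in MessagePack, 6 in JSON (the quotes add 2).
assert fixstr("name") == b"\xa4name"
assert len(fixstr("name")) == 5 and len(json.dumps("name")) == 6
```

The same arithmetic extends to fixmap and fixarray: one type byte instead of JSON's two enclosing delimiters plus per-entry separators.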

MessagePack also supports types that JSON does not: bin (raw binary bytes, stored 1:1 without Base64 encoding) and ext (extension types — user-defined or standard types like timestamps). The official timestamp extension encodes a Unix timestamp to nanosecond precision in 4–12 bytes, while JSON must rely on string conventions like ISO 8601 with no standardized precision.

MessagePack encoding rules — type bytes and length encoding

MessagePack uses a fixed set of type bytes (also called format bytes) to identify each value. Six core formats cover the majority of real-world payloads. The table below shows the format byte, the value range it covers, and the total byte cost compared to JSON.

| Format | Type byte(s) | Covers | Total bytes | JSON bytes (equiv.) |
|---|---|---|---|---|
| positive fixint | 0x00–0x7f | integers 0–127 | 1 | 1–3 |
| fixstr | 0xa0–0xbf | strings 0–31 bytes | 1 + len | 2 + len (quotes) |
| fixmap | 0x80–0x8f | maps 0–15 keys | 1 + pairs | 2 + pairs + separators |
| fixarray | 0x90–0x9f | arrays 0–15 elements | 1 + elements | 2 + elements + commas |
| float64 | 0xcb | 64-bit IEEE 754 float | 9 | varies (1–20+) |
| bin8 | 0xc4 | binary up to 255 bytes | 2 + len | 2 + 4·ceil(len/3) (quotes + Base64) |

The key insight is how MessagePack encodes lengths and counts: small values (fixint, fixstr, fixmap, fixarray) fold the length into the type byte itself, using no extra bytes, while larger values switch to formats with a 1-byte, 2-byte (16-bit), or 4-byte (32-bit) length field. (This is not varint encoding in the Protobuf sense; each format has a fixed-width length field, but the effect is similar.) This means small, typical payloads save the most — a 10-key object with keys under 31 chars uses only 1 format byte per key (fixstr) and 1 format byte for the map (fixmap), versus JSON's 2 quotes + colon + comma per entry.
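As a sketch of how the length folding works, here is the header-size logic for strings, assuming the standard format bytes (fixstr, then str8 0xd9, str16 0xda, str32 0xdb):

```python
def str_header(byte_len: int) -> bytes:
    """Header bytes that precede the raw UTF-8 of a string byte_len bytes long."""
    if byte_len <= 31:
        return bytes([0xA0 | byte_len])                # fixstr: length folded into the type byte
    if byte_len <= 0xFF:
        return bytes([0xD9, byte_len])                 # str8: 1 extra length byte
    if byte_len <= 0xFFFF:
        return b"\xda" + byte_len.to_bytes(2, "big")   # str16: 2-byte length
    return b"\xdb" + byte_len.to_bytes(4, "big")       # str32: 4-byte length

assert len(str_header(31)) == 1    # still a single fixstr byte
assert len(str_header(32)) == 2    # crosses into str8
assert len(str_header(300)) == 3   # str16
```

The same pattern (fold small counts into the type byte, fall back to fixed-width length fields) applies to maps, arrays, and bin.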

The following byte-level example shows the full encoding of {"name":"Alice","age":30}:

JSON (25 bytes, UTF-8):
7b 22 6e 61 6d 65 22 3a 22 41 6c 69 63 65 22 2c
22 61 67 65 22 3a 33 30 7d
{"name":"Alice","age":30}

MessagePack (17 bytes):
82          -- fixmap, 2 key-value pairs
a4 6e616d65 -- fixstr(4) "name"
a5 416c696365 -- fixstr(5) "Alice"
a3 616765   -- fixstr(3) "age"
1e          -- positive fixint 30

Savings: 8 bytes, 32% smaller

Notice that the integer 30 (0x1e) encodes as a single byte positive fixint — JSON needs 2 bytes for the characters 30. The string "name" costs 5 bytes in MessagePack (1 fixstr byte + 4 UTF-8 bytes) versus 6 bytes in JSON (4 UTF-8 bytes + 2 quotes). The map itself costs 1 byte (fixmap) versus 2 bytes (braces). Across a larger payload these micro-savings compound into 20–40% total reduction.
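The dump above can be reproduced with a few lines of pure Python (no msgpack library), which also confirms the byte counts:

```python
import json

def enc_fixstr(s: str) -> bytes:
    raw = s.encode("utf-8")
    return bytes([0xA0 | len(raw)]) + raw   # fixstr: type byte folds in the length

obj = {"name": "Alice", "age": 30}

msg = bytes([0x80 | len(obj)])              # fixmap with 2 key-value pairs
msg += enc_fixstr("name") + enc_fixstr("Alice")
msg += enc_fixstr("age") + bytes([30])      # 30 fits in a positive fixint

assert msg.hex() == "82a46e616d65a5416c696365a36167651e"
print(len(msg))                                         # 17 bytes of MessagePack
print(len(json.dumps(obj, separators=(",", ":"))))      # 25 bytes of compact JSON
```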

Byte-level size comparison: JSON vs MessagePack across payload types

The savings from MessagePack vary significantly by payload shape. Payloads with many short numeric values and short string keys benefit most. Payloads that are mostly long string values benefit least — the string content itself is the same size in both formats. Binary data payloads are where MessagePack wins most decisively: JSON must Base64-encode binary, adding 33% to the binary size plus quote/key overhead.

| Payload type | JSON size | MessagePack size | Savings | Notes |
|---|---|---|---|---|
| Small object (5 fields, short keys) | ~120 bytes | ~75 bytes | 38% | fixmap + fixstr keys |
| Array of 100 integers | ~350 bytes | ~210 bytes | 40% | fixint/int8/int16 depending on magnitude |
| Deeply nested API response (10 KB) | 10 240 bytes | ~6 500 bytes | 36% | typical REST response |
| Long text fields (article body) | ~5 000 bytes | ~4 900 bytes | 2% | string content dominates |
| Binary payload (1 KB image thumbnail) | ~1 370 bytes (Base64) | ~1 027 bytes | 25% | bin16 vs Base64 overhead |
| After gzip compression (10 KB JSON) | ~2 100 bytes | ~1 800 bytes | 14% | gap narrows after compression |
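The Base64 penalty in the binary row is pure arithmetic: Base64 emits 4 characters for every 3 input bytes. A quick stdlib check for a 1 KB blob (bin16 header assumed, since 1 KB exceeds bin8's 255-byte limit):

```python
import base64
import os

thumb = os.urandom(1024)            # stand-in for a 1 KB binary thumbnail
b64 = base64.b64encode(thumb)

print(len(b64))                     # 1368 = 4 * ceil(1024 / 3)
json_value_size = len(b64) + 2      # plus the enclosing quotes
msgpack_size = 3 + len(thumb)       # bin16: 0xc5 + 2-byte length + raw bytes

assert json_value_size == 1370
assert msgpack_size == 1027
```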

The gzip row illustrates an important caveat: when HTTP-level compression (gzip or Brotli) is applied, the size gap between JSON and MessagePack narrows considerably because both formats compress well. In practice, if your API already uses Content-Encoding: gzip, the wire-size benefit of switching to MessagePack drops from ~35% to ~10–15%. The CPU throughput benefit (2–4× faster encode/decode) remains regardless of compression. If your bottleneck is CPU time on the serialization path rather than bandwidth, MessagePack still wins even with gzip enabled.
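The narrowing is easy to demonstrate: much of what gzip removes from JSON is exactly the repeated-key and delimiter redundancy that MessagePack also eliminates. A stdlib sketch with an illustrative list-shaped payload (zlib standing in for HTTP gzip):

```python
import json
import zlib

# Illustrative payload: 200 same-shape rows, so the key names repeat 200 times
rows = [{"id": i, "name": f"user{i}", "active": i % 2 == 0} for i in range(200)]
text = json.dumps(rows, separators=(",", ":")).encode("utf-8")

compressed = zlib.compress(text, 6)
print(len(text), len(compressed))
assert len(compressed) < len(text) / 2   # the textual redundancy compresses away
```

Once compression has removed that redundancy from both formats, the remaining size gap is mostly the raw value data, which is similar in each.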

For comparison with other binary formats, see JSON vs Protobuf (Protobuf is typically 5–10× smaller than JSON for schema-constrained data) and JSON vs BSON (BSON is MongoDB's binary format which is actually larger than JSON for most payloads).

MessagePack in JavaScript — msgpackr and @msgpack/msgpack

Two libraries dominate JavaScript MessagePack in 2026. msgpackr is the fastest option, encoding at ~400 MB/s on Node.js 20 (versus JSON.stringify at ~200 MB/s) with zero runtime dependencies. @msgpack/msgpack is the official reference implementation with a 3.9 KB gzipped browser bundle suitable for client-side use.

# Install msgpackr (fastest, Node.js + browser)
npm install msgpackr

# Install @msgpack/msgpack (official reference, 3.9 KB gzip)
npm install @msgpack/msgpack

// msgpackr — basic encode/decode
import { pack, unpack } from 'msgpackr'

const data = { name: 'Alice', age: 30, active: true }

// Encode to Uint8Array
const encoded = pack(data)
console.log(encoded.byteLength)  // smaller than the 39-byte JSON string

// Decode from Uint8Array
const decoded = unpack(encoded)
console.log(decoded)  // { name: 'Alice', age: 30, active: true }

// For high-throughput use, create a reusable Packr instance
// with record extensions (struct mode) for 50–70% further size reduction
import { Packr } from 'msgpackr'
const packr = new Packr({ useRecords: true })
const encoded2 = packr.pack(data)

// @msgpack/msgpack — browser-friendly
import { encode, decode } from '@msgpack/msgpack'

const encoded = encode({ name: 'Alice', age: 30 })
// Returns Uint8Array

const decoded = decode(encoded)
// Returns { name: 'Alice', age: 30 }

For WebSocket usage in the browser, set binaryType = 'arraybuffer' on the WebSocket instance so received binary frames are delivered as ArrayBuffer (which decode() accepts directly) rather than as Blob:

// Browser WebSocket with MessagePack
import { encode, decode } from '@msgpack/msgpack'

const ws = new WebSocket('wss://api.example.com/ws')
ws.binaryType = 'arraybuffer'   // receive as ArrayBuffer, not Blob

ws.onopen = () => {
  const payload = encode({ type: 'subscribe', channel: 'prices' })
  ws.send(payload)
}

ws.onmessage = (event) => {
  const data = decode(event.data)   // event.data is ArrayBuffer
  console.log(data)
}

To understand the JSON side of this comparison, see the JSON.stringify tutorial for complete serialization options and edge cases.

MessagePack in Python — msgpack library usage

The msgpack Python package (maintained by the MessagePack team) provides a JSON-like API. msgpack.packb() serializes to bytes and msgpack.unpackb() deserializes from bytes. The key option to know is raw=False (the default in msgpack >= 1.0): it decodes MessagePack str values as Python str rather than bytes, matching JSON behavior.

pip install msgpack

import msgpack

# Encode
data = {"name": "Alice", "age": 30, "scores": [95, 87, 92]}
encoded = msgpack.packb(data, use_bin_type=True)
print(type(encoded))   # <class 'bytes'>
print(len(encoded))    # 28 bytes vs 45 bytes of compact JSON

# Decode
decoded = msgpack.unpackb(encoded, raw=False)
print(decoded)  # {'name': 'Alice', 'age': 30, 'scores': [95, 87, 92]}

# Native binary — no Base64 needed
with open("thumb.jpg", "rb") as f:
    image_bytes = f.read()
msg = msgpack.packb({"id": 1, "image": image_bytes}, use_bin_type=True)
# image_bytes stored as bin type, not Base64

# Streaming encode/decode for large data or network streams
import msgpack

# Incremental packer — avoids buffering full payload
packer = msgpack.Packer(use_bin_type=True)
chunk1 = packer.pack({"event": "trade", "price": 101.5})
chunk2 = packer.pack({"event": "trade", "price": 102.0})

# Incremental unpacker — feed data as it arrives
unpacker = msgpack.Unpacker(raw=False)
unpacker.feed(chunk1 + chunk2)
for obj in unpacker:
    print(obj)

The use_bin_type=True option (required for msgpack < 1.0, the default from 1.0 onward) ensures Python bytes objects encode as MessagePack bin type rather than the legacy raw type. Always use this option in new code. For more Python JSON patterns, see how to parse JSON in Python.

Serialization benchmarks: MessagePack vs JSON throughput

Benchmark results below are from Node.js 20 on Apple M2 (single-threaded) using a 10 KB representative API response payload (nested object, ~60 keys, mixed string/number values). All figures are approximate — your results will vary by payload shape and hardware. The goal is relative comparison, not absolute numbers.

| Library | Encode throughput | Decode throughput | Output size (10 KB JSON) |
|---|---|---|---|
| JSON.stringify / JSON.parse | ~200 MB/s | ~250 MB/s | 10 240 bytes |
| msgpackr pack/unpack | ~400 MB/s | ~500 MB/s | ~6 500 bytes |
| @msgpack/msgpack encode/decode | ~180 MB/s | ~220 MB/s | ~6 500 bytes |
| msgpackr (useRecords: true) | ~600 MB/s | ~700 MB/s | ~4 000 bytes |

Key findings from the benchmarks: (1) msgpackr in default mode is 2× faster to encode and 2× faster to decode than native JSON.stringify/JSON.parse while also producing 36% smaller output — a win on both dimensions. (2) @msgpack/msgpack is roughly equivalent to native JSON in speed (slightly slower) but produces the same 36% size savings — a good choice when bundle size matters more than raw throughput. (3) msgpackr with useRecords (struct mode) goes further: it deduplicates repeated key names across objects in a stream, achieving 2–3× size reduction over plain MessagePack for arrays of same-shape objects (like API list responses). The tradeoff is that both ends must use msgpackr with the same configuration — it is not compatible with other MessagePack implementations for this extension.
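The idea behind record mode can be sketched independently of msgpackr's wire format: transmit the key names once up front, then each object as a values-only positional row. The sketch below is illustrative only (shown with JSON for readability; msgpackr's record extension uses its own binary layout):

```python
import json

rows = [{"symbol": "ACME", "price": 100.0 + i, "volume": 1000 + i} for i in range(100)]

plain = json.dumps(rows, separators=(",", ":"))

# Record-style: keys once, then positional rows with values only
keys = list(rows[0])
records = json.dumps([keys] + [[r[k] for k in keys] for r in rows],
                     separators=(",", ":"))

print(len(plain), len(records))
assert len(records) < len(plain)   # the repeated key names are gone
```

The cost of this trick is exactly the compatibility caveat above: the decoder must know the key table, so both ends need the same record-aware implementation.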

For Python, msgpack.packb() runs at ~150 MB/s versus Python's json.dumps() at ~60 MB/s on CPython 3.12 — a 2.5× speedup. The gap is larger in Python because the msgpack package is implemented as a compiled C extension, while the json module, even with its C accelerator, carries more per-object overhead at the Python level.

Decision guide: when to use JSON vs MessagePack

The right choice depends on who controls both endpoints, whether debuggability matters, and what your actual bottleneck is (size, CPU, or latency). The decision table below covers the 8 most common scenarios.

| Scenario | Recommendation | Reason |
|---|---|---|
| Public REST API | JSON | No library required for consumers; curl/browser DevTools works |
| Internal microservice RPC | MessagePack | Both ends controlled; 2× faster, 35% smaller |
| Browser WebSocket (real-time) | MessagePack | Binary frames; @msgpack/msgpack 3.9 KB gzip; reduces frame overhead |
| Config files / human-edited data | JSON | Needs to be readable and editable without tools |
| Mobile app ↔ server | MessagePack | Metered data plans; battery savings from less parsing CPU |
| Payload includes binary fields | MessagePack | bin type avoids 33% Base64 overhead; simpler code |
| Logging / audit trail | JSON | Log files need to be grep-able and human-readable |
| Cache storage (Redis) | MessagePack | 35% less memory; faster serialize/deserialize on cache hit |

A practical migration path: start with JSON everywhere (zero dependencies, easy debugging), then profile. If serialization shows up on a flame graph or payload size appears in your bandwidth costs, add MessagePack to that specific bottleneck endpoint. You do not need to migrate all APIs at once — JSON and MessagePack can coexist in the same application, with the format negotiated via the HTTP Accept and Content-Type headers (application/msgpack for MessagePack, application/json for JSON).
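A minimal sketch of that negotiation, assuming a registry keyed by media type (a real service would also register msgpack.packb under application/msgpack; the entry is omitted here so the sketch stays stdlib-only):

```python
import json

# Hypothetical registry: media type -> encoder callable.
ENCODERS = {
    "application/json": lambda obj: json.dumps(obj).encode("utf-8"),
    # "application/msgpack": msgpack.packb,   # added on the bottleneck endpoint
}

def pick_content_type(accept_header: str) -> str:
    """Return the first media type in the Accept header we can serve; JSON as fallback."""
    for part in accept_header.split(","):
        media = part.split(";")[0].strip()
        if media in ENCODERS:
            return media
    return "application/json"

ct = pick_content_type("application/msgpack, application/json;q=0.9")
body = ENCODERS[ct]({"ok": True})
print(ct)   # falls back to application/json since msgpack is not registered here
```

A client that cannot decode MessagePack simply never sends application/msgpack in Accept, so both formats coexist behind one endpoint.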

For a broader comparison of binary formats and when schema-first approaches pay off, see JSON vs Protobuf. For MongoDB's binary format, see JSON vs BSON.

Frequently asked questions

What is MessagePack and how does it compare to JSON?

MessagePack is a binary serialization format that encodes the same data structures as JSON — objects, arrays, strings, numbers, booleans, and null — but uses a compact binary representation instead of human-readable text. Where JSON represents the integer 1000 as the 4-character string "1000", MessagePack encodes it as 3 bytes (0xcd 0x03 0xe8). Where JSON surrounds every key with double-quote characters and colons, MessagePack uses a single type byte (fixmap or map16) followed by densely packed key-value pairs. The result is that MessagePack payloads are typically 20–50% smaller than equivalent JSON for real-world API data, and parse 2–4× faster because there is no text-to-number conversion and no quote/comma scanning. The tradeoff is that MessagePack is not human-readable — you cannot open a .msgpack file in a text editor and understand it — and both the sender and receiver must have a MessagePack library. JSON is built into every language and browser natively, while MessagePack requires an additional dependency. For the foundational JSON data model, see JSON.stringify in depth.
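The 1000 example is easy to check with struct: the uint16 format is the 0xcd type byte followed by a big-endian 16-bit value, three bytes in total:

```python
import struct

# uint16 format: 0xcd type byte, then the value as big-endian 16-bit
encoded = struct.pack(">BH", 0xCD, 1000)
assert encoded == b"\xcd\x03\xe8"
print(len(encoded))   # 3 bytes, vs 4 characters for JSON's "1000"
```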

How much smaller is MessagePack than JSON?

For typical REST API payloads with string keys and mixed value types, MessagePack is 20–50% smaller than JSON. The savings depend heavily on the data: payloads with long string values save less (the string content itself is the same size in both formats — MessagePack saves only the surrounding quotes); payloads with many numeric values or short keys save more. A concrete example: the JSON object {"name":"Alice","age":30,"active":true} is 39 bytes as UTF-8 JSON. As MessagePack it encodes to 25 bytes — a 36% reduction. For binary data (images, hashes, audio), MessagePack stores bytes natively at 1 byte per byte, while JSON must Base64-encode binary data, adding 33% overhead on top of the already-larger JSON envelope. Note that after gzip/Brotli compression, the gap between JSON and MessagePack narrows significantly because both compress well — from ~35% down to ~10–15% savings on the wire. If your API already uses Content-Encoding: gzip, the CPU throughput benefit remains but the size benefit is reduced.

How do I encode and decode MessagePack in JavaScript?

Two well-maintained libraries cover JavaScript MessagePack: msgpackr (fastest, 0 dependencies, works in Node.js and browsers) and @msgpack/msgpack (official reference implementation, 3.9 KB gzipped, works in all modern browsers). For msgpackr: install with npm install msgpackr, then import { pack, unpack } from 'msgpackr'. Call pack(value) to encode a JavaScript value to a Uint8Array, and unpack(buffer) to decode back to a JavaScript value. For @msgpack/msgpack: install with npm install @msgpack/msgpack, then import { encode, decode } from '@msgpack/msgpack'. encode(value) returns a Uint8Array and decode(buffer) returns the original value. For WebSocket usage, set ws.binaryType = 'arraybuffer' before connecting so received binary frames are delivered as ArrayBuffer rather than Blob.

How do I use MessagePack in Python?

Install the msgpack package with pip install msgpack. The API mirrors Python's json module: msgpack.packb(obj) encodes a Python object to bytes, and msgpack.unpackb(data, raw=False) decodes bytes back to a Python object. The raw=False option ensures MessagePack str values decode as Python str rather than bytes. Use use_bin_type=True when encoding to ensure Python bytes objects encode as MessagePack bin type. For streaming or file-based usage, msgpack.Packer() and msgpack.Unpacker() provide incremental encode/decode without buffering the full payload in memory — useful for large datasets or network streams. Python dicts, lists, strings, integers, floats, booleans, and None all map directly to MessagePack types. For more Python JSON patterns, see how to parse JSON in Python.

When should I use MessagePack instead of JSON?

Use MessagePack when: (1) you control both the client and server — internal APIs, microservices, WebSocket endpoints — and can deploy a library on both sides; (2) payload size matters — mobile apps on metered connections, high-frequency WebSocket messages, or IoT devices with limited bandwidth; (3) serialization CPU cost is a bottleneck — msgpackr encodes at roughly 400 MB/s versus JSON.stringify at 200 MB/s on Node.js, a 2× throughput improvement; (4) your data includes binary fields like images, audio, or hashes that JSON would Base64-encode at 33% size overhead. Use JSON when: (1) your API is public-facing and consumers should not need a special library; (2) human-readability and debuggability are important — you want to inspect payloads in curl, browser DevTools, or log files; (3) your language or framework has zero-dependency JSON support and the performance difference is not measurable for your use case. You can use Jsonic to inspect and validate any JSON payload before deciding whether the overhead is worth addressing.

Does MessagePack support all JSON data types?

MessagePack supports all JSON data types and more. JSON types map to MessagePack as follows: JSON object → MessagePack map (fixmap for ≤15 keys, map16 for ≤65 535 keys, map32 for larger); JSON array → MessagePack array (fixarray, array16, array32); JSON string → MessagePack str (fixstr for ≤31 bytes, str8/str16/str32 for longer); JSON number → MessagePack int or float (positive fixint for 0–127, int8/int16/int32/int64, float32/float64); JSON boolean → MessagePack true/false (single bytes 0xc3/0xc2); JSON null → MessagePack nil (single byte 0xc0). Beyond JSON types, MessagePack natively supports: bin (raw binary bytes, no Base64 required) and ext (extension types — timestamps, custom application types). The ext type lets you encode types like Date, UUID, Decimal, or custom domain types with a single application-defined type byte rather than relying on string conventions like ISO 8601 dates in JSON. The official timestamp extension (ext type -1) encodes Unix timestamps to nanosecond precision in 4–12 bytes. For a full data type comparison across binary formats, see JSON vs Protobuf.

Ready to work with JSON?

Use Jsonic's JSON Formatter to validate, minify, or prettify payloads before switching serialization formats. You can also use the JSON Diff tool to compare decoded MessagePack output against your original JSON to verify round-trip correctness.

Open JSON Formatter