Go 1.25 JSON v2: Long-Awaited JSON Package Revamp
A closer look at the performance, error handling, and streaming improvements.
Nearly every Go developer has worked with the built-in encoding/json package, which has been part of the standard library since Go's initial release. Over the years, the package has been widely criticized for performance bottlenecks, limited flexibility, and poor diagnostics. In response, the Go team recently released encoding/json/v2, an experimental, redesigned version of the package.
In this article, we will discuss the main motivation for a revamp of the built-in JSON package and see how you can leverage the new features of the updated v2 package.
The JSON v1 Pain: Why a Rewrite Was Needed
Here are some of the most common frustrations with the original encoding/json package:
Performance: Go's built-in JSON library has been 2-5x slower at encoding and decoding JSON than many third-party libraries.
Non-descriptive error messages: For complex or nested structures, error messages don't provide enough context, forcing you to validate the entire input manually.
Limited control: Handling edge cases such as rejecting unknown fields in JSON structures has always been a challenge (a v1 workaround is sketched below).
No streaming for standard types: You couldn't decode large arrays from a stream without fully loading them into memory first.
These issues have been known for years and heavily discussed in the Go community, including this popular thread.
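To make the third point concrete, here is the kind of workaround v1 requires just to reject unknown fields: json.Unmarshal has no option for it, so you have to route even in-memory data through a json.Decoder. A minimal sketch (the User type and input are made up for illustration):

// encoding/json (v1): rejecting unknown fields requires a Decoder.
// Assumes imports: "bytes", "encoding/json", "log".
type User struct {
	Age int `json:"age"`
}

data := []byte(`{"age": 30, "nickname": "gopher"}`)

dec := json.NewDecoder(bytes.NewReader(data))
dec.DisallowUnknownFields()

var u User
if err := dec.Decode(&u); err != nil {
	log.Fatal(err) // json: unknown field "nickname"
}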
JSON v2: A Clean(er) Slate
Go 1.25 introduced encoding/json/v2 as an experimental package with the following goals:
More descriptive error messages
Streaming support
Safer and more predictable encoding/decoding
Extensible hooks for advanced use cases
To use the updated version, you must:
Explicitly import encoding/json/v2
Set the GOEXPERIMENT=jsonv2 environment variable
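For example, here is a minimal program that opts into the experimental package (a sketch; it only builds when GOEXPERIMENT=jsonv2 is set):

package main

import (
	"encoding/json/v2" // available only with GOEXPERIMENT=jsonv2
	"fmt"
)

func main() {
	out, err := json.Marshal(map[string]int{"answer": 42})
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out)) // {"answer":42}
}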
The API might change in upcoming releases. Yet, even at this early stage, it offers significant improvements.
Practical Examples: JSON v1 vs JSON v2
Let’s walk through some common use cases and see how JSON v2 makes things better.
Better Errors with Field Context
First, let’s run this example using the original version of encoding/json package:
package main

import (
	"encoding/json"
	"fmt"
)

type User struct {
	Age int `json:"age"`
}

type Data struct {
	Users []User `json:"users"`
}

func main() {
	data := []byte(`{"users": [{"age": 30}, {"age": "twenty"}]}`)
	var users Data
	err := json.Unmarshal(data, &users)
	fmt.Println(err)
}

The output will be:
json: cannot unmarshal string into Go struct field User.users.age of type int

From the output, we aren't able to tell which element of the input array caused the error. If the input array contains hundreds of elements, tracking down the offending value would take time.
Let's switch to the v2 version of the JSON package by changing the import (note that you also need to set the GOEXPERIMENT=jsonv2 environment variable to enable JSON v2):
import "encoding/json/v2"After running the same code, we will now see
json: cannot decode string into int at "users[1].age"

The output includes the index of the incorrect element in the array, a huge win for debugging.
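Beyond the friendlier message, v2 errors can also be inspected programmatically. Here is a rough sketch, assuming the mismatch surfaces as the package's SemanticError type, which carries a JSON Pointer to the offending value:

// Assumes json is "encoding/json/v2"; also requires the "errors" package.
// Data and data come from the example above.
var users Data
if err := json.Unmarshal(data, &users); err != nil {
	var semErr *json.SemanticError
	if errors.As(err, &semErr) {
		// The pointer identifies the failing element, e.g. "/users/1/age".
		fmt.Println("bad value at:", semErr.JSONPointer)
	}
}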
Streaming Large Arrays
With json/v2, the low-level streaming primitives live in the companion encoding/json/jsontext package, and json.UnmarshalDecode reads values from its Decoder, so you can stream JSON arrays from disk or over the network:
// Assumes imports: "os", "log", "encoding/json/v2" (as json), and "encoding/json/jsontext".
f, err := os.Open("large.json")
if err != nil {
	log.Fatal(err)
}
defer f.Close()

dec := jsontext.NewDecoder(f)

// Read the opening '[' of the top-level array.
if tok, err := dec.ReadToken(); err != nil || tok.Kind() != '[' {
	log.Fatal("expected a JSON array")
}

// Decode one element at a time until the closing ']' is reached.
for dec.PeekKind() != ']' {
	var entry MyStruct
	if err := json.UnmarshalDecode(dec, &entry); err != nil {
		log.Fatal(err)
	}
	process(entry)
}
dec.ReadToken() // consume the closing ']' token

This avoids loading the entire JSON array into memory, which is ideal for large datasets.
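The write side follows the same pattern: a jsontext.Encoder can be paired with json.MarshalEncode to emit records one at a time instead of building the whole output in memory. A rough sketch, writing each record as its own top-level JSON value (w and records are placeholders):

// Assumes imports: "encoding/json/v2" (as json), "encoding/json/jsontext", "log".
enc := jsontext.NewEncoder(w) // w is any io.Writer, e.g. a file or network connection
for _, rec := range records { // in a real program, records would be produced incrementally
	if err := json.MarshalEncode(enc, rec); err != nil {
		log.Fatal(err)
	}
}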
Struct Tag Extensions
json/v2 adds several helpful struct tag options to reduce boilerplate:

type Payment struct {
	Amount   int    `json:",string"`      // Accepts and emits the number as a JSON string, e.g. "1000"
	Currency string `json:",case:ignore"` // Matches the key case-insensitively when decoding (e.g. "CURRENCY")
	Optional string `json:",omitempty"`   // Omits the field if it encodes as empty
}

Using these options helps to avoid extra steps when decoding structures, such as creating intermediate data structures and converting types (for example, between strings and ints).
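As a quick illustration of the options above, here is a hypothetical payload decoded into the Payment struct; the quoted amount and the differently cased key are handled without any manual conversion (assumes json is "encoding/json/v2" plus the "fmt" and "log" packages):

data := []byte(`{"Amount": "1000", "CURRENCY": "USD"}`)

var p Payment
if err := json.Unmarshal(data, &p); err != nil {
	log.Fatal(err)
}
fmt.Printf("%+v\n", p) // {Amount:1000 Currency:USD Optional:}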
Performance
The json/v2 package brings major performance improvements, offering low-allocation decoding for many struct types: it can populate existing Go values in place rather than allocating new ones. This significantly boosts throughput and reduces GC pressure in performance-critical code.
In decoding benchmarks, json/v2 is reported to be 2-10x faster than the original encoding/json, primarily due to a more efficient parser.
If you need even more performance, you might also consider the streaming decoder, which is reported to offer up to a 40x speedup for certain large struct types. Though not fully reflected in standard benchmarks, these gains become apparent in deeply nested or recursive use cases.
Compared to popular high-performance third-party libraries like jsoniter, go-json, or segmentio/json, the new standard library is now in the same performance class — a significant milestone for production-grade efficiency without external dependencies.
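These figures depend heavily on payload shape, so it is worth measuring against your own data. Since v1 and v2 can be imported side by side under different aliases, a minimal benchmark harness along these lines (the record type and payload are made up for illustration) makes the comparison easy:

package jsonbench

import (
	jsonv1 "encoding/json"
	jsonv2 "encoding/json/v2" // requires building with GOEXPERIMENT=jsonv2
	"testing"
)

type record struct {
	ID   int    `json:"id"`
	Name string `json:"name"`
}

var payload = []byte(`{"id": 1, "name": "gopher"}`)

func BenchmarkUnmarshalV1(b *testing.B) {
	for i := 0; i < b.N; i++ {
		var r record
		if err := jsonv1.Unmarshal(payload, &r); err != nil {
			b.Fatal(err)
		}
	}
}

func BenchmarkUnmarshalV2(b *testing.B) {
	for i := 0; i < b.N; i++ {
		var r record
		if err := jsonv2.Unmarshal(payload, &r); err != nil {
			b.Fatal(err)
		}
	}
}

Running it with GOEXPERIMENT=jsonv2 go test -bench . compares both implementations under identical conditions.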
Conclusion: Should You Use JSON v2?
Keep in mind that the encoding/json/v2 package is still experimental and might change in upcoming releases. If you decide to use it, keep its scope limited to specific scenarios (for example, high-performance or large-volume JSON processing, or advanced use cases requiring customization) and track upcoming changes to the package API.
You can find more useful details on the v2 package at the following links:
JSON v2 proposal issue: https://github.com/golang/go/issues/71497
Package documentation: https://pkg.go.dev/encoding/json/v2
To go deeper into production-grade service design with Go, including gRPC, Protocol Buffers, Kubernetes, and distributed system patterns, check out Alexander Shuiskov's updated book, Microservices with Go, Second Edition. It covers everything from service scaffolding and CI/CD pipelines to observability, secure communication, and advanced reliability techniques.