Mapping JSON
Note
If you're using EF Core, please read the page on JSON support in the EF provider. EF has specialized support for JSON beyond what is supported at the lower-level Npgsql layer.
PostgreSQL has rich, built-in support for storing JSON columns and efficiently performing complex query operations on them. Newcomers can read more about the PostgreSQL support on the JSON types page, and on the functions and operators page. Note that the mapping mechanisms below support both the jsonb and json types, although the former is almost always preferred for efficiency and functionality reasons.
Npgsql allows you to map PostgreSQL JSON columns in several ways:
- As simple strings
- As strongly-typed user-defined types (POCOs)
- As System.Text.Json DOM types (JsonDocument or JsonElement, see docs)
- Via high-performance JSON parsing with Utf8JsonReader
- Via Newtonsoft Json.NET
String mapping
The simplest form of mapping to JSON is as a regular .NET string:
// Write a string to a json column:
await using var command1 = new NpgsqlCommand("INSERT INTO test (data) VALUES ($1)", conn)
{
Parameters = { new() { Value = """{ "a": 8, "b": 9 }""", NpgsqlDbType = NpgsqlDbType.Jsonb } }
};
await command1.ExecuteNonQueryAsync();
// Read jsonb data as a string:
await using var command2 = new NpgsqlCommand("SELECT data FROM test", conn);
await using var reader = await command2.ExecuteReaderAsync();
while (await reader.ReadAsync())
{
Console.WriteLine(reader.GetString(0));
}
Note
Note that when writing a string parameter as jsonb, you must specify NpgsqlDbType.Jsonb; otherwise Npgsql sends a text parameter, which is incompatible with JSON.
With this mapping style, you're fully responsible for serializing/deserializing the JSON data yourself (e.g. with System.Text.Json) - Npgsql simply passes your strings to and from PostgreSQL.
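For example, here is a minimal sketch of handling the serialization yourself with System.Text.Json (the anonymous object is purely illustrative):
// Requires using System.Text.Json.
// Serialize a .NET object to a JSON string yourself...
var json = JsonSerializer.Serialize(new { a = 8, b = 9 });
// ...and write it as an ordinary string parameter, still specifying NpgsqlDbType.Jsonb:
await using var command = new NpgsqlCommand("INSERT INTO test (data) VALUES ($1)", conn)
{
    Parameters = { new() { Value = json, NpgsqlDbType = NpgsqlDbType.Jsonb } }
};
await command.ExecuteNonQueryAsync();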
POCO mapping
Warning
As of Npgsql 8.0, POCO mapping is incompatible with NativeAOT. We plan to improve this; please upvote this issue if you're interested.
If your column JSON contains documents with a stable schema, you can map them to your own .NET types (or POCOs). The provider will use System.Text.Json APIs under the hood to serialize instances of your types to JSON documents before sending them to the database, and to deserialize documents coming back from the database. This effectively allows mapping an arbitrary .NET type - or object graph - to a single column in the database.
Starting with Npgsql 8.0, to use this feature you must first enable it by calling EnableDynamicJson on your NpgsqlDataSourceBuilder or, if you're not yet using data sources, on NpgsqlConnection.GlobalTypeMapper:
Note
NpgsqlDataSource was introduced in Npgsql 7.0, and is the recommended way to manage type mapping. If you're using an older version, see the other methods.
var dataSourceBuilder = new NpgsqlDataSourceBuilder(...);
dataSourceBuilder.EnableDynamicJson();
await using var dataSource = dataSourceBuilder.Build();
Once you've enabled the feature, you can simply read and write instances of your POCOs directly; when writing, specify NpgsqlDbType.Jsonb to let Npgsql know you intend the value to be sent as JSON data:
// Write a POCO to a jsonb column:
var myPoco1 = new MyPoco { A = 8, B = 9 };
await using var command1 = new NpgsqlCommand("INSERT INTO test (data) VALUES ($1)", conn)
{
Parameters = { new() { Value = myPoco1, NpgsqlDbType = NpgsqlDbType.Jsonb } }
};
await command1.ExecuteNonQueryAsync();
// Read jsonb data as a POCO:
await using var command2 = new NpgsqlCommand("SELECT data FROM test", conn);
await using var reader = await command2.ExecuteReaderAsync();
while (await reader.ReadAsync())
{
var myPoco2 = reader.GetFieldValue<MyPoco>(0);
Console.WriteLine(myPoco2.A);
}
class MyPoco
{
public int A { get; set; }
public int B { get; set; }
}
This mapping method is quite powerful, allowing you to read and write nested graphs of objects and arrays to PostgreSQL without having to deal with serialization yourself.
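As a quick sketch, a nested type such as the following (the Order/OrderLine names are purely illustrative) can be written to a single jsonb column in exactly the same way as MyPoco above:
// Build a nested object graph and write it to a single jsonb column:
var order = new Order
{
    Customer = "Alice",
    Lines = { new() { Product = "Widget", Quantity = 2 }, new() { Product = "Gadget", Quantity = 1 } }
};
await using var command = new NpgsqlCommand("INSERT INTO test (data) VALUES ($1)", conn)
{
    Parameters = { new() { Value = order, NpgsqlDbType = NpgsqlDbType.Jsonb } }
};
await command.ExecuteNonQueryAsync();

class Order
{
    public string Customer { get; set; }
    public List<OrderLine> Lines { get; set; } = new();
}

class OrderLine
{
    public string Product { get; set; }
    public int Quantity { get; set; }
}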
Polymorphic JSON mapping
Npgsql uses System.Text.Json to perform the actual serialization and deserialization of your POCO types to JSON. System.Text.Json supports polymorphic serialization, which allows serializing type hierarchies, using a JSON $type property as the type discriminator.
If you're using the json type, you can use System.Text.Json's polymorphic serialization without any extra steps. However, if you're using jsonb (which is generally recommended), you'll run into trouble: System.Text.Json requires that the $type property be at the top of the JSON document, but jsonb does not preserve property order in JSON objects.
System.Text.Json 9.0 brings support for out-of-order metadata reads, an opt-in feature allowing the $type property to appear anywhere in the JSON object. When using Npgsql, you can opt into this when configuring your NpgsqlDataSourceBuilder as follows:
var builder = new NpgsqlDataSourceBuilder("<connection string>");
builder
.EnableDynamicJson()
.ConfigureJsonOptions(new JsonSerializerOptions { AllowOutOfOrderMetadataProperties = true });
await using var dataSource = builder.Build();
Once that's done, you can use JSON polymorphism with jsonb. If you're still targeting .NET 8.0, you can take a reference on System.Text.Json 9.0 in order to use AllowOutOfOrderMetadataProperties.
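As a sketch, assuming a hypothetical Animal/Dog/Cat hierarchy, the standard System.Text.Json polymorphism attributes then work as usual; reading the column as the base type deserializes to the concrete type recorded in the $type discriminator:
// Read jsonb data as the base type; System.Text.Json resolves the concrete type from $type:
await using var command = new NpgsqlCommand("SELECT data FROM test", conn);
await using var reader = await command.ExecuteReaderAsync();
while (await reader.ReadAsync())
{
    var animal = reader.GetFieldValue<Animal>(0);
    Console.WriteLine(animal.Name);
}

// Requires using System.Text.Json.Serialization for the attributes below.
[JsonPolymorphic]
[JsonDerivedType(typeof(Dog), typeDiscriminator: "dog")]
[JsonDerivedType(typeof(Cat), typeDiscriminator: "cat")]
abstract class Animal
{
    public string Name { get; set; }
}

class Dog : Animal
{
    public bool GoodBoy { get; set; }
}

class Cat : Animal
{
}
When writing, keep in mind that System.Text.Json emits the $type discriminator based on the declared (compile-time) type, so you'll typically want to write through a base-typed parameter (for example NpgsqlParameter<Animal>) rather than relying on the inferred runtime type.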
System.Text.Json DOM types
There are cases in which mapping JSON data to POCOs isn't appropriate; for example, your JSON column may not have a fixed schema and may need to be inspected to see what it contains. For these cases, Npgsql supports mapping JSON data to JsonDocument or JsonElement (see docs):
var jsonDocument = JsonDocument.Parse("""{ "a": 8, "b": 9 }""");
// Write a JsonDocument:
await using var command1 = new NpgsqlCommand("INSERT INTO test (data) VALUES ($1)", conn)
{
Parameters = { new() { Value = jsonDocument } }
};
await command1.ExecuteNonQueryAsync();
// Read jsonb data as a JsonDocument:
await using var command2 = new NpgsqlCommand("SELECT data FROM test", conn);
await using var reader = await command2.ExecuteReaderAsync();
while (await reader.ReadAsync())
{
var document = reader.GetFieldValue<JsonDocument>(0);
Console.WriteLine(document.RootElement.GetProperty("a").GetInt32());
}
High-performance JSON parsing with Utf8JsonReader
If you're writing a very performance-sensitive application, using System.Text.Json to deserialize to POCOs or JsonDocument may incur too much overhead. If that's the case, you can use System.Text.Json's Utf8JsonReader to parse JSON data from the database. Utf8JsonReader provides a low-level, forward-only API to parse the JSON data, one token at a time.
Utf8JsonReader requires JSON data as raw, UTF8-encoded binary data; fortunately, Npgsql allows reading jsonb as binary data, and if your PostgreSQL client_encoding is set to UTF8 (the default), you can feed data directly from PostgreSQL to Utf8JsonReader:
await using var command2 = new NpgsqlCommand("SELECT data FROM test", conn);
await using var reader = await command2.ExecuteReaderAsync();
while (await reader.ReadAsync())
{
ParseJson(reader.GetFieldValue<byte[]>(0));
}
void ParseJson(byte[] utf8Data)
{
var jsonReader = new Utf8JsonReader(utf8Data);
// ... parse the data with jsonReader
}
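As a concrete sketch, here is one possible body for ParseJson that pulls out the "a" property from the { "a": 8, "b": 9 } documents used earlier:
// Requires using System.Text.Json.
void ParseJson(byte[] utf8Data)
{
    var jsonReader = new Utf8JsonReader(utf8Data);
    while (jsonReader.Read())
    {
        // Look for the "a" property and print its numeric value:
        if (jsonReader.TokenType == JsonTokenType.PropertyName && jsonReader.ValueTextEquals("a"))
        {
            jsonReader.Read();
            Console.WriteLine(jsonReader.GetInt32());
        }
    }
}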
Note that the above works well for small JSON columns; if you have large columns (above ~8k), consider streaming the JSON data instead. This can be done by passing CommandBehavior.SequentialAccess to ExecuteReaderAsync, and then calling reader.GetStream() on NpgsqlDataReader instead of GetFieldValue<byte[]>. To process streaming data with Utf8JsonReader, see these docs.
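A minimal sketch of the streaming variant (the chunked feeding of the stream into Utf8JsonReader is left to the linked docs; CommandBehavior lives in System.Data):
await using var command = new NpgsqlCommand("SELECT data FROM test", conn);
await using var reader = await command.ExecuteReaderAsync(CommandBehavior.SequentialAccess);
while (await reader.ReadAsync())
{
    // GetStream avoids buffering the whole column value in memory:
    await using var stream = reader.GetStream(0);
    // Feed the stream into Utf8JsonReader in chunks, as described in the System.Text.Json docs.
}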
Newtonsoft Json.NET
System.Text.Json is the built-in, standard way to handle JSON in modern .NET. However, some users still prefer using Newtonsoft Json.NET, and Npgsql includes support for that.
To use Json.NET, add the Npgsql.Json.NET package to your project, and enable the plugin as follows:
Note
NpgsqlDataSource was introduced in Npgsql 7.0, and is the recommended way to manage type mapping. If you're using an older version, see the other methods.
var dataSourceBuilder = new NpgsqlDataSourceBuilder(...);
dataSourceBuilder.UseJsonNet();
await using var dataSource = dataSourceBuilder.Build();
Once you've enabled the feature, you can simply read and write instances of your POCOs directly; when writing, specify NpgsqlDbType.Jsonb to let Npgsql know you intend the value to be sent as JSON data:
// Write a POCO to a jsonb column:
var myPoco1 = new MyPoco { A = 8, B = 9 };
await using var command1 = new NpgsqlCommand("INSERT INTO test (data) VALUES ($1)", conn)
{
Parameters = { new() { Value = myPoco1, NpgsqlDbType = NpgsqlDbType.Jsonb } }
};
await command1.ExecuteNonQueryAsync();
// Read jsonb data as a POCO:
await using var command2 = new NpgsqlCommand("SELECT data FROM test", conn);
await using var reader = await command2.ExecuteReaderAsync();
while (await reader.ReadAsync())
{
var myPoco2 = reader.GetFieldValue<MyPoco>(0);
Console.WriteLine(myPoco2.A);
}
class MyPoco
{
public int A { get; set; }
public int B { get; set; }
}
The plugin also allows you to read JObject/JArray for weakly-typed DOM mapping.
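For example, here is a minimal sketch of weakly-typed reading and writing via JObject, assuming the plugin has been enabled as above (requires using Newtonsoft.Json.Linq):
// Write a JObject to a jsonb column:
var jobject = new JObject { ["a"] = 8, ["b"] = 9 };
await using var command1 = new NpgsqlCommand("INSERT INTO test (data) VALUES ($1)", conn)
{
    Parameters = { new() { Value = jobject, NpgsqlDbType = NpgsqlDbType.Jsonb } }
};
await command1.ExecuteNonQueryAsync();
// Read jsonb data as a JObject:
await using var command2 = new NpgsqlCommand("SELECT data FROM test", conn);
await using var reader = await command2.ExecuteReaderAsync();
while (await reader.ReadAsync())
{
    var document = reader.GetFieldValue<JObject>(0);
    Console.WriteLine(document["a"]);
}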