-
I guess what I really want is to be able to create a ValueConverter<T, Stream> for one of my POCO columns, i.e., I can convert my POCO to/from a stream, and linqtodb will happily accept a stream as a parameter on insert, hand me a stream on read, and let me do the conversion on that column myself.
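For illustration, a minimal sketch of the requested shape, assuming linq2db's existing ValueConverter<TModel, TProvider> base and [ValueConverter] attribute (MyPoco, MyPocoStreamConverter and MyTable are hypothetical names; whether the provider then actually streams the value end to end is exactly the open question, and a working approach is shown in a reply below):

```csharp
using LinqToDB;
using LinqToDB.Common;
using LinqToDB.Mapping;
using System.IO;
using System.Text.Json;

public sealed class MyPoco
{
    public int Value { get; set; }
}

// The desired converter: hand linq2db a Stream on write, get a Stream back on read.
public sealed class MyPocoStreamConverter : ValueConverter<MyPoco, Stream>
{
    public MyPocoStreamConverter()
        : base(p => ToStream(p), s => FromStream(s), false)
    {
    }

    private static Stream ToStream(MyPoco poco)
    {
        var ms = new MemoryStream();
        JsonSerializer.Serialize(ms, poco);
        ms.Position = 0;
        return ms;
    }

    private static MyPoco FromStream(Stream json)
        => JsonSerializer.Deserialize<MyPoco>(json)!;
}

public sealed class MyTable
{
    [PrimaryKey] public int Id { get; set; }

    [ValueConverter(ConverterType = typeof(MyPocoStreamConverter))]
    [Column(DataType = DataType.Text)] public MyPoco? Payload { get; set; }
}
```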
-
The reason I have this request: right now I have an alternative, which is to serialize the JSON into a RecyclableMemoryStream, save it to the DB, and dispose the buffer. But I hit a wall on the reading side. LinqToDB insists on reading the text back to me before I can deserialize. These are big strings that certainly won't be garbage collected easily. I want a stream. Or somehow allow me to give you a stream, or a buffer, so I can manage the memory myself. If you allocate a big string or a big byte[] over the ~85 KB large-object-heap threshold, I am at the mercy of the garbage collector, and my app will crash at an undefined time, which is not good.
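For reference, a minimal sketch of the write-side workaround described here, assuming the Microsoft.IO.RecyclableMemoryStream package and System.Text.Json 6+ (the PooledJsonWriter name is just for illustration):

```csharp
using Microsoft.IO;
using System.IO;
using System.Text.Json;

public static class PooledJsonWriter
{
    private static readonly RecyclableMemoryStreamManager Manager = new();

    // Serialize into pooled buffers and hand the stream to the provider as a
    // parameter; the buffers go back to the pool when the stream is disposed
    // after the INSERT, so nothing lands on the large object heap.
    public static Stream Serialize<T>(T value)
    {
        var ms = Manager.GetStream();
        JsonSerializer.Serialize(ms, value); // writes UTF-8, no intermediate string
        ms.Position = 0;                     // rewind so the provider reads from the start
        return ms;
    }
}
```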
-
I've made some tests. Below is an example that implements streaming support for all four types: json, jsonb, text and binary.

```csharp
using LinqToDB;
using LinqToDB.Common;
using LinqToDB.Data;
using LinqToDB.DataProvider;
using LinqToDB.DataProvider.PostgreSQL;
using LinqToDB.Mapping;
using Npgsql;
using System;
using System.Data.Common;
using System.IO;
using System.Linq;
using System.Linq.Expressions;
using System.Reflection;
using System.Text.Json;

internal class Program
{
    static void Main(string[] args)
    {
        const string connectionString = "...";

        var builder = new NpgsqlDataSourceBuilder(connectionString);
        builder.EnableDynamicJson();
        var ds = builder.Build();

        using var db = new DataConnection(new DataOptions().UseConnectionFactory(CustomProvider.Instance, o => ds.CreateConnection()));
        using var tb = db.CreateTempTable<JsonTable>();

        var record = new JsonTable()
        {
            Id          = 1,
            JsonField   = new JsonValue() { Value = 2 },
            JsonbField  = new JsonValue() { Value = 3 },
            TextField   = new JsonValue() { Value = 4 },
            BinaryField = new JsonValue() { Value = 5 },
        };

        db.Insert(record);

        // JsonTableRaw maps the same table with plain string/byte[] columns,
        // so we can inspect what was actually stored.
        var rawRecord    = db.GetTable<JsonTableRaw>().Single();
        var loadedRecord = tb.Single();
    }
}

public sealed class CustomProvider : PostgreSQLDataProvider
{
    public static readonly IDataProvider Instance = new CustomProvider();

    private CustomProvider() : base(PostgreSQLVersion.v15)
    {
    }

    private static readonly MethodInfo _getFieldValue = typeof(DbDataReader).GetMethod(nameof(DbDataReader.GetFieldValue), BindingFlags.Instance | BindingFlags.Public)!;
    private static readonly MethodInfo _getStream     = typeof(DbDataReader).GetMethod(nameof(DbDataReader.GetStream), BindingFlags.Instance | BindingFlags.Public)!;

    private static readonly ParameterExpression _rd      = Expression.Parameter(typeof(DbDataReader));
    private static readonly ParameterExpression _ordinal = Expression.Parameter(typeof(int));

    public override Expression GetReaderExpression(DbDataReader reader, int idx, Expression readerExpression, Type? toType)
    {
        var typeName = reader.GetDataTypeName(idx);

        if (typeName is "json" or "jsonb")
        {
            // (rd, ordinal) => rd.GetFieldValue<ToType>(ordinal)
            return Expression.Lambda(
                Expression.Call(_rd, _getFieldValue.MakeGenericMethod(toType!), _ordinal),
                _rd, _ordinal);
        }
        else if ((typeName is "text" or "bytea") && toType == typeof(Stream))
        {
            // (rd, ordinal) => rd.GetStream(ordinal)
            return Expression.Lambda(
                Expression.Call(_rd, _getStream, _ordinal),
                _rd, _ordinal);
        }

        return base.GetReaderExpression(reader, idx, readerExpression, toType);
    }
}

[Table(nameof(JsonTable))]
public sealed class JsonTableRaw
{
    [PrimaryKey] public int Id { get; set; }
    [Column] public string? JsonField   { get; set; }
    [Column] public string? JsonbField  { get; set; }
    [Column] public string? TextField   { get; set; }
    [Column] public byte[]? BinaryField { get; set; }
}

public sealed class JsonTable
{
    [PrimaryKey] public int Id { get; set; }

    [Column(DataType = DataType.Json)]       public JsonValue? JsonField  { get; set; }
    [Column(DataType = DataType.BinaryJson)] public JsonValue? JsonbField { get; set; }

    [ValueConverter(ConverterType = typeof(ObjectToStreamConverter<JsonValue>))]
    [Column(DataType = DataType.Text)]       public JsonValue? TextField  { get; set; }

    [ValueConverter(ConverterType = typeof(ObjectToStreamConverter<JsonValue>))]
    [Column(DataType = DataType.VarBinary)]  public JsonValue? BinaryField { get; set; }
}

public sealed class ObjectToStreamConverter<TModel> : ValueConverter<TModel, Stream>
{
    public ObjectToStreamConverter()
        : base(m => ConvertToDatabase(m), s => ConvertToClient(s), false)
    {
    }

    private static Stream ConvertToDatabase(TModel clientValue)
    {
        var ms = new MemoryStream();
        JsonSerializer.Serialize(ms, clientValue);
        ms.Position = 0;
        return ms;
    }

    private static TModel ConvertToClient(Stream json)
    {
        return JsonSerializer.Deserialize<TModel>(json)!;
    }
}

public sealed class JsonValue
{
    public int Value { get; set; }
}
```
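A note on the pattern above: json/jsonb reads go through GetFieldValue<T>, which EnableDynamicJson lets Npgsql deserialize into the POCO directly, while text/bytea only take the GetStream path when the target type is Stream, i.e. when a ValueConverter<T, Stream> is attached to the column. The write side of ObjectToStreamConverter still buffers through a plain MemoryStream; to keep those buffers off the large object heap, it could hand out a pooled stream instead. A sketch, assuming the Microsoft.IO.RecyclableMemoryStream package:

```csharp
using LinqToDB.Common;
using Microsoft.IO;
using System.IO;
using System.Text.Json;

public sealed class PooledObjectToStreamConverter<TModel> : ValueConverter<TModel, Stream>
{
    private static readonly RecyclableMemoryStreamManager Manager = new();

    public PooledObjectToStreamConverter()
        : base(m => ConvertToDatabase(m), s => ConvertToClient(s), false)
    {
    }

    private static Stream ConvertToDatabase(TModel clientValue)
    {
        var ms = Manager.GetStream();              // pooled buffers instead of one large byte[]
        JsonSerializer.Serialize(ms, clientValue); // serialize straight to UTF-8
        ms.Position = 0;                           // rewind so the provider reads from the start
        return ms;
    }

    private static TModel ConvertToClient(Stream json)
    {
        // Deserialize directly from the reader's stream; no intermediate string.
        return JsonSerializer.Deserialize<TModel>(json)!;
    }
}
```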
-
I need help saving/reading big JSON to/from a text column.
For performance reasons, I don't want to read everything into a string and then deserialize it to my POCO.
How do I configure the mapping so I don't allocate large objects (large strings) during retrieval and insert? My JSON can be over 100K, which makes for a big POCO.
I am thinking of using a stream to convert to/from my POCO, but is it possible to plug a stream converter into the process?