【Question】: Owin WebApi Post method with large object
【Posted】: 2018-03-21 07:05:26
【Description】:

I am self-hosting a Web API 2 Windows service with OWIN. It works well in most cases, except that large custom objects cause an OutOfMemoryException on the client (a WinForms application).

Question: how do I POST a large custom object?

The OutOfMemoryException originally occurred at the end of this code, inside JsonConvert.SerializeObject:

using System.Collections.Generic;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;
using Newtonsoft.Json.Converters;
using Newtonsoft.Json.Serialization;

static HttpClient _httpClient = new HttpClient();  

public async Task SaveMyObjectAsync(MyObject largeObject)
{
    var response = await _httpClient.PostAsync("myobjects/route/", new JsonContent(largeObject));
    response.EnsureSuccessStatusCode();
}

public static class JsonSettings
{
    public static readonly JsonSerializerSettings Default =
    new JsonSerializerSettings
    {
        ContractResolver = new DefaultContractResolver(),
        NullValueHandling = NullValueHandling.Ignore,
        ReferenceLoopHandling = ReferenceLoopHandling.Serialize,
        DateTimeZoneHandling = DateTimeZoneHandling.RoundtripKind,
        DateParseHandling = DateParseHandling.DateTimeOffset,
        DateFormatHandling = DateFormatHandling.IsoDateFormat,
        Formatting = Formatting.Indented,
        Converters = new List<JsonConverter>
        {
            new StringEnumConverter(),
        }
    };
}

public class JsonContent : StringContent
{
    public JsonContent(object value) : base(JsonConvert.SerializeObject(value, JsonSettings.Default), Encoding.UTF8, "application/json") {}
}

First attempt

Following this answer, I swapped out the serialization method to write to a local file. This worked for a while, until I realized it had only raised the size limit it could handle. I still get an OutOfMemoryException with larger objects, but now it happens in File.ReadAllText:

public JsonContent(object value) : base(SerializeObjectByStream(value), Encoding.UTF8, "application/json") { }

static string SerializeObjectByStream(object value)
{
    using (TextWriter textWriter = File.CreateText("LocalJsonFile.json"))
    {
        //note: new JsonSerializer() ignores JsonConvert.DefaultSettings,
        // so create the serializer from the settings directly
        using (var jsonWriter = new JsonTextWriter(textWriter))
        {
            var serializer = JsonSerializer.Create(JsonSettings.Default);
            serializer.Serialize(jsonWriter, value);
            jsonWriter.Flush();
        }
    }
    return File.ReadAllText("LocalJsonFile.json");
}
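Reading the file back with File.ReadAllText recreates the giant string, which defeats the purpose of streaming it to disk. One way to avoid materializing the JSON string at all is to let the serializer write directly into the outgoing request stream. An untested sketch, assuming the Microsoft.AspNet.WebApi.Client NuGet package (which provides PushStreamContent) and the same route and JsonSettings as above:

```csharp
// Sketch: serialize straight into the request body so the full JSON
// string is never held in memory. PushStreamContent comes from
// System.Net.Http.Formatting (Microsoft.AspNet.WebApi.Client package).
public async Task SaveMyObjectStreamedAsync(MyObject largeObject)
{
    var content = new PushStreamContent((outputStream, httpContent, transportContext) =>
    {
        using (var writer = new StreamWriter(outputStream))
        using (var jsonWriter = new JsonTextWriter(writer))
        {
            // The serializer writes incrementally; only a buffer's
            // worth of JSON exists in memory at any time.
            var serializer = JsonSerializer.Create(JsonSettings.Default);
            serializer.Serialize(jsonWriter, largeObject);
        }
        // Disposing the writers closes outputStream, which completes the body.
    }, "application/json");

    var response = await _httpClient.PostAsync("myobjects/route/", content);
    response.EnsureSuccessStatusCode();
}
```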

Multipart attempt

Sending such a large object in a single part is probably a bad idea anyway, so I tried using MultipartContent to break it up. Most examples seem to cover reading a multipart request rather than creating one, but this code works for my regular-sized custom objects. Unfortunately, it still throws an OutOfMemoryException for large objects, this time at ser.Serialize(jsonWriter, request) inside the Newtonsoft JsonSerializer.

I also tried a FileStream instead of the MemoryStream to work around the same problem. This time the OutOfMemoryException occurs in _httpClient.PostAsync:

using (var content = new MultipartContent())
{
    using (var stream = new MemoryStream())
    {
        var writer = new StreamWriter(stream);
        var jsonWriter = new JsonTextWriter(writer);
        //note: new JsonSerializer() ignores JsonConvert.DefaultSettings,
        // so create the serializer from the settings directly
        var ser = JsonSerializer.Create(JsonSettings.Default);
        ser.Serialize(jsonWriter, request);
        jsonWriter.Flush();
        stream.Seek(0, SeekOrigin.Begin);
        content.Add(new StreamContent(stream));

        var response = await _httpClient.PostAsync("myobjects/route/", content);
        response.EnsureSuccessStatusCode();
    }
}
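One likely reason PostAsync still fails even with a FileStream is that HttpClient buffers the entire request body up front in order to compute the Content-Length header. A hedged sketch (assuming the server accepts chunked requests, and reusing the "LocalJsonFile.json" file from the first attempt) that switches to chunked transfer encoding so nothing is buffered before sending:

```csharp
// Sketch: chunked transfer encoding tells HttpClient not to buffer the
// body to compute Content-Length; the file is streamed as it is read.
var httpRequest = new HttpRequestMessage(HttpMethod.Post, "myobjects/route/")
{
    Content = new StreamContent(File.OpenRead("LocalJsonFile.json"))
};
httpRequest.Content.Headers.ContentType = new MediaTypeHeaderValue("application/json");
httpRequest.Headers.TransferEncodingChunked = true;

var response = await _httpClient.SendAsync(httpRequest);
response.EnsureSuccessStatusCode();
```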

It seems all I am doing is pushing the data around so the out-of-memory problem surfaces in a different place.
How can I break this large custom object into chunks, while still keeping it in one transaction?

【Comments】:

    Tags: c# asp.net-web-api2 out-of-memory owin dotnet-httpclient


    【Solution 1】:

    This was not as intuitive as I had imagined. The code below seems to behave well (cancellation tokens etc. removed for simplicity). I am a little worried about using a fixed file name for my FileStream, though: since this is async, I suppose two instances could try to create the same file at the same time?

    Switching from JsonSerializer to BinaryFormatter was quite painful, because it meant I had to decorate all of my classes with the [Serializable] attribute. I was also relying on JsonSerializer's behavior of using the default constructor, as described here, which converted null into an empty string for me.

    Client:

    const int MaximumChunkSize = 1024000;
    
    public async Task SaveMyObjectAsync(MyObject largeObject)
    {
        //use fileStream to write the object to disk, 
        // so it does not hold it all in memory at the same time
        using (var stream = new FileStream("LocalStreamFile.json", FileMode.Create))
        {
            //JsonSerializer was unable to serialize a large object
            // without an OOM error, so use BinaryFormatter
            var formatter = new BinaryFormatter();
            formatter.Serialize(stream, largeObject);
            //return to the start of the stream
            stream.Seek(0, SeekOrigin.Begin);
            using (var content = new MultipartContent())
            {
                var buffer = new byte[MaximumChunkSize];
                int bytesRead;
                while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
                {
                    //copy only the bytes actually read, so a short final
                    // chunk does not include stale data from the previous read
                    var chunk = new byte[bytesRead];
                    Array.Copy(buffer, chunk, bytesRead);
                    //add the large object in chunks to the multipart content
                    content.Add(new JsonContent(chunk));
                }
                var response = await _httpClient.PostAsync("myobjects/route/", content);
                response.EnsureSuccessStatusCode();
            }
        }
    }
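    A possible simplification of the chunking loop above (an untested sketch): send each chunk as raw bytes with ByteArrayContent instead of wrapping it in JsonContent, which skips both the base64 inflation of JSON-encoding a byte[] and the per-chunk parse on the server:

```csharp
// Sketch: raw binary parts instead of JSON-encoded byte arrays.
var buffer = new byte[MaximumChunkSize];
int bytesRead;
while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
{
    // Copy only the bytes actually read so a short final chunk
    // does not carry leftover data from the previous read.
    var chunk = new byte[bytesRead];
    Array.Copy(buffer, chunk, bytesRead);
    content.Add(new ByteArrayContent(chunk));
}
```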
    

    Server:

    [HttpPost, Route("myobjects/route/")]
    public async Task<IHttpActionResult> SaveMyObjectAsync()
    {
        if (Request.Content.IsMimeMultipartContent() == false)
        {
            return StatusCode(HttpStatusCode.BadRequest);
        }
        var contentStreamProvider = await Request.Content.ReadAsMultipartAsync();
        using (var stream = new FileStream("LocalStreamFile.json", FileMode.Create))
        {
            foreach (var content in contentStreamProvider.Contents)
            {
                //read out the chunk and convert from json
                var requestArray = JsonConvert.DeserializeObject<byte[]>(await content.ReadAsStringAsync(), JsonSettings.Default);
                stream.Write(requestArray, 0, requestArray.Length);
            }
            stream.Seek(0, SeekOrigin.Begin);
            var formatter = new BinaryFormatter();
            //convert back from stream to our original large object
            var request = (MyObject)formatter.Deserialize(stream);
            //save to database etc
            ...
            return Ok();
        }
    

    【Comments】:
