.NET 6+ WebSocket HttpListener - memory leak #101022

Closed · zxcvqwerasdf opened this issue Apr 14, 2024 · 12 comments · Fixed by #114098
Labels
area-System.Net bug Priority:1 Work that is critical for the release, but we could probably ship without tenet-performance Performance related issue
Milestone
10.0.0

Comments

@zxcvqwerasdf

zxcvqwerasdf commented Apr 14, 2024

Description

The same server build behaves differently on Windows and Linux. On Windows, peak memory usage is ~300 MB, and after GC runs it drops to 180-250 MB on average. On Linux, RAM usage reaches ~1.3 GB after 5 minutes (and keeps growing).
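
The issue does not say how memory was measured; a minimal in-process logger (a hypothetical addition, not part of the original repro) would make the Windows/Linux numbers comparable from the same build:

_ = Task.Run(async () =>
{
    while (true)
    {
        // Environment.WorkingSet is the process working set in bytes;
        // GC.GetTotalMemory(false) reads the managed heap size without forcing a collection.
        Console.WriteLine($"WorkingSet: {Environment.WorkingSet / (1024 * 1024)} MB, " +
                          $"GC heap: {GC.GetTotalMemory(false) / (1024 * 1024)} MB");
        await Task.Delay(5000);
    }
});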

Reproduction Steps

usings:

using System;
using System.Collections.Generic;
using System.Net;
using System.Net.WebSockets;
using System.Text;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;

Server:

internal class Program
{
    static async Task Main(string[] args)
    {
        var listener = new HttpListener();

        listener.Prefixes.Add("http://localhost:7123/");
        listener.Start();

        while (true)
        {
            var context = await listener.GetContextAsync();
            if (context.Request.IsWebSocketRequest)
            {
                _ = Task.Run(() => ProcessWebSocketRequest(context));
            }
            else
            {
                context.Response.StatusCode = 400;
                context.Response.Close();
            }
        }
    }

    static async Task ProcessWebSocketRequest(HttpListenerContext context)
    {
        var websocketContext = await context.AcceptWebSocketAsync(null);
        var websocket = websocketContext.WebSocket;
        var buffer = new byte[256].AsMemory();

        while (websocket.State == WebSocketState.Open)
        {
            try
            {
                var result = await websocket.ReceiveAsync(buffer, CancellationToken.None);
                if (result.MessageType == WebSocketMessageType.Text)
                {
                    string message = Encoding.UTF8.GetString(buffer.Span.Slice(0, result.Count));
                    var jsonData = new Dictionary<string, object>();
                    jsonData.Add("type", "response");
                    jsonData.Add("body", message);
                    string serialized = JsonSerializer.Serialize(jsonData);

                    var responseBuffer = Encoding.UTF8.GetBytes(serialized);
                    await websocket.SendAsync(responseBuffer, WebSocketMessageType.Text, true, CancellationToken.None);
                }
                else if (result.MessageType == WebSocketMessageType.Close)
                {
                    await websocket.CloseAsync(WebSocketCloseStatus.NormalClosure, null, CancellationToken.None);
                    break;
                }
            }
            catch (Exception ex)
            {
                // Receive/send failures (e.g. aborted connections) are intentionally swallowed in this repro.
                //Console.WriteLine(ex.Message);
            }
        }
        websocket.Dispose();
    }

}

Client(s):

internal class Program
{
    private static Random s_random = new Random(); // note: a shared Random is not thread-safe; acceptable for this repro
    static async Task Main(string[] args)
    {
        Task[] tasks = new Task[2000];
        for (int i = 0; i < tasks.Length; i++)
        {
            tasks[i] = Task.Run(Tick);
        }
        Console.Read();
    }

    static async Task Tick()
    {
        while(true)
        {

            var clientWebSocket = new ClientWebSocket();
            clientWebSocket.Options.SetRequestHeader("Accept-Encoding", "en");
            clientWebSocket.Options.SetRequestHeader("Pragma", "no-cache");
            clientWebSocket.Options.SetRequestHeader("User-Agent", "MyUserAgent");
            clientWebSocket.Options.KeepAliveInterval = TimeSpan.FromHours(20);
            clientWebSocket.Options.Cookies = new CookieContainer();

            await clientWebSocket.ConnectAsync(new Uri("ws://localhost:7123/"), CancellationToken.None);
            int i = 0;
            while (clientWebSocket.State == WebSocketState.Open && i++ < 4)
            {
                var jsondata = new Dictionary<string, object>();
                jsondata.Add("type", "request");
                byte[] rndBytes = new byte[16];
                s_random.NextBytes(rndBytes);
                jsondata.Add("body", new Guid(rndBytes).ToString());

                string serialized = JsonSerializer.Serialize(jsondata);
                await clientWebSocket.SendAsync(Encoding.UTF8.GetBytes(serialized), WebSocketMessageType.Text, true, CancellationToken.None);

                var rcvBuffer = new byte[256].AsMemory();
                var receiveResult = await clientWebSocket.ReceiveAsync(rcvBuffer, CancellationToken.None);
                string message = Encoding.UTF8.GetString(rcvBuffer.Span.Slice(0, receiveResult.Count));

                await Task.Delay(1000);
            }
            await clientWebSocket.CloseAsync(WebSocketCloseStatus.NormalClosure, null, CancellationToken.None);
            clientWebSocket.Dispose();
        }
    }
}

Expected behavior

Same memory usage as on Windows
[image: stable memory usage on Windows]

Actual behavior

Memory leak?
[image: growing memory usage on Linux]

Regression?

Same behavior on .NET 8 (Microsoft.NETCore.App 8.0.4).

Known Workarounds

No response

Configuration

Windows:
Windows 10 x64, Version 10.0.19045 Build 19045, 22H2
Microsoft.NETCore.App 6.0.27
Linux:
Debian 11 (5.10.0-28-amd64 Debian 5.10.209-2 (2024-01-31) x86_64 GNU/Linux)
Microsoft.NETCore.App 6.0.27

Other information

No response

@ghost ghost added the area-System.Net label Apr 14, 2024
@dotnet-policy-service dotnet-policy-service bot added the untriaged New issue has not been triaged by the area owner label Apr 14, 2024
Contributor

Tagging subscribers to this area: @dotnet/ncl
See info in area-owners.md if you want to be subscribed.

@janvorli
Member

@zxcvqwerasdf thank you for sharing the source code for the repro! I will give it a try. Did both the Windows and Linux machines you were using have similar specs (memory, number of CPU cores)?

@zxcvqwerasdf
Author

> @zxcvqwerasdf thank you for sharing the source code for the repro! I will give it a try. Did both the Windows and Linux machines you were using have similar specs (memory, number of CPU cores)?

One is a KVM VPS with 2 cores and 4 GB RAM, the second is a KVM VPS with 1 core and 1 GB RAM, and the third is a VirtualBox VM with 8 cores and 16 GB RAM. The behavior is the same on all of them.

@janvorli
Member

I was able to repro the memory growth on Linux with .NET 9 too (haven't tested Windows). I've taken a dump after the memory consumption reached about 1.2 GB. There is some managed leak; just look at the list of GC heap objects with the largest counts below: over 46,000 live HttpListenerContext instances doesn't look healthy.

          MT   Count   Total size Class Name
7f0df7216658  48,775   1,951,000 System.Net.CookieCollection
7f0df6c8e708  46,801   2,246,448 System.Net.HttpResponseStream
7f0df6a811b0  46,802   2,246,496 System.Collections.Concurrent.ConcurrentDictionary<System.IntPtr, System.Net.Sockets.SocketAsyncEngine+SocketAsyncContextWrapper>+Node
7f0df6acb008  46,801   2,620,856 System.Net.Sockets.NetworkStream
7f0df653a820  46,876   2,625,056 System.Uri
7f0df6f504d0  46,876   2,625,056 System.Uri+MoreInfo
7f0df69c20f8  46,802   2,995,328 System.Net.Sockets.SafeSocketHandle
7f0df6acb2b0  46,802   2,995,328 System.Threading.TimerCallback
7f0df68ca7d0  93,677   2,997,664 System.Net.IPEndPoint
7f0df6ace120  46,885   3,000,640 System.IO.MemoryStream
7f0df60cfbe0  46,801   3,369,672 System.Net.HttpListenerContext
7f0df666a768  93,680   3,747,200 System.Net.IPAddress
7f0df6668b40  46,876   3,750,080 System.Uri+UriInfo
7f0df5f55ee8 188,660   4,527,840 System.Object
7f0df6acbad8  48,971   4,701,216 System.Threading.TimerQueueTimer
7f0df62231c8  46,801   4,867,304 System.Net.HttpListenerResponse
7f0df6222aa8  46,801   5,616,120 System.Net.HttpListenerRequest
7f0df68cd430  46,802   5,616,240 System.Net.Sockets.Socket
7f0df6c56ca0  93,602   5,990,528 System.Action<System.Int32, System.Memory<System.Byte>, System.Net.Sockets.SocketFlags, System.Net.Sockets.SocketError>
7f0df6ac4928  46,801   5,990,528 System.Net.Sockets.SocketAsyncContext+BufferMemoryReceiveOperation
7f0df69cf100  46,802   7,113,904 System.Net.Sockets.SocketAsyncContext
7f0df6c50028  95,576   7,646,080 System.Collections.Specialized.NameValueCollection
7f0df6229a88  46,801   8,236,976 System.Net.HttpConnection
7f0df6c50580  93,602   8,985,792 System.Net.WebHeaderCollection
7f0df6227c08 189,340  13,632,480 System.Collections.Hashtable
7f0df6c85b70 530,603  16,979,296 System.Collections.Specialized.NameObjectCollectionBase+NameObjectEntry
7f0df6c53878 768,714  24,598,848 System.Collections.ArrayList
7f0df6ac5fe8  93,602  27,706,192 System.Net.Sockets.Socket+AwaitableSocketAsyncEventArgs
7f0df5f5c690 626,468  31,446,328 System.Object[]
7f0df5fee1b0 987,564  43,140,882 System.String
7f0df622aaf0 189,514  63,799,152 System.Collections.Hashtable+Bucket[]
561a0279e040 392,244 115,535,736 Free
7f0df6514928 144,358 405,768,214 System.Byte[]

Here is a summary of the memory usage; you can see that the GC heap is 830 MB:

 +----------------------------------------------------------------------+ 
 | Memory Type         |          Count |         Size |   Size (bytes) | 
 +----------------------------------------------------------------------+ 
 | GCHeap              |            209 |     830.46mb |    870,801,408 | 
 | Stack               |             33 |     250.20mb |    262,352,896 | 
 | Image               |            500 |      83.30mb |     87,348,736 | 
 | PAGE_READWRITE      |            147 |      33.25mb |     34,869,248 | 
 | GCBookkeeping       |              6 |       7.93mb |      8,314,880 | 
 | HighFrequencyHeap   |             55 |       3.41mb |      3,575,808 | 
 | LowFrequencyHeap    |             30 |       2.32mb |      2,437,120 | 
 | FixupPrecodeHeap    |             98 |       1.53mb |      1,605,632 | 
 | LoaderCodeHeap      |              2 |       1.00mb |      1,048,576 | 
 | PAGE_EXECUTE_READ   |             50 |     788.00kb |        806,912 | 
 | HandleTable         |              3 |     192.00kb |        196,608 | 
 | PAGE_READONLY       |             41 |     164.00kb |        167,936 | 
 | NewStubPrecodeHeap  |              4 |      64.00kb |         65,536 | 
 | CacheEntryHeap      |              1 |      40.00kb |         40,960 | 
 | IndirectionCellHeap |              1 |      24.00kb |         24,576 | 
 | StubHeap            |              1 |      12.00kb |         12,288 | 
 | GCHeapToBeFreed     |              1 |       4.00kb |          4,096 | 
 +----------------------------------------------------------------------+ 
 | [TOTAL]             |          1,182 |       1.19gb |  1,273,673,216 | 
 +----------------------------------------------------------------------+
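
For reference, the comment does not say which tools produced these listings; they resemble the output of the SOS dumpheap -stat command (first listing) and the maddress command (table above), which can be run against a process with dotnet-dump, e.g.:

dotnet-dump collect --process-id <pid>   # capture a dump of the running server
dotnet-dump analyze <dump-file>          # open an interactive SOS session
> dumpheap -stat                         # per-type heap histogram
> maddress                               # memory-region summary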

@MihaZupan MihaZupan added this to the 9.0.0 milestone May 7, 2024
@MihaZupan MihaZupan removed the untriaged New issue has not been triaged by the area owner label May 7, 2024
@karelz karelz added bug tenet-performance Performance related issue labels Jun 25, 2024
@karelz
Member

karelz commented Jun 25, 2024

@CarnaViire will you be able to take a look at it?

@CarnaViire CarnaViire assigned CarnaViire and rokonec and unassigned CarnaViire Jun 27, 2024
@rokonec
Member

rokonec commented Jul 23, 2024

@zxcvqwerasdf We have identified the issue and found a workaround. We are considering a fix, but we are unsure when it will actually ship.
Meanwhile, please add context.Response.Abort(); in the server right after websocket.Dispose(); so it looks like this:

websocket.Dispose();
context.Response.Abort();

This code works on both Windows and Linux.
We apologize for the inconvenience caused.
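
Applied to the repro server above, the important detail is that Abort() must run even when the receive loop exits through an exception. A minimal sketch of the reworked handler (a restructuring for illustration, not code posted in the thread):

static async Task ProcessWebSocketRequest(HttpListenerContext context)
{
    var websocketContext = await context.AcceptWebSocketAsync(null);
    var websocket = websocketContext.WebSocket;
    try
    {
        var buffer = new byte[256].AsMemory();
        while (websocket.State == WebSocketState.Open)
        {
            var result = await websocket.ReceiveAsync(buffer, CancellationToken.None);
            if (result.MessageType == WebSocketMessageType.Close)
            {
                await websocket.CloseAsync(WebSocketCloseStatus.NormalClosure, null, CancellationToken.None);
                break;
            }
            // ... handle Text messages as in the original repro ...
        }
    }
    finally
    {
        websocket.Dispose();
        // The workaround: Abort() unregisters the context from the listener's
        // internal tracking, so the HttpListenerContext and everything it
        // references become collectible.
        context.Response.Abort();
    }
}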

@rokonec rokonec added the Priority:2 Work that is important, but not critical for the release label Jul 23, 2024
@rokonec
Member

rokonec commented Jul 23, 2024

Result of analysis:

private readonly Dictionary<HttpListenerContext, HttpListenerContext> _listenerContexts = new Dictionary<HttpListenerContext, HttpListenerContext>();

This dictionary keeps every HttpListenerContext instance alive indefinitely, for the lifetime of the HttpListener instance.
In the non-WebSocket scenario the entry is removed by context.Response.Close();, but calling that when context.Request.IsWebSocketRequest is neither supported nor documented, so with the current codebase the context has to be freed by context.Response.Abort() once the particular HttpListenerContext is no longer needed. Abort() calls httpConnection.Close(true), which cleans up all references to the given HttpConnection.

The appealing, and most probably correct, option is to clean everything up in websocket.Dispose(), including the used HttpConnection.
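
To confirm this retention path in a dump of your own process, the standard SOS commands should show it (a suggested verification step, not taken from the thread):

> dumpheap -type System.Net.HttpListenerContext   # list the retained context instances
> gcroot <address-of-one-instance>                # expect a root path through the HttpListener's _listenerContexts dictionary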

@rokonec rokonec removed their assignment Jul 23, 2024
@janvorli
Member

cc: @mangod9 - this is a memory leak reported a couple of months ago where the culprit has just been figured out.

@mangod9
Member

mangod9 commented Jul 23, 2024

Ah, thanks for the FYI. Is this .NET 6 only, or does it occur in 8 too?

@janvorli
Member

I believe .NET 9 too, as I have reported above that I could repro it in 9.

@rokonec
Member

rokonec commented Jul 23, 2024

@mangod9 I believe it is 6+. I have reproduced it with 9 main.

@karelz karelz modified the milestones: 9.0.0, Future Jul 25, 2024
@karelz
Member

karelz commented Jul 25, 2024

Triage: This is a problem in HttpListener, not in ClientWebSocket. HttpListener is rarely used and this is the first report in years, so moving to Future.
Workaround: abort the response - see above.

@karelz karelz changed the title .NET 6 Linux ClientWebSocket memory leak .NET 6+ WebSocket HttpListener - memory leak Jul 25, 2024
@rokonec rokonec added Priority:1 Work that is critical for the release, but we could probably ship without and removed Priority:2 Work that is important, but not critical for the release labels Mar 25, 2025
@rokonec rokonec modified the milestones: Future, 10.0.0 Apr 1, 2025