Memory Leak #3086

Open
sunshineplan opened this issue Jul 19, 2024 · 0 comments
sunshineplan (Contributor) commented:

What version of V2Ray are you using?

V2Ray 5.16.1

What's your scenario of using V2Ray?

Set as HTTPS_PROXY env
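In other words, clients that honor HTTPS_PROXY send their HTTPS traffic through the local HTTP inbound shown in the client configuration below. A minimal Go sketch of that behavior, purely for illustration (the real clients on my machine are not necessarily Go programs):

package main

import (
	"fmt"
	"net/http"
	"os"
)

func main() {
	// Point HTTPS traffic at v2ray's local HTTP inbound (port 1080 in the client config).
	os.Setenv("HTTPS_PROXY", "http://127.0.0.1:1080")

	// http.DefaultTransport resolves the proxy via ProxyFromEnvironment,
	// so this request is tunneled through v2ray with a CONNECT.
	resp, err := http.Get("https://example.com")
	if err != nil {
		fmt.Println(err)
		return
	}
	defer resp.Body.Close()
	fmt.Println(resp.Status)
}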

What problems have you encountered?

Memory Leak

Please attach your configuration here

Server configuration:

{
  "protocol": "vmess",
  "port": XXX,
  "settings": {
    "clients": [
      {
        "id": "XXXXX"
      }
    ]
  },
  "streamSettings": {
    "network": "http",
    "security": "tls",
    "tlsSettings": {
      "certificates": [
        {
          "certificateFile": "XXXXX",
          "keyFile": "XXXX"
        }
      ]
    }
  }
}

...

"transport": {
  "httpSettings": {
    "path": "XXX"
  }
},

Client configuration:

"inbounds": [
    {
      "tag": "http",
      "port": 1080,
      "listen": "0.0.0.0",
      "protocol": "http",
      "sniffing": {
        "enabled": true,
        "destOverride": [
          "http",
          "tls"
        ]
      }
    }
],
 "outbounds": [
    {
      "tag": "h2",
      "protocol": "vmess",
      "settings": {
        "vnext": [
          {
            "address": "XXXX",
            "port": XXX,
            "users": [
              {
                "id": "XXXX",
                "security": "auto"
              }
            ]
          }
        ]
      },
      "streamSettings": {
        "network": "http",
        "security": "tls"
      }
    }
  ],
  "transport": {
    "httpSettings": {
      "path": "XXXX"
    }
  }

Other configurations (such as Nginx) and logs here

http://localhost:6060/debug/pprof/
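The profile below comes from Go's standard net/http/pprof handlers. For reference, a minimal sketch of how such an endpoint can be exposed on port 6060 (how it is actually wired into my build is not shown here, so treat this as illustrative):

package main

import (
	"log"
	"net/http"
	_ "net/http/pprof" // registers the /debug/pprof/ handlers on the default mux
)

func main() {
	// The goroutine profile is then available at
	// http://localhost:6060/debug/pprof/goroutine?debug=1
	log.Println(http.ListenAndServe("localhost:6060", nil))
}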

goroutine profile: total 3011
1431 @ 0x104606f28 0x10461a4d4 0x104a68224 0x104a693d0 0x104a020ec 0x104ad9c58 0x104ad8458 0x104ad7c60 0x104ad7c24 0x104640a84
#	0x104a68223	github.com/v2fly/v2ray-core/v5/transport/pipe.(*pipe).ReadMultiBuffer+0x63	github.com/v2fly/v2ray-core/v5/transport/pipe/impl.go:91
#	0x104a693cf	github.com/v2fly/v2ray-core/v5/transport/pipe.(*Reader).ReadMultiBuffer+0x1f	github.com/v2fly/v2ray-core/v5/transport/pipe/reader.go:16
#	0x104a020eb	github.com/v2fly/v2ray-core/v5/common/buf.(*BufferedReader).Read+0xeb		github.com/v2fly/v2ray-core/v5/common/buf/reader.go:76
#	0x104ad9c57	golang.org/x/net/http2.(*clientStream).writeRequestBody+0x3d7			golang.org/x/[email protected]/http2/transport.go:1876
#	0x104ad8457	golang.org/x/net/http2.(*clientStream).writeRequest+0x7b7			golang.org/x/[email protected]/http2/transport.go:1569
#	0x104ad7c5f	golang.org/x/net/http2.(*clientStream).doRequest+0x1f				golang.org/x/[email protected]/http2/transport.go:1436
#	0x104ad7c23	golang.org/x/net/http2.(*ClientConn).roundTrip.func1+0x23			golang.org/x/[email protected]/http2/transport.go:1312

1417 @ 0x104606f28 0x10461a4d4 0x104ad7ae8 0x104ad7660 0x104ad309c 0x104ad3089 0x104ad2c20 0x10491c20c 0x10491bbfc 0x10491da24 0x105086c8c 0x105086c7d 0x104a1ec40 0x104a96154 0x104fbb46c 0x104f3c34c 0x104fb9474 0x104a95020 0x104a70778 0x104a6f8e0 0x104640a84
#	0x104ad7ae7	golang.org/x/net/http2.(*ClientConn).roundTrip.func2+0x107				golang.org/x/[email protected]/http2/transport.go:1328
#	0x104ad765f	golang.org/x/net/http2.(*ClientConn).roundTrip+0x47f					golang.org/x/[email protected]/http2/transport.go:1418
#	0x104ad309b	golang.org/x/net/http2.(*ClientConn).RoundTrip+0x21b					golang.org/x/[email protected]/http2/transport.go:1293
#	0x104ad3088	golang.org/x/net/http2.(*Transport).RoundTripOpt+0x208					golang.org/x/[email protected]/http2/transport.go:617
#	0x104ad2c1f	golang.org/x/net/http2.(*Transport).RoundTrip+0x1f					golang.org/x/[email protected]/http2/transport.go:575
#	0x10491c20b	net/http.send+0x4ab									net/http/client.go:259
#	0x10491bbfb	net/http.(*Client).send+0x9b								net/http/client.go:180
#	0x10491da23	net/http.(*Client).do+0x6b3								net/http/client.go:724
#	0x105086c8b	net/http.(*Client).Do+0x5ab								net/http/client.go:590
#	0x105086c7c	github.com/v2fly/v2ray-core/v5/transport/internet/http.Dial+0x59c			github.com/v2fly/v2ray-core/v5/transport/internet/http/dialer.go:144
#	0x104a1ec3f	github.com/v2fly/v2ray-core/v5/transport/internet.Dial+0x1cf				github.com/v2fly/v2ray-core/v5/transport/internet/dialer.go:55
#	0x104a96153	github.com/v2fly/v2ray-core/v5/app/proxyman/outbound.(*Handler).Dial+0xeb3		github.com/v2fly/v2ray-core/v5/app/proxyman/outbound/handler.go:276
#	0x104fbb46b	github.com/v2fly/v2ray-core/v5/proxy/vmess/outbound.(*Handler).Process.func1+0xab	github.com/v2fly/v2ray-core/v5/proxy/vmess/outbound/outbound.go:65
#	0x104f3c34b	github.com/v2fly/v2ray-core/v5/common/retry.(*retryer).On+0xbb				github.com/v2fly/v2ray-core/v5/common/retry/retry.go:27
#	0x104fb9473	github.com/v2fly/v2ray-core/v5/proxy/vmess/outbound.(*Handler).Process+0x103		github.com/v2fly/v2ray-core/v5/proxy/vmess/outbound/outbound.go:63
#	0x104a9501f	github.com/v2fly/v2ray-core/v5/app/proxyman/outbound.(*Handler).Dispatch+0x38f		github.com/v2fly/v2ray-core/v5/app/proxyman/outbound/handler.go:174
#	0x104a70777	github.com/v2fly/v2ray-core/v5/app/dispatcher.(*DefaultDispatcher).routedDispatch+0x887	github.com/v2fly/v2ray-core/v5/app/dispatcher/default.go:336
#	0x104a6f8df	github.com/v2fly/v2ray-core/v5/app/dispatcher.(*DefaultDispatcher).Dispatch.func1+0x36f	github.com/v2fly/v2ray-core/v5/app/dispatcher/default.go:234

26 @ 0x104606f28 0x104600748 0x10463a540 0x10467a288 0x10467b5d0 0x10467b5c1 0x104862c08 0x104873e34 0x104a1e944 0x104a01ea4 0x104a01e71 0x104a02c08 0x1049ff318 0x1049ff520 0x104f46384 0x104f45c70 0x1049fd8d4 0x104640a84
#	0x10463a53f	internal/poll.runtime_pollWait+0x9f							runtime/netpoll.go:345
#	0x10467a287	internal/poll.(*pollDesc).wait+0x27							internal/poll/fd_poll_runtime.go:84
#	0x10467b5cf	internal/poll.(*pollDesc).waitRead+0x1ff						internal/poll/fd_poll_runtime.go:89
#	0x10467b5c0	internal/poll.(*FD).Read+0x1f0								internal/poll/fd_unix.go:164
#	0x104862c07	net.(*netFD).Read+0x27									net/fd_posix.go:55
#	0x104873e33	net.(*conn).Read+0x33									net/net.go:179
#	0x104a1e943	github.com/v2fly/v2ray-core/v5/transport/internet.(*StatCouterConnection).Read+0x33	github.com/v2fly/v2ray-core/v5/transport/internet/connection.go:40
#	0x104a01ea3	github.com/v2fly/v2ray-core/v5/common/buf.(*Buffer).ReadFrom+0xd3			github.com/v2fly/v2ray-core/v5/common/buf/buffer.go:254
#	0x104a01e70	github.com/v2fly/v2ray-core/v5/common/buf.ReadBuffer+0xa0				github.com/v2fly/v2ray-core/v5/common/buf/reader.go:30
#	0x104a02c07	github.com/v2fly/v2ray-core/v5/common/buf.(*SingleReader).ReadMultiBuffer+0x27		github.com/v2fly/v2ray-core/v5/common/buf/reader.go:158
#	0x1049ff317	github.com/v2fly/v2ray-core/v5/common/buf.copyInternal+0x47				github.com/v2fly/v2ray-core/v5/common/buf/copy.go:81
#	0x1049ff51f	github.com/v2fly/v2ray-core/v5/common/buf.Copy+0x8f					github.com/v2fly/v2ray-core/v5/common/buf/copy.go:104
#	0x104f46383	github.com/v2fly/v2ray-core/v5/proxy/http.(*Server).handleConnect.func1+0xf3		github.com/v2fly/v2ray-core/v5/proxy/http/server.go:190
#	0x104f45c6f	github.com/v2fly/v2ray-core/v5/proxy/http.(*Server).handleConnect.OnSuccess.func4+0x2f	github.com/v2fly/v2ray-core/v5/common/task/task.go:12
#	0x1049fd8d3	github.com/v2fly/v2ray-core/v5/common/task.Run.func1+0x33				github.com/v2fly/v2ray-core/v5/common/task/task.go:28

I am using a MacBook Pro (M2) that I almost never shut down; I only close the lid. I have noticed that after long periods of use, v2ray's memory usage gradually increases. Profiling with pprof produced the output above.

The issue seems to be related to pipe.ReadMultiBuffer. I suspect that some pipes never exit because their idle connections are never reactivated. I temporarily modified the code so the read loop gives up after an idle timeout, and this resolved the issue.

func (p *pipe) ReadMultiBuffer() (buf.MultiBuffer, error) {
	for {
		data, err := p.readMultiBufferInternal()
		if data != nil || err != nil {
			p.writeSignal.Signal()
			return data, err
		}

		timer := time.NewTimer(15 * time.Minute) // added: idle timeout
		select {
		case <-p.readSignal.Wait():
		case <-p.done.Wait():
		case <-timer.C: // added: give up once the pipe has been idle too long
			return nil, buf.ErrReadTimeout // added
		}
		timer.Stop() // added
	}
}
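If a timeout is the way to go, one possible refinement (only a sketch, untested; it reuses the identifiers from the patch above and assumes they behave as shown there) is to keep a single timer alive for the whole read loop instead of allocating a new one on every wakeup:

func (p *pipe) ReadMultiBuffer() (buf.MultiBuffer, error) {
	const idleTimeout = 15 * time.Minute

	timer := time.NewTimer(idleTimeout)
	defer timer.Stop()

	for {
		data, err := p.readMultiBufferInternal()
		if data != nil || err != nil {
			p.writeSignal.Signal()
			return data, err
		}

		select {
		case <-p.readSignal.Wait():
		case <-p.done.Wait():
		case <-timer.C:
			return nil, buf.ErrReadTimeout
		}

		// Restart the idle clock after every wakeup; drain the channel first
		// so a stale expiry cannot fire on the next iteration.
		if !timer.Stop() {
			select {
			case <-timer.C:
			default:
			}
		}
		timer.Reset(idleTimeout)
	}
}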

So, is there a more elegant way to close these pipes, or is a timeout like the one I added the right approach?
