
Hyperbahn received fatal exception #22

Open
jc-fireball opened this issue Oct 2, 2015 · 5 comments

@jc-fireball

A Hyperbahn node received a fatal exception. Here is the stack trace:

TchannelResponseAlreadyDoneError: cannot send error frame, response already done in state: 3
    at Object.createError [as ResponseAlreadyDone] (/var/cache/udeploy/r/hyperbahn/sjc1-produ-0000000057-v1/node_modules/hyperbahn/node_modules/tchannel/node_modules/error/typed.js:31:22)
    at V2OutResponse.sendError (/var/cache/udeploy/r/hyperbahn/sjc1-produ-0000000057-v1/node_modules/hyperbahn/node_modules/tchannel/out_response.js:209:43)
    at TChannelV2Handler.onReqError (/var/cache/udeploy/r/hyperbahn/sjc1-produ-0000000057-v1/node_modules/hyperbahn/node_modules/tchannel/v2/handler.js:908:17)
    at DefinedEvent.onReqError [as listener] (/var/cache/udeploy/r/hyperbahn/sjc1-produ-0000000057-v1/node_modules/hyperbahn/node_modules/tchannel/v2/handler.js:91:14)
    at DefinedEvent.emit (/var/cache/udeploy/r/hyperbahn/sjc1-produ-0000000057-v1/node_modules/hyperbahn/node_modules/tchannel/lib/event_emitter.js:86:14)
    at TChannelConnection.buildResponse (/var/cache/udeploy/r/hyperbahn/sjc1-produ-0000000057-v1/node_modules/hyperbahn/node_modules/tchannel/connection_base.js:211:24)
    at buildResponse (/var/cache/udeploy/r/hyperbahn/sjc1-produ-0000000057-v1/node_modules/hyperbahn/node_modules/tchannel/connection_base.js:202:21)
    at onArg23 (/var/cache/udeploy/r/hyperbahn/sjc1-produ-0000000057-v1/node_modules/hyperbahn/node_modules/tchannel/request-handler.js:57:19)
    at compatCall (/var/cache/udeploy/r/hyperbahn/sjc1-produ-0000000057-v1/node_modules/hyperbahn/node_modules/tchannel/streaming_in_request.js:106:9)
    at end (/var/cache/udeploy/r/hyperbahn/sjc1-produ-0000000057-v1/node_modules/hyperbahn/node_modules/tchannel/node_modules/run-parallel/index.js:16:15)
    at done (/var/cache/udeploy/r/hyperbahn/sjc1-produ-0000000057-v1/node_modules/hyperbahn/node_modules/tchannel/node_modules/run-parallel/index.js:20:10)
    at each (/var/cache/udeploy/r/hyperbahn/sjc1-produ-0000000057-v1/node_modules/hyperbahn/node_modules/tchannel/node_modules/run-parallel/index.js:26:7)
    at signalReady (/var/cache/udeploy/r/hyperbahn/sjc1-produ-0000000057-v1/node_modules/hyperbahn/node_modules/ready-signal/index.js:21:26)
    at StreamArg.finish (/var/cache/udeploy/r/hyperbahn/sjc1-produ-0000000057-v1/node_modules/hyperbahn/node_modules/tchannel/argstream.js:257:9)
    at StreamArg.emit (events.js:117:20)
    at _stream_readable.js:943:16
    at process._tickDomainCallback (node.js:463:13)
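
For anyone landing here: the error means tchannel tried to write an error frame on an out response that had already reached a terminal state, i.e. two code paths raced to finish the same response. A rough sketch of that kind of guard (illustrative only; the constructor, state names and numbering here are assumptions, not the actual out_response.js source):

    // Illustrative response state machine with a "done" guard; names and
    // numbering are assumed for this sketch.
    var States = { Initial: 0, Streaming: 1, Done: 3 };

    function OutResponse() {
        this.state = States.Initial;
    }

    OutResponse.prototype.sendError = function sendError(codeString, message) {
        if (this.state === States.Done) {
            // The fatal path from the stack trace above: an error frame is
            // being sent after the response has already completed.
            throw new Error('cannot send error frame, response already done in state: ' +
                this.state);
        }
        // ... write the error frame to the connection ...
        this.state = States.Done;
    };

    // The crash needs two independent completions of the same response,
    // e.g. a timeout handler and the arg2/arg3 completion callback:
    var res = new OutResponse();
    res.sendError('Timeout', 'request timed out');   // first completion wins
    res.sendError('UnexpectedError', 'late error');  // second one throws fatally
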
@Raynos

Raynos commented Oct 12, 2015

See uber/tchannel#1286

@jwolski

jwolski commented Oct 31, 2015

I want to note that we've seen this in other forms in Ringpop testing. During a deploy of 1200 instances using Ringpop with TChannel 3.5.16, we're seeing some Sentry errors:

out_response.js: responseAlreadyDone detected!!
Error: stack
    at SentryLogger.log (/node_modules/logtron/node_modules/sentry-logger/index.js:145:38)
    at LoggerStream.write [as _write] (/node_modules/logtron/backends/logger-stream.js:59:17)
    at doWrite (/node_modules/logtron/node_modules/readable-stream/lib/_stream_writable.js:279:12)
    at writeOrBuffer (/node_modules/logtron/node_modules/readable-stream/lib/_stream_writable.js:266:5)
    at LoggerStream.Writable.write (/node_modules/logtron/node_modules/readable-stream/lib/_stream_writable.js:211:11)
    at LoggerStream.write (/node_modules/logtron/backends/logger-stream.js:50:37)
    at parallelWrite (/node_modules/logtron/lib/parallel-write.js:48:20)
    at Logger.writeEntry (/node_modules/logtron/logger.js:191:5)
    at Logger.log [as error] (/node_modules/logtron/log-method.js:41:14)
    at V2OutResponse.responseAlreadyDone (/node_modules/tchannel/out_response.js:209:21)
    at onArg23 (/node_modules/tchannel/request-handler.js:57:36)
    at compatCall (/node_modules/tchannel/streaming_in_request.js:106:9)
    at end (/node_modules/tchannel/node_modules/run-parallel/index.js:16:15)
    at done (/node_modules/tchannel/node_modules/run-parallel/index.js:20:10)
    at each (/node_modules/tchannel/node_modules/run-parallel/index.js:26:7)
    at signalReady (/node_modules/tchannel/node_modules/ready-signal/index.js:21:26)
    at StreamArg.finish (/node_modules/tchannel/argstream.js:257:9)
    at StreamArg.emit (events.js:117:20)
    at _stream_readable.js:943:16
    at process._tickDomainCallback (node.js:463:13)

@Raynos

Raynos commented Nov 1, 2015

@jwolski

Looks like we've downgraded this from a fatal thrown exception to a logger.error().

Not quite sure what the cause is, but that log should have a lot of context. I suspect it's something to do with timeouts and large requests/responses.
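
Roughly, the downgrade looks like this (a sketch of the pattern only, not the exact out_response.js code; the helper and field names are illustrative, chosen to match the Sentry log and context quoted in this thread):

    // Sketch: report a double completion through a logtron-style logger
    // instead of throwing a fatal ResponseAlreadyDone error.
    function sendErrorFrame(res, logger, codeString, message, info) {
        if (res.state === 3) { // response already in a terminal state
            logger.error('responseAlreadyDone detected!!', {
                codeString: codeString,
                errMessage: message,
                currentState: res.state,
                callerName: info.callerName,
                serviceName: info.serviceName
            });
            return;
        }
        // ... otherwise write the error frame as before ...
    }
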

@jwolski

jwolski commented Nov 2, 2015

The context is here:

callerName: ringpop
codeString: Timeout
currentState: 3
errMessage: request timed out after 1591ms (limit was 1500ms)
listener: False
listeners: 0
serviceName: ringpop
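
Those numbers are consistent with the race sketched below: the per-request timeout (1500ms limit) completes the response first, then the arg2/arg3 completion lands at ~1591ms and touches the same, already-done response. Schematic Node-only sketch (assumed flow, not tchannel internals; timings taken from the context above):

    // Two paths racing to finish one response.
    var LIMIT_MS = 1500;
    var responseDone = false;

    // Path 1: the per-request timeout fires first and sends a Timeout
    // error frame, marking the response done.
    setTimeout(function onTimeout() {
        responseDone = true;
        console.log('timeout path: sent Timeout error frame at %dms', LIMIT_MS);
    }, LIMIT_MS);

    // Path 2: the incoming arg streams finish slightly later and the handler
    // path tries to act on the very same response.
    setTimeout(function onArg23() {
        if (responseDone) {
            // This is where the fatal ResponseAlreadyDone error used to be
            // thrown; after the downgrade it is only logged.
            console.log('arg2/arg3 path: response already done, skipping');
            return;
        }
        console.log('arg2/arg3 path: building the real response');
    }, 1591);
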

@Raynos

Raynos commented Nov 2, 2015

@jwolski Pretty sure I have a PR for this.

uber/tchannel-node#164
