Self Checks
Dify version: 1.13.3

Cloud or Self Hosted: Self Hosted (Docker)
Steps to reproduce
- Create a workflow with a Code (Python) node.
- Inside `main()`, use Python's standard `logging` module with its default configuration (i.e. `logging.basicConfig()` with no `filename` argument, which writes to `sys.stderr` / the console by default).
- Let the code run for several seconds (e.g. by calling an external API via `requests`).
- Observe the node status.
Minimal reproducer:

```python
import logging

logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s - %(levelname)s - %(message)s'
)
logger = logging.getLogger(__name__)

def main(arg1):
    logger.info("Starting task...")
    logger.info("Connecting to external service...")
    # ... long-running work ...
    logger.info("Done.")
    return {"result": "ok"}
```
✔️ Expected Behavior
- The node completes with status SUCCESS and returns the dict from `main()`.
- Or, if stderr output interferes with result parsing, Dify shows a clear, actionable error message such as:

  ```text
  Run failed: Failed to parse result — non-JSON content detected in stdout/stderr.
  ```
❌ Actual Behavior
- The node status is FAIL.
- The Dify UI shows no error message — only the runtime duration and token count (0).
- The Python code itself completes successfully: all `logger.info()` lines print, and the function reaches its `return` statement.
- There is no indication in the UI that the cause is logging output polluting the result-parsing channel.
Root Cause (investigated)
The sandbox wraps user code and reads `stdout` (and/or merged `stderr`) to extract the JSON-serialized return value of `main()`. When `logging.basicConfig()` is used without a `filename`, log lines are written to `stderr`. The sandbox captures these alongside the actual return value, causing JSON parsing of the output stream to fail silently.
The result channel looks like this to the sandbox parser:

```text
2026-04-28 07:56:43,522 - INFO - Starting task...
2026-04-28 07:56:43,660 - INFO - Connecting to external service...
2026-04-28 07:56:50,979 - INFO - Done.
{"result": "ok"}   <-- actual return value, buried at the end
```
The parser fails to isolate the final JSON line and silently marks the node as FAIL.
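To make the failure mode concrete, here is a minimal, self-contained sketch (not Dify's actual parser code) showing why feeding the whole captured stream to a JSON parser fails, while scanning backwards for the last parseable line recovers the result:

```python
import json

# Hypothetical captured stream: stderr log lines merged ahead of the
# JSON-serialized return value of main().
captured = """2026-04-28 07:56:43,522 - INFO - Starting task...
2026-04-28 07:56:43,660 - INFO - Connecting to external service...
2026-04-28 07:56:50,979 - INFO - Done.
{"result": "ok"}"""

# A naive parser that treats the whole stream as JSON fails:
try:
    json.loads(captured)
    parsed_whole = True
except json.JSONDecodeError:
    parsed_whole = False  # this appears to be the path the sandbox takes

# Scanning backwards for the last line that parses as JSON recovers the result:
result = None
for line in reversed(captured.splitlines()):
    try:
        result = json.loads(line)
        break
    except json.JSONDecodeError:
        continue

print(parsed_whole, result)  # False {'result': 'ok'}
```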
Why This Is a UX Problem (not just user error)
- Python's `logging` module is the standard, recommended way to instrument code. Using it is not a mistake.
- The node shows FAIL with zero error detail — there is no way for a user to know the cause without deep investigation.
- The sandbox's output-channel contract (stdout = return value only) is completely undocumented.
- Debugging required: running the workflow, checking raw logs, and eventually discovering — by accident — that removing `logging` fixes the issue.
Suggested Fixes
Option A (preferred — robust fix): Separate result transport from logging output. Instead of reading the return value from stdout, the sandbox wrapper should write the JSON result to a dedicated file descriptor or temp file, leaving stdout/stderr free for user logging.
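A hedged sketch of what Option A could look like in the sandbox wrapper; `run_with_result_file` and its signature are illustrative names for this report, not Dify's actual API:

```python
import json
import os
import tempfile

def run_with_result_file(user_main, arg):
    """Pass the JSON result through a dedicated temp file, so
    stdout/stderr stay free for user logging (sketch of Option A)."""
    fd, path = tempfile.mkstemp(suffix=".json")
    os.close(fd)
    try:
        # Wrapper side: run user code and write the result to the file.
        with open(path, "w") as f:
            json.dump(user_main(arg), f)
        # Host side: read the result back from the dedicated channel.
        with open(path) as f:
            return json.load(f)
    finally:
        os.unlink(path)
```

With this transport, a `main()` that prints or logs freely can no longer corrupt the result channel.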
Option B (simpler fix): In the sandbox wrapper, explicitly redirect `sys.stderr` to `/dev/null` or a log file before executing user code, so only stdout is captured. Then scan stdout for the last valid JSON line.
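Option B might be sketched as follows; again the function names are illustrative, not Dify's code:

```python
import contextlib
import io
import json

def extract_result(stdout_text):
    # Scan stdout from the end for the last line that parses as JSON
    # (the parsing half of Option B).
    for line in reversed(stdout_text.splitlines()):
        line = line.strip()
        if not line:
            continue
        try:
            return json.loads(line)
        except json.JSONDecodeError:
            continue
    raise ValueError("no JSON result found in stdout")

def run_sandboxed(user_main, arg):
    # Silence stderr while user code runs, so logging cannot leak into
    # the captured channel; capture stdout and parse it for the result.
    out = io.StringIO()
    with contextlib.redirect_stderr(io.StringIO()), \
         contextlib.redirect_stdout(out):
        result = user_main(arg)
        print(json.dumps(result))
    return extract_result(out.getvalue())
```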
Option C (minimum viable): Detect when result parsing fails due to non-JSON content in the output stream and surface a clear error message to the user, e.g.:

```text
Run failed: could not parse return value.
Hint: avoid writing to stdout/stderr directly (print, logging)
as it may interfere with result extraction.
```
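Option C could be as small as wrapping the existing parse step; `parse_result` is a hypothetical name used here for illustration:

```python
import json

def parse_result(raw_output):
    # Turn a bare parse failure into an actionable error (sketch of Option C).
    try:
        return json.loads(raw_output)
    except json.JSONDecodeError as exc:
        raise RuntimeError(
            "Run failed: could not parse return value. "
            "Hint: avoid writing to stdout/stderr directly (print, logging) "
            "as it may interfere with result extraction."
        ) from exc
```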
Workaround (for users until fixed):

```python
import logging

logging.basicConfig(
    filename='/tmp/mycode.log',  # redirect log output to a file
    level=logging.INFO
)
```
Additional Context
This bug is particularly hard to diagnose because:
- The code node's "上次运行" (last run) panel does show the logging output — making it appear the code ran correctly.
- The FAIL status contradicts what the logs show.
- `SANDBOX_WORKER_TIMEOUT` and `CODE_EXECUTION_READ_TIMEOUT` are irrelevant red herrings a user will chase first.