
Issue #1: Nested FlowUnit Wrapping in Neural Network Layers

@Shyanil

Description


When an input is passed through the NeuralNetwork layers, FlowUnit objects are wrapped repeatedly instead of keeping a single wrapper. This produces outputs like FlowUnit(FlowUnit(...)), which is incorrect.

Steps to Reproduce

  1. Initialize the neural network:
    X = [2.0, 3.0, -1.0]
    n = NeuralNetwork(3, [4, 4, 1])
  2. Forward pass through layers:
    layer_1_output = n.layers[0](X)
    print("First Layer Output:", layer_1_output)
    
    layer_2_output = n.layers[1](layer_1_output)
    print("Second Layer Output:", layer_2_output)
    
    final_output = n.layers[2](layer_2_output)
    print("Final Output:", final_output)
    
    print("The final output")
    n(X)
  3. Observed Output:
    First Layer Output: [FlowUnit(-0.7321), FlowUnit(1.5557), FlowUnit(-1.0274), FlowUnit(-1.6452)]
    Second Layer Output: [FlowUnit(FlowUnit(0.9508)), FlowUnit(FlowUnit(-2.0511)), ...]
    Final Output: FlowUnit(FlowUnit(FlowUnit(-1.1022)))
    

Expected Behavior

Each layer should return a list of FlowUnit objects that are not nested inside one another. Instead of FlowUnit(FlowUnit(...)), the final output should be:

Final Output: FlowUnit(-1.1022)

Possible Cause

  • Each layer may be wrapping its inputs in FlowUnit() without first checking whether they are already FlowUnit objects, so every forward pass adds another level of wrapping.
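A minimal sketch of the suspected cause and a possible guard against it. `FlowUnit`, its `data` attribute, and the helper `ensure_flow_unit` are assumed names for illustration; the actual implementation in this repository may differ.

```python
class FlowUnit:
    """Simplified stand-in for the library's FlowUnit wrapper."""

    def __init__(self, data):
        # If a layer calls FlowUnit(x) on a value that is already a
        # FlowUnit, the wrapper nests: FlowUnit(FlowUnit(...)).
        self.data = data

    def __repr__(self):
        return f"FlowUnit({self.data})"


def ensure_flow_unit(x):
    # Guard: wrap raw numbers exactly once, and pass existing
    # FlowUnit objects through unchanged instead of re-wrapping.
    return x if isinstance(x, FlowUnit) else FlowUnit(x)


# A raw float is wrapped once.
print(ensure_flow_unit(2.0))            # FlowUnit(2.0)
# An already-wrapped value is returned as-is, not nested again.
print(ensure_flow_unit(FlowUnit(2.0)))  # FlowUnit(2.0)
```

If each layer used a guard like `ensure_flow_unit` on its inputs, the output of layer 1 could be fed to layer 2 without accumulating extra wrappers.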
