
Continue signal refactors. #122

Open · wants to merge 4 commits into base: main

Conversation

ericsnekbytes (Collaborator)

Unwind functional args approach and switch to signals for main widget <--> FssTreeItem communication.
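The change described above replaces callback functions passed down as constructor arguments with signals that the main widget connects to. The actual frontend code is TypeScript (and likely uses Lumino's `@lumino/signaling`); the following is only a minimal, language-agnostic sketch of the pattern in Python, with all class and attribute names invented for illustration:

```python
# Hypothetical sketch of the signal refactor: instead of threading
# callbacks through constructor args, each tree item exposes a signal
# and the main widget subscribes to it. Names are illustrative only.
class Signal:
    def __init__(self):
        self._slots = []

    def connect(self, slot):
        self._slots.append(slot)

    def emit(self, *args):
        for slot in self._slots:
            slot(*args)


class FssTreeItem:
    def __init__(self, path):
        self.path = path
        self.clicked = Signal()  # the main widget connects to this

    def handle_click(self):
        self.clicked.emit(self.path)


class MainWidget:
    def __init__(self):
        self.last_clicked = None

    def watch(self, item):
        item.clicked.connect(self.on_item_clicked)

    def on_item_clicked(self, path):
        self.last_clicked = path
```

The design benefit is that `FssTreeItem` no longer needs to know anything about the widget that observes it; the coupling runs one way, through the signal.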

@ericsnekbytes ericsnekbytes added the maintenance General tasks for project upkeep label Mar 20, 2025
martindurant (Member) left a comment
Glad to see you are working on this - I was trying to dig through the TS code a little myself.

I am happy to see the FS creation functions moved from globals in the helper module.

You maybe did exactly this, but: Can we please ensure to pass the kwargs to the frontend in the /jupyter_fsspec/config endpoint, so that the open block looks like

with fsspec.open(path, {kwargs}) as f:
    ...

where the kwargs need to be filled out.

Note that this will need documentation explaining that tokens/credentials should not be stored in the jupyter-fsspec config unless the whole deployment is isolated behind a firewall. The documentation should also make clear that, where authentication with storage is automatic (env vars, local config files, etc.), such open blocks will only work when the kernel also has access to the same env vars or config files.
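One way to read the request above is that the snippet shown to the user should be generated from the kwargs returned by the `/jupyter_fsspec/config` endpoint. A small sketch, assuming a hypothetical helper name (`render_open_snippet` is not an existing function in the extension):

```python
def render_open_snippet(path, kwargs):
    """Format an fsspec.open(...) snippet with the storage kwargs
    returned by the /jupyter_fsspec/config endpoint (hypothetical helper)."""
    rendered = ", ".join(f"{key}={value!r}" for key, value in kwargs.items())
    args = f"{path!r}, {rendered}" if rendered else repr(path)
    return f"with fsspec.open({args}) as f:\n    ..."
```

For example, `render_open_snippet("s3://bucket/file.csv", {"anon": False})` produces an open block with `anon=False` already filled in, so the user can paste it into a kernel directly.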

@@ -190,6 +190,7 @@ async def post(self):
:rtype: dict
"""
request_data = json.loads(self.request.body.decode("utf-8"))
print(f"FTRANSFER data\n{request_data}")
Member:

Should be logging
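A sketch of the suggested fix, assuming the extension uses a standard library logger (the logger name `jupyter_fsspec` is an assumption, not taken from the codebase):

```python
import logging

# Module-level logger; the name "jupyter_fsspec" is assumed here.
logger = logging.getLogger("jupyter_fsspec")


def log_transfer(request_data):
    # Instead of print(f"FTRANSFER data\n{request_data}"), emit at DEBUG
    # so the output is suppressed by default and controllable via the
    # server's logging configuration.
    logger.debug("FTRANSFER data\n%s", request_data)
```

Using `%s` lazy formatting also avoids building the message string when DEBUG is disabled.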

Collaborator Author:

Stray debugging statement, it's been removed 👍

@ericsnekbytes
Copy link
Collaborator Author

The helper was (and still is) using a manager-singleton approach to obtain fs instances. There are many other potential solutions here, but the underlying principle is that the helper should not be reading the config file or parsing and constructing fs instances itself, IMO; that should be delegated to the parts of the codebase that already understand how a filesystem is defined inside the jupyter_fsspec config file (a de facto standard invented by the extension).

In this PR, fs construction has been pulled out into two utility functions on the manager because that's probably the easiest way to achieve that goal given the current backend architecture.

Whether the extension is operating in a server-side context (using async fs instances) or in a client/kernel-side context (synchronous, as here), the jupyter_fsspec config is always the source of truth for what a named filesystem is. The same underlying machinery for fs construction should therefore be used in both contexts, IMO, to ensure consistent behavior across the extension. There are options for alternative solutions/refactors if we want to pursue them at some point.
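The shape of the idea above can be sketched as a manager that owns the config-to-filesystem resolution for both contexts. This is a stdlib-only illustration: the class name, the `sources` config shape, and `resolve` are assumptions, and the real extension would finish by calling `fsspec.filesystem(...)` rather than returning the resolved pieces:

```python
# Hypothetical sketch: one place resolves a named entry from the
# jupyter_fsspec config into the (protocol, kwargs) used to build a
# filesystem, whether the caller wants a sync (kernel-side) or async
# (server-side) instance.
class FileSystemManager:
    def __init__(self, config):
        # Assumed config shape: {"sources": [{"name": ..., "protocol": ..., "kwargs": {...}}]}
        self._config = {entry["name"]: entry for entry in config["sources"]}

    def resolve(self, name, asynchronous=False):
        entry = self._config[name]
        kwargs = dict(entry.get("kwargs", {}))
        if asynchronous:
            kwargs["asynchronous"] = True  # fsspec's async construction flag
        # The real extension would now call fsspec.filesystem(entry["protocol"], **kwargs);
        # returning the resolved pieces keeps this sketch dependency-free.
        return entry["protocol"], kwargs
```

Because both the server and kernel paths funnel through the same `resolve`, a named filesystem means the same thing everywhere, which is the consistency goal described above.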

> You maybe did exactly this, but: Can we please ensure to pass the kwargs to the frontend in the /jupyter_fsspec/config endpoint, so that the open block looks like

I'll chat with Rosio about this and see if she maybe wants to pick up these edits (she's the original author of this functionality), and tagging #107 here so the details are linked :)

@ericsnekbytes ericsnekbytes marked this pull request as ready for review March 20, 2025 18:23
danyeaw (Contributor) commented Mar 21, 2025

I gave this a run this morning and it seemed to work well. Do we know why the integration tests are failing? Is this something I introduced yesterday?

RRosio (Collaborator) commented Mar 21, 2025

I don't think the changes from the last few days would have caused these failures. It might be an issue with the test itself; I will try running and debugging it locally.
