
Conversation

eddiebergman (Contributor) commented Feb 20, 2025

WIP

Re-adds graphs :)

Currently we have very fast graph sampling and model building, which is PyTorch agnostic (a toy sketch of the sampling idea follows the sample code below).
This works with random search (which randomly samples graphs), both in ask-and-tell mode and via neps.run().

It turns out the runtime supports this with no changes, which is nice.

PR #179 was merged into this branch to re-introduce the TorchWLKernel, thanks to @vladislavalerievich. This should provide BO support for graphs, but some more work is required to integrate the new Grammar with it.


Sample code for now:

from __future__ import annotations

from dataclasses import dataclass

import rich

import neps


# The model: a simple callable that just returns its string
@dataclass
class T:
    s: str

    def __call__(self) -> str:
        return self.s


def join(*s: str) -> str:
    """Combine children's strings, e.g. join("a", "b") == "(a, b)"."""
    return "(" + ", ".join(s) + ")"


def run_pipeline(**config):
    string = config["gr"]  # the sampled grammar string
    rich.print("config", config["gr"])
    model = grammar.to_model(string)  # build the model described by the string
    rich.print("model", model)
    return 1  # dummy objective for this example


grammar = neps.Grammar.from_dict(
    start_symbol="s",
    grammar={
        # Non-terminal with a function that combines the children's outputs
        "s": (["a", "b", "p a", "p p"], join),
        # Non-terminal given only its productions, no combiner
        "p": ["a b", "s"],
        # Terminals map to callables that build the leaf models
        "a": T("a"),
        "b": T("b"),
    },
)


space = neps.SearchSpace(
    {
        "fl": neps.Float(0, 1),
        "gr": grammar,
    },
)

neps.run(
    run_pipeline,
    space,
    optimizer="random_search",
    root_directory="blah",
    max_evaluations_total=10,
    overwrite_working_directory=True,
)
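
The sample above drives everything through neps.run(). To make the (productions, combiner) convention of the grammar dict concrete, here is a toy, assumption-laden sketch of how such a grammar could be sampled into a string. It reuses T and join from the sample, sample_string is a made-up helper name, and the handling of non-terminals without a combiner (plain whitespace joining) is a guess; it is not the NePS implementation.

# Toy sketch only, not the NePS implementation.
# Assumes `T` and `join` from the sample code above are in scope.
import random


def sample_string(symbol: str, rules: dict, rng: random.Random) -> str:
    rule = rules[symbol]
    if callable(rule):
        # Terminal: a callable (here a `T` instance) producing its string.
        return rule()
    if isinstance(rule, tuple):
        # Non-terminal with an explicit combiner for its children's outputs.
        productions, combine = rule
    else:
        # Non-terminal without a combiner: concatenate the children (an assumption).
        productions = rule
        combine = lambda *parts: " ".join(parts)
    # Pick a production and recursively expand each of its symbols.
    # With this grammar, random expansion terminates with probability 1;
    # a real sampler would likely bound the recursion depth.
    production = rng.choice(productions)
    children = [sample_string(s, rules, rng) for s in production.split()]
    return combine(*children)


rules = {
    "s": (["a", "b", "p a", "p p"], join),
    "p": ["a b", "s"],
    "a": T("a"),
    "b": T("b"),
}

rng = random.Random(0)
for _ in range(3):
    print(sample_string("s", rules, rng))  # e.g. "(a)" or "(a b, a)"

The resulting strings play roughly the role of config["gr"] in run_pipeline above, which the real Grammar turns into a model via to_model().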
