
Gradient-enhanced surrogates #156

Open · 1 of 2 tasks
ChrisRackauckas opened this issue Jun 15, 2020 · 7 comments

@ChrisRackauckas (Member) commented Jun 15, 2020

@ludoro (Contributor) commented Jun 15, 2020

Nice, thanks. These could be good assignments for the MLH students.

@vikram-s-narayan (Contributor) commented
I would like to work on Gradient Enhanced Kriging.

@vikram-s-narayan (Contributor) commented May 26, 2022

@ChrisRackauckas and @ranjanan - I've coded up a very rough version of GEKPLS as a bunch of functions here.

All of my code is a translation of the SMT Python code. The code is not fully tested, and there are still some kinks and bugs that I'm working out, but I thought I'd share this early and get feedback :)

In the example provided in the gist, the underlying function simply returns x1^2 + x2^2 + x3^2 (given an array with components x1, x2, and x3).
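For reference, a minimal Julia sketch of that test setup. The names, the sample points, and the inclusion of an analytic gradient (which GEKPLS also consumes) are illustrative assumptions here, not code taken from the gist:

```julia
# Test function from the example above: f(x) = x1^2 + x2^2 + x3^2.
sphere(x) = x[1]^2 + x[2]^2 + x[3]^2
# Its analytic gradient, ∇f(x) = [2x1, 2x2, 2x3], since GEKPLS also uses gradients.
sphere_grad(x) = [2x[1], 2x[2], 2x[3]]

# A few placeholder sample points in [-1, 1]^3 with their values and gradients.
X = 2 .* rand(10, 3) .- 1
y = [sphere(X[i, :]) for i in 1:size(X, 1)]
grads = [sphere_grad(X[i, :]) for i in 1:size(X, 1)]
```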

If this looks okay, I can add it as a surrogate and begin refining and optimizing the code.

Apart from the above RFC, I also have a question:

I'm using ScikitLearn.jl's PLS (@sk_import cross_decomposition: PLSRegression). This will require adding the following line to the main Surrogates.jl file: __precompile__(false)

This is needed because of this issue in ScikitLearn.jl.

I hope this is okay?
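For context, a minimal sketch of how the ScikitLearn.jl import mentioned above would be wired in. The sample data, the n_components value, and the variable names are assumptions for illustration, not code from the gist:

```julia
# As noted above, __precompile__(false) would be added to the main Surrogates.jl
# module file because of the ScikitLearn.jl issue referenced.

using ScikitLearn
@sk_import cross_decomposition: PLSRegression   # loads the Python class via PyCall

X = rand(20, 3)                     # placeholder samples
y = vec(sum(X .^ 2, dims = 2))      # matches the x1^2 + x2^2 + x3^2 example above

pls = PLSRegression(n_components = 2)
ScikitLearn.fit!(pls, X, y)
W = pls.x_rotations_                # the projection directions GEKPLS uses
```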

@ChrisRackauckas (Member, Author) commented

It's a start. We won't want the final version to use ScikitLearn, as that would cause some packaging issues (PyCall is hard to build into sysimages, for example). But to get a working version, that's a good way to start; then add some tests and replace pieces one by one.

@vikram-s-narayan (Contributor) commented

OK. I'll start cleaning this up and look for an alternative to the ScikitLearn PLS. Thanks!

@vikram-s-narayan (Contributor) commented
@ChrisRackauckas - I've created a draft pull request for GEKPLS with some basic tests added. It still uses the SKLearn PLS regressor, which I plan to replace. Regarding a Julia PLS regressor, there is one, PartialLeastSquaresRegressor.jl, but it has a few issues (e.g., it does not expose an attribute equivalent to the 'x_rotations' we use from SKLearn's PLS).

Hence, I'm now planning to write our own PLS function based on the SKLearn PLS, taking only the parts that are needed for GEKPLS. Is this approach of writing our own PLS function okay?
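As a rough illustration of what such a stripped-down PLS could look like, here is a minimal NIPALS-style PLS1 sketch that computes only the x-rotations matrix. The function name, the use of pinv, and the single-response restriction are assumptions for this sketch, not the code in the draft PR:

```julia
using LinearAlgebra

# Minimal PLS1 (NIPALS-style) sketch returning only the x-rotations matrix,
# i.e. the piece of the SKLearn PLSRegression that GEKPLS consumes.
function pls1_x_rotations(X::AbstractMatrix, y::AbstractVector, n_components::Integer)
    Xd = float.(X)                        # working (deflated) copy of X, n × p
    yd = float.(y)                        # working (deflated) copy of y
    p  = size(Xd, 2)
    W  = zeros(p, n_components)           # weight vectors
    P  = zeros(p, n_components)           # x-loadings
    for k in 1:n_components
        w = Xd' * yd                      # direction of max covariance with y
        w ./= norm(w)
        t = Xd * w                        # scores for this component
        pk = (Xd' * t) ./ dot(t, t)       # x-loadings for this component
        W[:, k] = w
        P[:, k] = pk
        Xd .-= t * pk'                    # deflate X
        yd .-= t .* (dot(t, yd) / dot(t, t))  # deflate y
    end
    # x_rotations = W * (P' W)^-1, mapping original features to latent scores
    return W * pinv(P' * W)
end
```

Whether a routine like this matches SKLearn's x_rotations closely enough (sign conventions, deflation details, multi-output support) would still need to be checked against the SMT reference implementation.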

@ChrisRackauckas (Member, Author) commented

That sounds great.
