ENH: multithreaded model functions #40

Open
albop opened this issue Aug 9, 2017 · 4 comments


albop commented Aug 9, 2017

@sglyon: I did some tests and suspect we would get sizeable performance gains just by adding `Threads.@threads` in front of the loop iterating over points.
Any particular reason not to do it? It should of course be possible to deactivate, since one might want to lock functions to a single thread.
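For concreteness, this is the kind of change I have in mind. It is only a sketch with made-up names (`evaluate!`, `model_fun`, the argument layout), not the actual generated code:

```julia
# Minimal sketch with made-up names; the real generated function differs.
# The only change is the Threads.@threads annotation on the point loop.
function evaluate!(out, model_fun, points, params)
    Threads.@threads for i in 1:size(points, 2)
        # each column of `points` is one evaluation point
        out[:, i] = model_fun(view(points, :, i), params)
    end
    return out
end
```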


sglyon commented Aug 9, 2017

That's great that you saw good speedups.

I've had good and bad experiences with threads in Julia. When it works, it is very nice and gives a good performance boost. However, I often see it fail because `Threads.@threads` introduces type-inference problems. Under the hood, `Threads.@threads` creates a closure out of the body of the loop and then calls that closure on multiple threads for the values in the loop range. Julia has a known issue where variables that are closed over become boxed, and type inference basically gives up on them (I can find links to issues in the JuliaLang/julia repo if you are interested, but for now I'll take the easy way out).
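To illustrate the boxing problem with a toy example (not Dolang code, just the pattern that bites):

```julia
# Toy example of the closure-boxing issue (not Dolang code).
# Threads.@threads turns the loop body into a closure; because `s` is
# reassigned inside that closure, it is captured as a Core.Box and
# inference gives up on its type. (The accumulation is also racy,
# which is a separate problem.)
function boxed_sum(x)
    s = zero(eltype(x))
    Threads.@threads for i in eachindex(x)
        s += x[i]
    end
    return s
end

# @code_warntype boxed_sum(rand(100)) shows `s` captured as a Core.Box.
```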

Can you think of a way to preserve performance and give the user the ability to opt-in to the multithreaded version?


albop commented Aug 9, 2017

I'm not sure. I would guess an option `vectorize=:threads` (or `vectorize=:simd` for `@inbounds @fastmath @simd`) to `make_function` would be enough.
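Roughly, something like this at code-generation time. Just a sketch: `wrap_loop` is a made-up helper, not the actual `make_function` internals:

```julia
# Hypothetical helper, not actual Dolang code: wrap the generated point loop
# according to a `vectorize` keyword passed to make_function.
function wrap_loop(loop_expr::Expr, vectorize::Symbol=:none)
    if vectorize == :threads
        :(Threads.@threads $loop_expr)
    elseif vectorize == :simd
        :(@inbounds @fastmath @simd $loop_expr)
    else
        loop_expr
    end
end

# e.g. wrap_loop(:(for n in 1:N; out[:, n] = f(view(x, :, n), p); end), :threads)
```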


sglyon commented Aug 10, 2017

Ahh yes, that makes sense. If we put it as an option at that level, then there shouldn't be any problem.


sglyon commented Aug 10, 2017

Also, I would say there is no reason not to implement it as an option. If the default is to have it off, then it is non-breaking and adds a cool feature.
