Replies: 1 comment
Have you tried using Claude or ChatGPT to get started with a solid framework for your scaling issue? I'd bet that would at least get you heading in the right direction. My opinion is that ML/DL code is, by nature, written to be iterated over and over, because you're expected to run over datasets multiple times while training and testing. So it should just be a matter of developing your training code for one dataset, then wrapping it in a for loop that iterates over an entire directory of datasets later on.
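A minimal sketch of that "train once, loop over everything" idea. The `train_model` / `train_all` names and the mean-based toy forecast are purely illustrative stand-ins for whatever real model you already train per article; in practice the dict would be built by globbing a data directory.

```python
from statistics import mean

def train_model(sales):
    """Toy 'model': forecast the next day as the mean of past sales.
    Stands in for your real per-article training routine."""
    avg = mean(sales)
    return lambda: avg  # a trained model is just something you can call

def train_all(datasets):
    """Train one model per article.

    datasets maps article name -> sales history; with 4 articles or
    200, the loop is the same."""
    return {article: train_model(sales) for article, sales in datasets.items()}

# Hypothetical data; in practice you'd load one frame per file in a
# directory (e.g. one sheet/CSV per article) and build this dict.
datasets = {
    "sausage": [10, 12, 11],
    "bacon": [5, 7, 6],
}
models = train_all(datasets)
print(models["sausage"]())  # mean of [10, 12, 11] -> 11
```

The key design point is that scaling from 4 to 200 articles then only changes the contents of `datasets`, not the training code.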
Hey everyone 👋
I’m currently working on a project where I need to forecast meat sales per product (article) — sounds simple, but it’s getting a bit messy.
Right now, I have an Excel sheet with 4 product articles. I train a separate model for each one (so 4 models in total).
But soon there might be 150–200 articles, and I’ll need to automate everything — so that each model updates every 2 weeks or once a month and gives daily forecasts.
Basically, I need to figure out:
1. how to build a scalable pipeline (so I don’t have to train hundreds of models manually),
2. and later, how to handle MLOps (deployment, updates, retraining, etc.).
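For point 1, one common pattern is a small "registry" of last-trained dates per article, checked on a schedule (cron, Airflow, or similar) so stale models get retrained automatically. A stdlib-only sketch, assuming the 2-week cadence from the post; the function and variable names are made up for illustration:

```python
from datetime import date, timedelta

RETRAIN_EVERY = timedelta(days=14)  # assumed cadence: every 2 weeks

def needs_retrain(last_trained, today):
    """True when an article's model is stale or has never been trained."""
    return last_trained is None or today - last_trained >= RETRAIN_EVERY

def retrain_due(registry, today):
    """registry maps article -> date the model was last trained (or None).
    Returns the articles whose models should be retrained today."""
    return [article for article, last in registry.items()
            if needs_retrain(last, today)]

# Hypothetical state for three articles:
registry = {
    "sausage": date(2024, 1, 1),   # 15 days old -> due
    "bacon": date(2024, 1, 10),    # 6 days old -> fresh
    "ham": None,                   # never trained -> due
}
print(retrain_due(registry, date(2024, 1, 16)))  # ['sausage', 'ham']
```

A scheduler then only has to call `retrain_due` periodically and loop over the returned articles, which keeps the "hundreds of models" problem down to one loop plus a date table.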
I’m pretty new to this side of ML engineering, and of course, time is tight (don’t even ask how I ended up in this situation 😅).
I’m looking for someone who could maybe point me in the right direction, share some experience, or even mentor or collaborate on this.
Any advice is appreciated — where to look for information, what tools or frameworks might fit, or how to structure things so it scales nicely.
If needed, I can share a small example of the data and part of my code.
Thanks in advance to anyone who takes the time to help 🙏