Discussion about this post

OdedM:

Worth checking out MLEM (Python, open source), which can really ease model deployment, especially if you need FastAPI serving. It saves a ton of boilerplate. Docs and info: https://mlem.ai/
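For readers unfamiliar with MLEM, here is a minimal sketch of the workflow described above, assuming MLEM's mlem.api.save helper; names, paths, and exact flags are illustrative and may differ between MLEM versions:

    # Minimal sketch: save a trained model with MLEM so it can be served later.
    # Assumes `pip install mlem scikit-learn`; model and path names are illustrative.
    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier

    from mlem.api import save

    X, y = load_iris(return_X_y=True, as_frame=True)
    model = RandomForestClassifier().fit(X, y)

    # MLEM records the model together with its input schema (inferred from
    # sample_data), which is what removes the hand-written request/response
    # boilerplate when serving.
    save(model, "models/rf", sample_data=X)

Serving is then done from the command line, e.g. something like `mlem serve fastapi --model models/rf` in recent MLEM releases, which spins up a FastAPI app with prediction endpoints generated from the saved input schema.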

Mihail-Iulian Pleșa:

Thank you for the article. What about TF Serving or TorchServe?
