
NimbleBox Deploy

Your token-protected API endpoint.

Last Update: 17th September, 2022. Platform images are intentionally not added.

Deploy, or nbox.Serving, is a live API endpoint that takes in requests via HTTP/REST. It is a very general concept: you can not only deploy ML models into production but essentially run any stateful computation in your workflow.

Quick Start

Here's a quick example of how to make a Deploy using an Operator:

  • Start by making a new directory with mkdir moonshot && cd moonshot
  • And create a new file with touch baz.py, then add the code shown after this list
  • Run the code python3 baz.py
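A minimal sketch of what baz.py could look like, assuming the @operator decorator described above; the doubler function and the deploy() call are illustrative, so the exact nbox signatures may differ.

Python
# baz.py -- a minimal sketch; the @operator decorator is from these docs,
# while the deploy() call and its arguments are assumptions.
from nbox import operator

@operator()
def doubler(x: int = 1) -> dict:
    # any computation can be exposed as a token-protected endpoint
    return {"result": x * 2}

if __name__ == "__main__":
    # the guard ensures only one parent process triggers the deployment
    doubler.deploy()  # assumption: deployment is triggered by a deploy()-style call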

Adding the __name__ == "__main__" guard is critical to ensure there is only one parent process running.

ML Models

Deploying ML models to production is a breeze. Say you want to serve an sklearn model in production; here's what you do:

  1. Create a new folder, store the model.pkl pickle file in that folder
  2. Copy the code sketched below and put it in file.py. Note that instead of using the @operator decorator we are subclassing Operator. There are two functions you need to define: __remote_init__, where you initialise your model, and forward, where you invoke your model
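A minimal sketch of file.py, assuming model.pkl holds a pickled sklearn model; __remote_init__, forward and Operator.from_serving follow the description in these docs, while the SklearnServing class name, the deployment id and the deploy() call are illustrative assumptions.

Python
# file.py -- a minimal sketch; __remote_init__, forward and Operator.from_serving
# are from these docs, while the class name, deployment id and deploy() call are assumptions.
import pickle

from nbox import Operator

class SklearnServing(Operator):
    def __remote_init__(self):
        # runs once on the serving replica: load the pickled sklearn model
        with open("model.pkl", "rb") as f:
            self.model = pickle.load(f)

    def forward(self, x: list) -> dict:
        # invoked on every request
        return {"prediction": self.model.predict([x]).tolist()}

if __name__ == "__main__":
    SklearnServing().deploy()                   # assumption: deploy()-style call
    op = Operator.from_serving("depl_id_here")  # placeholder deployment id
    print(op(x=[1.0, 2.0, 3.0]))                # assumption: the serving operator is callable like forward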

The current deployment will be terminated when the new upload is complete. nbox will convert the model to a FastAPI endpoint, and you can call it in two ways:

  • HTTP/REST request at https://api.nimblebox.ai/{depl_id}/forward_rest
  • Operator RPC using Operator.from_serving(depl_id), which is what happens in the above code.
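For the first option, a raw HTTP call could look like the sketch below; only the URL pattern comes from the list above, while the token header name and the payload shape are assumptions.

Python
# Calling the endpoint over HTTP/REST; the header name and JSON payload shape
# are assumptions -- only the URL pattern is taken from these docs.
import requests

depl_id = "depl_id_here"  # placeholder
r = requests.post(
    f"https://api.nimblebox.ai/{depl_id}/forward_rest",
    headers={"NBX-DEPLOY-KEY": "YOUR_TOKEN"},  # assumption: token header name
    json={"x": [1.0, 2.0, 3.0]},
)
print(r.json())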

Advanced

If you think about it, any class in Python is used either as a namespace or because it carries some state that will be modified or used in some other part of your flow. Thus classes are hosted as Servings by default when deployed. Here's a quick example:

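Below is a minimal sketch of such a stateful class hosted as a Serving, reusing the Operator pattern from the previous section; the Counter class and the deploy() call are illustrative.

Python
# A stateful class hosted as a Serving; Counter and the deploy() call are
# illustrative -- the point is that state lives on the replica between requests.
from nbox import Operator

class Counter(Operator):
    def __remote_init__(self):
        self.count = 0  # state shared across requests on this replica

    def forward(self, step: int = 1) -> dict:
        # without any locking, concurrent requests can race on self.count
        self.count += step
        return {"count": self.count}

if __name__ == "__main__":
    Counter().deploy()  # assumption: deploy()-style call, as in the earlier sketches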

Users must ensure that their code does not lead to race conditions.

Live Endpoint Monitoring (WIP)

Simply deploying models is not enough; you need far more insight into what your model is doing. For this we have Live Endpoint Monitoring, which connects using nbox.lmao.LMAOAsgiMiddleware and logs all the request information. You can optionally configure it to detect drift in your endpoints.
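As a rough sketch, attaching the middleware to an ASGI app could look like the following; the import path comes from above, while the plain ASGI-style wrapping and any configuration arguments are assumptions.

Python
# Attaching the monitoring middleware to an ASGI app; only the import path is
# from these docs -- the plain ASGI wrapping and any extra kwargs are assumptions.
from fastapi import FastAPI
from nbox.lmao import LMAOAsgiMiddleware

app = FastAPI()

@app.post("/forward")
def forward(x: float) -> dict:
    return {"result": x * 2}

# assumption: standard ASGI middleware wrapping; real usage may take config kwargs
app = LMAOAsgiMiddleware(app)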

nbox SDK: nbox provides built-in access to all the APIs and packages them in the most user-friendly manner.