How to use the mlpm.app.aidserver.route function in mlpm

To help you get started, we’ve selected a few mlpm examples based on popular ways aidserver.route is used in public projects.

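All of the examples follow the same pattern: aidserver is a shared application object exposed by mlpm.app, and its route() decorator registers an async view function for a URL path, Flask/Quart style. Here is a minimal sketch assuming that convention; the /ping endpoint, its response body, and the import path are illustrative assumptions rather than code from mlpm.

from mlpm.app import aidserver  # shared app object; assumed importable this way

# Hypothetical health-check route, registered the same way as the handlers below.
@aidserver.route("/ping", methods=["GET"])
async def ping():
    return "pong"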

From autoai-org/AID, components/mlserve/aid/server.py (view on GitHub):
@aidserver.route("/train", methods=["GET", "POST"])
async def train(request):
    if request.method == "POST":
        return handle_post_solver_train_or_infer(request, UPLOAD_TRAIN_FOLDER,
                                          "train")
From autoai-org/AID, components/mlserve/aid/server.py (view on GitHub):
@aidserver.route("/infer", methods=["GET", "POST"])
async def infer(request):
    if request.method == 'POST':
        return handle_post_solver_train_or_infer(request, UPLOAD_INFER_FOLDER,
                                          "infer")
From autoai-org/AID, components/mlserve/mlpm/server.py (view on GitHub):
@aidserver.route("/infer", methods=["GET", "POST"])
async def infer():
    if request.method == 'POST':
        return await handle_post_solver_train_or_infer(request, UPLOAD_INFER_FOLDER,
                                                 "infer", PUBLIC_FOLDER)
From autoai-org/AID, components/mlserve/mlpm/server.py (view on GitHub):
@aidserver.route("/train", methods=["GET", "POST"])
async def train():
    if request.method == "POST":
        return await handle_post_solver_train_or_infer(request, UPLOAD_TRAIN_FOLDER,
                                                 "train", PUBLIC_FOLDER)