I have this example code below:
from flask import Flask, request, jsonify
import json
import os
import pickle

app = Flask(__name__)

# filepath is on google cloud bucket storage
model = pickle.load(open('~filepath/ml_classifier.sav', 'rb'))

@app.route('/get_predictions')
def pred_probs():
    data = request.args.get('data')
    output = model.predict(data)
    return {'predictions': output}
The challenge I am facing is that the model file ml_classifier.sav is updated and saved to the same directory every day by a scheduler, but the model variable keeps using the version that was loaded at startup. I can re-read the model file inside the pred_probs() function on every request, but that slows down my inference time, which is not acceptable in my case.
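For context, the per-request reload I mentioned looks roughly like the sketch below (replacing the handler above, with the same placeholder path); it keeps the model fresh, but deserializing the pickle on every call adds too much latency.

@app.route('/get_predictions')
def pred_probs():
    # re-read the pickle on every request so updates are picked up,
    # but the deserialization cost dominates the response time
    with open('~filepath/ml_classifier.sav', 'rb') as f:
        fresh_model = pickle.load(f)
    data = request.args.get('data')
    output = fresh_model.predict(data)
    # .tolist() assumes the model returns a numpy array
    return jsonify({'predictions': output.tolist()})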
I want to know if there is a strategy or function I can use so that the model variable is refreshed every time the file at that path is updated.
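One idea I am considering is to check the file's modification time before each prediction and reload the pickle only when it has changed, roughly as sketched below (building on the imports above and replacing the module-level load and the handler; reload_model_if_stale is just an illustrative name, and I am assuming the bucket is mounted, e.g. via gcsfuse, so os.path.getmtime works on the path). Is something like this a reasonable pattern, or is there a cleaner way to keep the model variable in sync with the file?

MODEL_PATH = '~filepath/ml_classifier.sav'
model = None
model_mtime = None

def reload_model_if_stale():
    # compare the file's current modification time with the one recorded
    # at load time and re-read the pickle only when the scheduler has
    # written a new version of ml_classifier.sav
    global model, model_mtime
    current_mtime = os.path.getmtime(MODEL_PATH)
    if model is None or current_mtime != model_mtime:
        with open(MODEL_PATH, 'rb') as f:
            model = pickle.load(f)
        model_mtime = current_mtime

@app.route('/get_predictions')
def pred_probs():
    reload_model_if_stale()  # cheap os.stat call unless the file changed
    data = request.args.get('data')
    output = model.predict(data)
    # .tolist() assumes the model returns a numpy array
    return jsonify({'predictions': output.tolist()})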