Datagran allows you to upload a Python scikit-learn or TensorFlow trained model.
To do so, go to your project and select Models.
Then upload your trained model into Datagran:
When a custom model operator runs, it calls two functions of the dumped model:
predict() — to generate predictions on incoming data
fit() — to retrain the model (if the conditions you defined are met)
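Conceptually, the operator's runtime behavior can be sketched as follows. This is only an illustration, not Datagran's actual implementation: TinyModel and retrain_condition_met are hypothetical names, and the standard-library pickle stands in for cloudpickle (both expose the same dump()/load() API).

```python
import io
import pickle

# Hypothetical sketch of the operator's two roles -- not Datagran's
# actual implementation. pickle stands in for cloudpickle here.

class TinyModel:
    """Stand-in for a trained model exposing fit() and predict()."""
    def fit(self, X, y=None):
        self.mean_ = sum(X) / len(X)
        return self

    def predict(self, X):
        return [x - self.mean_ for x in X]

# The "dumped model" the operator receives.
buf = io.BytesIO()
pickle.dump(TinyModel().fit([1, 2, 3]), buf)
buf.seek(0)
model = pickle.load(buf)

# Role 2: call fit() only when the conditions you defined are met.
retrain_condition_met = True
if retrain_condition_met:
    model.fit([4, 5, 6])

# Role 1: call predict() on incoming data.
print(model.predict([5]))  # → [0.0]
```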
As a quick start, the following code block shows how to create a dump of a model:
import cloudpickle

f = open('./model.cloudpickle', 'wb')
cloudpickle.dump(model, f)
f.close()
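Before uploading, it is worth sanity-checking that the dump round-trips: load the file back and confirm the restored model predicts identically. The sketch below is self-contained, so it uses a hypothetical ToyModel and the standard-library pickle in place of cloudpickle (the dump()/load() calls are identical).

```python
import os
import pickle
import tempfile

class ToyModel:
    """Hypothetical stand-in for a trained estimator with predict()."""
    def __init__(self, scale):
        self.scale = scale

    def predict(self, X):
        return [self.scale * x for x in X]

model = ToyModel(scale=2)

# Dump to a file exactly as in the snippet above, then load it back.
path = os.path.join(tempfile.mkdtemp(), 'model.cloudpickle')
with open(path, 'wb') as f:
    pickle.dump(model, f)
with open(path, 'rb') as f:
    restored = pickle.load(f)

# The restored model should behave exactly like the original.
assert restored.predict([1, 2, 3]) == model.predict([1, 2, 3])
print(restored.predict([1, 2, 3]))  # → [2, 4, 6]
```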
You can upload a model that contains complex sklearn pipelines, including your own custom transformers. The following code block shows how to create such a pipeline. Once the model is ready, you can use the code above to dump it. (The full example of this code can be found in this notebook: https://colab.research.google.com/drive/1--PKvlYN8c2EcaW-UJuhetG8r_0tjvMZ?usp=sharing)
from sklearn.base import BaseEstimator
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

class CustomTransformer(BaseEstimator):
    """
    This CustomTransformer only returns the incoming values as they are.
    The goal of this transformer is to demonstrate that custom transformers
    are supported when deployed to the Datagran platform with the
    cloudpickle library.
    """
    def fit(self, X, y=None):
        return self

    def transform(self, X, y=None):
        return X

numericTransformer = Pipeline(
    steps=[
        ("imputer", SimpleImputer(strategy="median")),
        ("scaler", StandardScaler()),
        ("CustomTransformer", CustomTransformer()),
    ]
)

categoricalTransformer = OneHotEncoder(handle_unknown="ignore")

preprocessor = ColumnTransformer(
    transformers=[
        ("num", numericTransformer, numericCols),
        ("cat", categoricalTransformer, categoricalCols),
    ]
)

model = Pipeline(
    steps=[("preprocessor", preprocessor), ("classifier", LogisticRegression())]
)

model.fit(X_train, y_train)
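The contract a custom transformer must satisfy can be shown without scikit-learn at all: fit() learns state from the data and returns self (so calls can chain), and transform() applies that state. The minimal sketch below uses a hypothetical StandardizeTransformer that centers values around their mean, purely to illustrate the protocol.

```python
class StandardizeTransformer:
    """Hypothetical transformer following the scikit-learn
    fit/transform protocol, with no sklearn dependency."""

    def fit(self, X, y=None):
        # Learn state from the training data and return self
        # so that fit(...).transform(...) can be chained.
        self.mean_ = sum(X) / len(X)
        return self

    def transform(self, X, y=None):
        # Apply the learned state to any incoming data.
        return [x - self.mean_ for x in X]

t = StandardizeTransformer()
print(t.fit([1, 2, 3]).transform([1, 2, 3]))  # → [-1.0, 0.0, 1.0]
```

Because cloudpickle serializes the class definition by value, a transformer like this survives the dump/load round trip even though it is not part of any installed package.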