Seldon Core

Open-source platform for deploying machine learning models on Kubernetes

Any Toolkit

Allow data scientists to create models using any machine learning toolkit or programming language.
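As a sketch of what "any toolkit" looks like in practice: Seldon Core's Python wrapper convention expects a class exposing a predict method, which the seldon-core-microservice runner then serves. The class name and coefficients below are hypothetical placeholders, not a real trained model:

```python
# Minimal Seldon-style Python model wrapper (hypothetical model: a linear
# scorer with fixed coefficients; in practice you would load a trained
# sklearn/TensorFlow/R model in __init__).
class MyModel:
    def __init__(self):
        self.coef = [0.5, -0.25]  # placeholder "trained" weights

    def predict(self, X, feature_names=None):
        # Receives a batch of feature rows; returns one score per row.
        return [[sum(c * x for c, x in zip(self.coef, row))] for row in X]
```

Packaged into a container (for example with s2i), a class like this can be served over REST or gRPC by Seldon Core without further serving code.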

Runtime inference graphs

Allow complex runtime inference graphs to be deployed as microservices. Graphs are composed of models, routers, combiners, and transformers.

Easy Integration

Expose machine learning models via REST and gRPC for easy integration into business apps.
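For illustration, a deployed model can be called over Seldon's REST protocol by POSTing a JSON payload to its predictions endpoint. The URL and feature values below are hypothetical; this is a minimal sketch assuming the standard Seldon payload format:

```python
import json

# Hypothetical endpoint; Seldon Core exposes each deployment over REST at
# a path ending in /api/v1.0/predictions (and equivalently over gRPC).
SELDON_URL = "http://localhost:8003/seldon/default/my-model/api/v1.0/predictions"

def build_prediction_request(features):
    """Wrap one feature row in Seldon's REST prediction payload format."""
    return {"data": {"ndarray": [features]}}

payload = build_prediction_request([5.1, 3.5, 1.4, 0.2])
body = json.dumps(payload)

# Against a live deployment (requires the `requests` package and a cluster):
#   import requests
#   response = requests.post(SELDON_URL, json=payload)
#   print(response.json()["data"]["ndarray"])
print(body)
```

The same payload shape works from any HTTP client, which is what makes integration into business apps straightforward.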

Full lifecycle management

Handle full lifecycle management of the deployed model: updating, scaling, monitoring, and security.

Toolkits currently supported by Seldon Core


TensorFlow is an open source software library for high performance numerical computation.


Scikit-learn is a popular machine learning toolkit for Python, offering simple and efficient tools for data mining and data analysis.


MLlib is Apache Spark's scalable machine learning library. MLlib fits into Spark's APIs and interoperates with NumPy in Python and R libraries.


R is a language and environment for statistical computing and graphics.


H2O is a fully open source, distributed in-memory machine learning platform with linear scalability.


Java is a general-purpose programming language used by popular frameworks like DL4J.

Seldon Core Stack

Seldon Core, our open-source framework, makes it easier and faster to deploy your machine learning models and experiments at scale on Kubernetes. Serve models built in any open-source or commercial model-building framework, leverage powerful Kubernetes features like Custom Resource Definitions to manage model graphs, and connect your continuous integration and deployment (CI/CD) tools to scale and update your deployment.

Platforms Integrated with Seldon


The Kubeflow project is dedicated to making deployments of machine learning (ML) workflows on Kubernetes simple, portable, and scalable. Anywhere you are running Kubernetes, you should be able to run Kubeflow. Kubeflow is integrated with Seldon Core to deploy machine learning models on Kubernetes. Read the docs and explore the end-to-end machine learning demo project to learn how Seldon integrates with Kubeflow.


Seldon has integrated with IBM’s Fabric for Deep Learning (FfDL). Leveraging the power of Kubernetes, FfDL provides a scalable, resilient, and fault-tolerant deep-learning framework. The platform uses a distribution and orchestration layer that facilitates learning from a large amount of data in a reasonable amount of time across compute nodes. Read the IBM developerWorks blog and run the FfDL demo project to see how Seldon integrates with IBM FfDL.

Red Hat: OpenShift

OpenShift combines application lifecycle management, including image builds, continuous integration, deployments, and updates, with Kubernetes. Get OpenShift as a managed service, in the cloud, or in your own datacenter, and migrate applications seamlessly no matter where you run. Read more on Seldon's integration with OpenShift.

Runtime Inference Graphs

Create and manage complex deployments that include multiple models, experiments, ensembles, and transformers. Describe your deployments in YAML for better versioning and reproducibility, and deploy using Kubernetes Custom Resource Definitions (CRDs) and kubectl, with no custom CLI required.

The example below contains:

  • A multi-armed bandit to test multiple models.
  • Outlier detection to spot anomalous predictions.
  • Explanation to highlight why a model has made a decision.
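A deployment like the one described might be sketched as a SeldonDeployment manifest with a router over two candidate models. The resource name, container images, and the epsilon-greedy router are illustrative assumptions, not the exact components of the example:

```yaml
apiVersion: machinelearning.seldon.io/v1
kind: SeldonDeployment
metadata:
  name: mab-example            # hypothetical name
spec:
  predictors:
  - name: default
    replicas: 1
    componentSpecs:
    - spec:
        containers:            # hypothetical images
        - name: model-a
          image: my-org/model-a:0.1
        - name: model-b
          image: my-org/model-b:0.1
        - name: eg-router
          image: my-org/epsilon-greedy-router:0.1
    graph:                     # the runtime inference graph
      name: eg-router
      type: ROUTER             # multi-armed bandit routes traffic to a child
      children:
      - name: model-a
        type: MODEL
      - name: model-b
        type: MODEL
```

Applied with `kubectl apply -f mab-example.yaml`, the graph is then versioned and managed like any other Kubernetes resource.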

Google Cloud Platform Marketplace

Seldon has introduced a commercial Kubernetes application, available to all users of the Google Cloud Platform Marketplace. It provides customers with Seldon Core, making it easier and faster to deploy machine learning models and experiments at scale. Commercial Kubernetes applications can be deployed on-premise or even on other public clouds through the GCP Marketplace. Read our blog post here.

Amazon Web Services Marketplace

AWS customers can now easily deploy machine learning models and experiments at scale running on Amazon Web Services (AWS) with Seldon Core. With the addition of Seldon Core, AWS Marketplace has extended its existing benefits and features to container products. Seldon Core uses containers and microservices to serve models built in any open-source or commercial model building framework. Read more here.

Let’s talk

Let’s start a conversation about how the next generation of DevOps for machine learning can transform the future of your business.