MLOps in Equinor#

Who Are We? - AI Platforms#

The AI Foundational Platforms team is an interdisciplinary team of Data Scientists, MLOps Experts, Cloud Infrastructure Experts and Kubernetes Developers dedicated to serving MLOps needs.

Mission - AI Platforms#

  • Develop and maintain a cross-platform, flexible and evolvable MLOps Framework, Tools and Infrastructure
    serving the entire ML lifecycle.
  • Provide MLOps training and advisory support for implementing MLOps tools and best practices.
  • Manage the MLOps infrastructure stack and architecture, and provide infrastructure support to projects.

What is MLOps?#

MLOps Lifecycle

Machine Learning Operations (MLOps)
MLOps is an end-to-end machine learning development process to design, build, deploy and monitor reproducible, testable, and evolvable ML-powered solutions.

MLOps Capabilities#

Our wide spectrum of MLOps capabilities includes:

  • MLOps Infrastructure Automation
  • Container Image Storage & Version Control
  • Persistent Storage Volumes
  • Virtual Environment Automation
  • Jupyter Notebooks/Notebook Server for Experimentation
  • Pipeline Development / Pipeline Automation
  • Experiment Tracking
  • Model Tracking & Registry
  • Data Versioning
  • AutoML
  • Hyperparameter Tuning
  • Model Deployment / Serving
  • Model Monitoring
  • Feature Store (Coming Soon)
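To make "Experiment Tracking" and "Model Tracking & Registry" concrete, here is a minimal, purely illustrative sketch in plain Python of what such a capability records per run. This is not the platform's actual API (tools like MLflow or AzureML provide production versions of this); the class and field names are assumptions for illustration only.

```python
# Illustrative sketch only -- NOT the platform's API. Shows what experiment
# tracking captures: parameters, metrics, and a model reference per run.
import time
from dataclasses import dataclass, field


@dataclass
class Run:
    """One training run: its parameters, metrics, and a model reference."""
    params: dict
    metrics: dict = field(default_factory=dict)
    model_uri: str = ""
    started_at: float = field(default_factory=time.time)


class ExperimentTracker:
    """Keeps runs grouped per experiment so results stay reproducible and comparable."""

    def __init__(self):
        self._experiments: dict = {}

    def log_run(self, experiment: str, params: dict) -> Run:
        run = Run(params=params)
        self._experiments.setdefault(experiment, []).append(run)
        return run

    def best_run(self, experiment: str, metric: str) -> Run:
        # Assumes higher is better for the chosen metric.
        return max(self._experiments[experiment],
                   key=lambda r: r.metrics.get(metric, float("-inf")))


tracker = ExperimentTracker()
for lr in (0.1, 0.01):
    run = tracker.log_run("churn-model", {"lr": lr})
    run.metrics["accuracy"] = 0.80 if lr == 0.1 else 0.85
    run.model_uri = f"models:/churn-model/lr={lr}"

best = tracker.best_run("churn-model", "accuracy")
```

Because every run's parameters, metrics and model reference are logged together, comparing experiments and promoting the best model to a registry becomes a query rather than a manual hunt.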

MLOps Framework

Value Proposition - AI Platforms#

  • Managed MLOps Infrastructure & Support
  • Leverage the best capabilities from multiple platforms as per project needs
  • Easily configure project-specific permissions and role-based access control (RBAC)
  • Pre-installed Python environments and IDEs
  • Automated virtual environment and dependency setup
  • Reduced onboarding time, so projects can start experimenting sooner
  • Reduced learning curve for using MLOps tools and capabilities
  • Guidance on best practices
  • Seamless integration among various MLOps tools
  • Track code, data and models with proper lineage
  • Quickly deploy and scale models for testing or production
  • Efficiently monitor and compare deployed models
  • Equinor-compliant platform security
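The project-specific RBAC mentioned above can be pictured as a mapping from roles to allowed actions. The sketch below is a hedged illustration of the concept; the role names and permissions are invented for this example and do not reflect the platform's actual access model.

```python
# Illustrative RBAC sketch -- role names and permissions are examples only,
# not the platform's real access model.
ROLE_PERMISSIONS = {
    "viewer": {"read_experiments"},
    "data_scientist": {"read_experiments", "log_runs", "register_models"},
    "admin": {"read_experiments", "log_runs", "register_models", "deploy_models"},
}


def can(role: str, action: str) -> bool:
    """Check whether a project role is allowed to perform an action."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

Scoping such a mapping per project is what lets teams grant, say, deploy rights to a few admins while keeping experimentation open to every data scientist on the project.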

MLOps Services#

MLOps Services offered by AI Foundational Platforms can be divided into three main parts:

  • MLOps Infrastructure Stack: Standardized MLOps stack, configuration options and infrastructure support provided by AI Foundational Platforms
  • MLOps Project Support: Assisting projects in identifying, defining and implementing MLOps tools, best practices and solutions as per their project needs.
  • MLOps Training: Providing standardized MLOps training to Citizen Data Scientists, Data Scientists / ML Practitioners and Advanced Application Developers.

Supported Platforms#

We leverage, integrate and support a wide variety of MLOps platforms to serve a diverse range of ML solutions across Equinor. Some of the platforms we support today are Kubeflow, AzureML, Databricks and Dataiku.

MLOps Capability Matrix#

A side-by-side comparison of capabilities from multiple MLOps platforms across the ML lifecycle is maintained (and continually updated) here:

https://statoilsrm.sharepoint.com/:x:/r/sites/OMNIA.aurora/Shared%20Documents/General/4%20-%20Training/MLOps_Platform_Capability_Matrix.xlsx?d=w2a6e8fe01582437a9ff6ac63b4cbcd98&csf=1&web=1&e=Ld4U7z

Apply for Access#

Individuals wishing to apply for access to the MLOps stack should follow the instructions at this link:

https://docs.aurora.equinor.com/docs/get-started/access_management/

Projects and teams wishing to set up a project-specific MLOps stack should contact us via email:

  • Prerit Shah (pshah@equinor.com)
  • Alexis Canizares (acani@equinor.com)
  • Matthew Li (mattl@equinor.com)

Slack#

Please join the mlops-ai-platforms Slack channel to be part of the MLOps conversation and get technical support.

Principles#

We define, develop, maintain and serve MLOps infrastructure, training and support based on the following principles:

  • Capability focused (NOT Tool centric)
  • Think Big Picture (Entire ML Lifecycle)
  • Align with Project Needs (Not only user preferences)
  • Keep Open Architecture (Not get locked into a single tool or platform)
  • Multiple Options (Not to be dependent on a single tool, package or framework)

We are Capability Focused#

  • We are a capability-centric MLOps framework (NOT tool/platform centric).
  • We believe in leveraging the best capabilities from multiple platforms in a cross-platform and interoperable manner.
  • We support platforms such as Kubeflow, AzureML, Databricks and Dataiku, as well as hybrid custom MLOps tools/workflows.

We Consider Entire ML Lifecycle#

  • We help projects evaluate the requirements, needs and solutions considering the entire ML lifecycle.
  • Our MLOps framework includes multiple tools, packages and options for each stage of the ML lifecycle.
  • We provide MLOps capabilities and support for all aspects of ML experimentation, testing and deployments.

We Serve Project Needs#

  • We focus on aligning with project needs (not only user preferences).
  • We believe in delivering optimal outcomes that create business value from ML projects by applying MLOps.
  • We serve projects of all sizes and scopes, at every stage of the ML lifecycle.

We Believe in Open Architecture#

  • We believe in not limiting the MLOps framework or projects to a single MLOps tool or platform.
  • We believe in supporting the integration of new capabilities from various platforms.
  • We constantly strive to add cutting-edge functionality to our framework to support constantly evolving AI/ML use cases.

We Keep Multiple Options#

  • For each stage of the ML lifecycle, we support multiple tools and options.
  • We believe in maintaining a diverse set of capabilities to serve diverse projects, use cases and user preferences.
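The capability-focused, multi-option principle above can be sketched as a mapping from capabilities to the platform options that provide them. The assignments below are illustrative examples only, not the official capability matrix (which is linked earlier in this page).

```python
# Illustrative capability-to-option mapping -- example entries only, not the
# official MLOps Platform Capability Matrix.
CAPABILITY_OPTIONS = {
    "pipeline_automation": ["Kubeflow Pipelines", "AzureML Pipelines", "Databricks Workflows"],
    "experiment_tracking": ["Kubeflow", "AzureML", "Databricks"],
    "model_serving": ["Kubeflow", "AzureML"],
}


def options_for(capability: str) -> list:
    """Return every supported option for a capability, so projects choose per need."""
    return CAPABILITY_OPTIONS.get(capability, [])
```

Modeling the framework this way keeps it capability-first: a project asks "what do we need?" and picks among options, rather than bending its needs to a single tool.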

Here are some links to other sites and resources related to MLOps and Data Science at Equinor.

Equinor AI Foundational Platforms is inspired by and based on the following work:

MLOps Success Stories#

Drilling Operations#

Dip-Picking
Drilling-Log-Imputation

Exploration#

Coming Soon

Production#

Coming Soon

Predictive Maintenance#

Coming Soon

Renewables#

Coming Soon

Trading#

Coming Soon

Generative AI#

Stable Diffusion Models

Knowledge AI#

Kai-Multilingual-Models
Kai-Enablers
Kai-Mate-Notifications
Kai-CSSU-OPT

Timeseries AI#

Coming Soon