How startups can accelerate their Machine Learning Journey with Automat-it’s MLOps Accelerator

by Nir Shney-dor and Oleksii Davydov

There are many ways to embark on your machine learning journey on the AWS Cloud. Infusing your application with machine learning opens exciting new horizons and can enrich the user experience. Many startups are eager to kickstart their journey and get a Minimum Viable Product (MVP) running. Yet they frequently find that building a Machine Learning Operations (MLOps) pipeline that maximizes the value of their existing data and integrates well with their developer tools can be challenging. In the race to meet market demands, startups must remain agile and work with solutions that can evolve to support their continued growth.

This blog will cover typical challenges startups face when developing an MLOps pipeline and how they can use Automat-it’s MLOps Accelerator to fast-track their AI/ML journey.

Challenges Startup Customers Face

In the dynamic landscape of startups, priorities shift frequently within an agile framework. Resource constraints are also common, so individuals often assume multiple roles. When it comes to machine learning, data pipelines are managed manually and take hours, if not days, to set up, while machine learning experiments, hyperparameters, and model performance are documented in Excel sheets. It is not uncommon to see EC2 instances left idle and incurring costs, or powerful GPUs that are not utilized to their full potential. While the availability of Large Language Models (LLMs) allows startups to integrate generative AI capabilities into their applications, customizations are often required to leverage the full capabilities of those models.

The MLOps Accelerator was designed to address the aforementioned challenges. As an end-to-end solution, the Accelerator takes care of every step of the MLOps pipeline and offers CI/CD capabilities. The CI/CD process, orchestrated by Amazon SageMaker Pipelines and GitHub Actions, is built on the following components (a simplified pipeline sketch follows the list):

  • An automated ETL pipeline preprocesses, cleanses, and prepares the data for training.
  • The training pipeline is launched automatically, and models are saved in the Feature Store.
  • ClearML integration allows you to get insights into running and past experiments.
  • The model is deployed to production with a click of a button after passing tests and an approval phase.
  • Optional model retraining when data drift is detected ensures your model continues to perform well on new, unseen data.
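
To make the orchestration concrete, here is a minimal, illustrative sketch of a SageMaker Pipeline with one preprocessing step and one training step. This is not the Accelerator’s actual pipeline definition; the script name `preprocess.py`, the S3 paths, the role ARN, and the instance types are placeholders you would replace with your own.

```python
# Minimal SageMaker Pipelines sketch: preprocess data, then train a model.
# All names, paths, and ARNs below are placeholders for illustration.
import sagemaker
from sagemaker.sklearn.processing import SKLearnProcessor
from sagemaker.processing import ProcessingOutput
from sagemaker.estimator import Estimator
from sagemaker.inputs import TrainingInput
from sagemaker.workflow.pipeline import Pipeline
from sagemaker.workflow.steps import ProcessingStep, TrainingStep

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # placeholder

# Step 1: ETL / data preparation
processor = SKLearnProcessor(
    framework_version="1.2-1",
    role=role,
    instance_type="ml.m5.xlarge",
    instance_count=1,
)
preprocess_step = ProcessingStep(
    name="PreprocessData",
    processor=processor,
    code="preprocess.py",  # your cleansing/feature-preparation script
    outputs=[ProcessingOutput(output_name="train", source="/opt/ml/processing/train")],
)

# Step 2: model training, consuming the processed data
estimator = Estimator(
    image_uri=sagemaker.image_uris.retrieve(
        "xgboost", session.boto_region_name, version="1.7-1"
    ),
    role=role,
    instance_type="ml.m5.xlarge",
    instance_count=1,
    output_path="s3://my-bucket/models/",  # placeholder
)
train_step = TrainingStep(
    name="TrainModel",
    estimator=estimator,
    inputs={
        "train": TrainingInput(
            preprocess_step.properties.ProcessingOutputConfig.Outputs["train"].S3Output.S3Uri
        )
    },
)

# Register (or update) and start the pipeline
pipeline = Pipeline(name="mlops-accelerator-demo", steps=[preprocess_step, train_step])
pipeline.upsert(role_arn=role)
pipeline.start()
```

In practice, additional steps such as evaluation, approval, and deployment would be appended to the same pipeline, and GitHub Actions would trigger it on each relevant commit.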

Automat-it’s team of AI/ML experts will also work with you to identify the right models for your use case(s), customize the pipeline to your needs, and provide a technical overview of your AWS account. We will help you secure and optimize the cost of your workloads, following the AWS Well-Architected Framework and leveraging Automat-it’s experience helping 600+ customers in EMEA.

Looking at the MLOps Maturity Model, we find that customers want to move fast on the operating model curve once an MVP has been established. Moving from the “initial” stage to the “repeatable” and “reliable” stages is a process that requires expertise and domain knowledge.

Figure 1: MLOps Maturity Model

Many startups face a significant hurdle in getting the Machine Learning (ML) process to work at scale. How do you automate the ML lifecycle and streamline the ML process so that you can reach the reliable stage of MLOps maturity in the least amount of time?

Automat-it’s MLOps Accelerator

Automat-it understands the unique needs of a startup organization, so the MLOps Accelerator is designed for maximum flexibility.

Figure 2: Automat-it MLOps Accelerator

Getting started with the MLOps Accelerator is easy. Get in touch with Automat-it using the AWS Marketplace, and you will get the following:

  • The tailored solution, deployed in your AWS account
  • 50 hours of hands-on work with an MLOps Engineer
  • Code adjustments and optimization for AWS

First, we will meet with you and run a comprehensive assessment session (1–2 hours). This session helps us understand your day-to-day operations, challenges, and specific requirements.

Next, our team of MLOps Engineers will customize the solution to your requirements. Whether it’s adapting the code to leverage Amazon SageMaker or making it more robust and automated – we are here for you. Every part of the solution can be customized or replaced to better fit your needs. The Accelerator uses the popular tools GitHub for source control and GitHub Actions to orchestrate the workflow. ClearML, an open-source experiment tracking tool, is integrated to help you visualize the results of your experiments.
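
As a flavor of what ClearML tracking looks like, here is a short, hedged sketch of logging an experiment from a training script. The project name, task name, hyperparameters, and metric values are placeholders, not part of the Accelerator itself.

```python
# Illustrative ClearML experiment-tracking sketch; names and values are placeholders.
from clearml import Task

# Register the run with ClearML; console output and framework metrics are captured automatically.
task = Task.init(project_name="mlops-accelerator-demo", task_name="xgboost-training")

# Log the hyperparameters used for this experiment
task.connect({"max_depth": 6, "eta": 0.3, "num_round": 100})

# ... training code would run here ...

# Report an evaluation metric so it appears in the ClearML experiment dashboard
logger = task.get_logger()
logger.report_scalar(title="validation", series="auc", value=0.91, iteration=100)
task.close()
```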

Subsequently, we will support you in seamlessly integrating the MLOps Accelerator into both your product and daily operations. This ensures that you have a functional solution and the capability to utilize the CI/CD pipeline tailored to your specific requirements.
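
For example, a CI/CD step (such as a GitHub Actions job) could kick off a pipeline run once tests pass with a small script like the following sketch. The pipeline name, region, and parameter names are placeholders for illustration only.

```python
# Hedged sketch: trigger a SageMaker pipeline execution from a CI/CD job.
# Pipeline name, region, and parameters below are placeholders.
import boto3

sm = boto3.client("sagemaker", region_name="eu-west-1")  # placeholder region

response = sm.start_pipeline_execution(
    PipelineName="mlops-accelerator-demo",            # placeholder pipeline name
    PipelineExecutionDisplayName="ci-cd-triggered-run",
    PipelineParameters=[
        {"Name": "InputDataS3Uri", "Value": "s3://my-bucket/data/latest/"},  # placeholder
    ],
)
print(response["PipelineExecutionArn"])
```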

Automat-it’s experienced team (with 200+ active AWS certifications) will continue to support you throughout your onboarding experience. Whether it’s a technical review, a FinOps review, or architecture office hours – we’ve got you covered.

The MLOps Accelerator serves various personas in your organization:

Persona | Role
Lead Data Scientist | Promote model to production; SageMaker admin access
Product Owner | Promote model to production
Data Scientist | Develop the ML solution using a local IDE or SageMaker Studio
ML Engineer | Source control access, update workflows
Data Engineer | Data pipeline access (including Feature Store)

Figure 3: A logical view of the MLOps Accelerator

Sample Customization

Here are some examples of how AWS data engineering tools can help you prepare your data to fit your specific needs:

  1. A Personal Identifiable Information (PII) scrubbing job can be configured in a couple of clicks.

    Figure 4: AWS Glue Studio
  2. This example shows a more complex scenario that retrieves data from S3, joins it with an RDS PostgreSQL table, saves the results to S3 for an ML job to ingest, and uses Redshift Serverless for analytics:

    Figure 5: Elaborate ETL pipeline using AWS Glue Studio
  3. In this example, we write arbitrarily complex Spark code directly in a Jupyter Notebook environment (a short PySpark sketch along these lines follows this list):
    Figure 6: Spark code run in a Jupyter Notebook

    You only need to upload your data to S3 and set up the necessary permissions for SageMaker jobs to access the S3 buckets for processing.
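
To illustrate the kind of Spark code example 3 refers to, here is a short, hedged PySpark sketch that reads raw data from S3, joins it with a reference dataset, and writes the prepared result back to S3 as Parquet. The bucket names, dataset layout, and column names are placeholders, not your actual schema.

```python
# Illustrative PySpark data-preparation sketch; paths and columns are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Read raw event data and a reference user table from S3
events = spark.read.csv("s3://my-raw-bucket/events/", header=True, inferSchema=True)
users = spark.read.parquet("s3://my-raw-bucket/users/")

# Join events to user attributes and keep only the columns the training job needs
features = (
    events.join(users, on="user_id", how="inner")
          .select("user_id", "event_type", "event_ts", "country", "plan")
)

# Write the prepared dataset back to S3 for the SageMaker job to ingest
features.write.mode("overwrite").parquet("s3://my-processed-bucket/features/")
```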

Summary

We’ve covered the common pitfalls when working with machine learning applications. By working with the MLOps Accelerator, you reduce your learning curve, avoid mistakes and surprises, and get specialized support from an AWS Premier Partner to help you get started and ship your product quickly and confidently.

We will work with you to set up your MLOps pipeline and continue supporting you through the packaged service offerings we provide to our customers.

Automat-it – AWS Partner Spotlight

Automat-it is an all-in AWS Premier Partner specializing in cloud migration, complex DevOps, and AI/ML projects. Automat-it works globally and serves hundreds of tech-savvy startup customers across a variety of use cases and technologies, from security through cloud migration and modernization to 24/7 NOC services.