KitOps

Tools for easing the handoff between AI/ML and App/SRE teams.


Standards-based packaging and versioning system for AI/ML projects.


Official Website

Use Cases

What is KitOps?

KitOps is a packaging, versioning, and sharing system for AI/ML projects. Because it uses open standards, it works with the AI/ML, development, and DevOps tools you are already using, and ModelKits can be stored in your enterprise container registry. It's the preferred solution for AI/ML platform engineering teams that need to securely package and version assets.

KitOps creates a ModelKit for your AI/ML project which includes everything you need to reproduce it locally or deploy it into production. You can even selectively unpack a ModelKit so different team members can save time and storage space by only grabbing what they need for a task. Because ModelKits are immutable, signable, and live in your existing container registry they're easy for organizations to track, control, and audit.

ModelKits simplify the handoffs between data scientists, application developers, and SREs working with LLMs and other AI/ML models. Teams and enterprises use KitOps as secure storage throughout the AI/ML project lifecycle.

Use KitOps to speed up and de-risk all types of AI/ML projects:

  • Predictive models
  • Large language models
  • Computer vision models
  • Multi-modal models
  • Audio models
  • ...and more

🇪🇺 EU AI Act Compliance 🔒

For our friends in the EU - ModelKits are the perfect way to create a library of model versions for EU AI Act compliance because they're tamper-proof, signable, and auditable.

😍 What's New? ✨

Features

  • 🎁 Unified packaging: A ModelKit package includes models, datasets, configurations, and code. Add as much or as little as your project needs.
  • 🏭 Versioning: Each ModelKit is tagged so everyone knows which dataset and model work together.
  • 🔒 Tamper-proofing: Each ModelKit package includes an SHA digest for itself, and every artifact it holds.
  • 🤩 Selective-unpacking: Unpack only what you need from a ModelKit with the kit unpack --filter command - just the model, just the dataset and code, or any other combination.
  • 🤖 Automation: Pack or unpack a ModelKit locally or as part of your CI/CD workflow for testing, integration, or deployment (e.g., GitHub Actions or Dagger).
  • 🐳 Deploy containers: Generate a basic or custom Docker container from any ModelKit.
  • 🚢 Kubernetes-ready: Generate a Kubernetes / KServe deployment config from any ModelKit.
  • 🪛 LLM fine-tuning: Use KitOps to fine-tune a large language model using LoRA.
  • 🎯 RAG pipelines: Create a RAG pipeline for tailoring an LLM with KitOps.
  • 📝 Artifact signing: ModelKits and their assets can be signed so you can be confident of their provenance.
  • 🌈 Standards-based: Store ModelKits in any OCI 1.1-compliant container or artifact registry.
  • 🥧 Simple syntax: Kitfiles are easy to write and read, using a familiar YAML syntax.
  • 🩰 Flexible: Reference base models using model parts, or store key-value pairs (or any YAML-compatible JSON data) in your Kitfile - use it to keep features, hyperparameters, links to MLOps tool experiments, or validation output.
  • 🏃‍♂️‍➡️ Run locally: Kit's Dev Mode lets you run an LLM locally, configure it, and prompt/chat with it instantly.
  • 🤗 Universal: ModelKits can be used with any AI, ML, or LLM project - even multi-modal models.
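A few of the features above can be sketched as CLI calls. This is an illustrative session only: the registry, repository, and tag names are hypothetical placeholders, and the exact `--filter` syntax may vary by Kit version (check `kit --help`):

```shell
# Package everything referenced by the Kitfile in the current directory
# and tag it (registry/repo/tag are placeholders).
kit pack . -t registry.example.com/my-org/my-model:v1

# Push the ModelKit to your OCI registry for tamper-proof, versioned storage.
kit push registry.example.com/my-org/my-model:v1

# Selective unpacking: grab only the model, skipping datasets and code.
kit unpack registry.example.com/my-org/my-model:v1 --filter=model
```

Because the ModelKit lives in a standard OCI registry, the same push/pull flow slots into existing CI/CD and registry tooling without changes.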

See KitOps in Action

There's a video of KitOps in action on the KitOps site.

🚀 Try KitOps in under 15 Minutes

  1. Install the CLI for your platform.
  2. Follow the Getting Started docs to learn to pack, unpack, and share a ModelKit.
  3. Test drive one of our ModelKit Quick Starts that includes everything you need to run your model including a codebase, dataset, documentation, and of course the model.
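Steps 1–3 above boil down to a short session like the following. The ModelKit reference here is a hypothetical placeholder; substitute one of the published Quick Starts:

```shell
# Authenticate against the registry hosting the ModelKit (if required).
kit login registry.example.com

# Pull a Quick Start ModelKit and unpack it into a local directory.
kit pull registry.example.com/quickstarts/llm-demo:latest
kit unpack registry.example.com/quickstarts/llm-demo:latest -d ./llm-demo
```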

For those who prefer to build from source, follow these steps to get the latest version from our repository.

What is in the box?

ModelKit: At the heart of KitOps is the ModelKit, an OCI-compliant packaging format for sharing all AI project artifacts: datasets, code, configurations, and models. By standardizing the way these components are packaged, versioned, and shared, ModelKits facilitate a more streamlined and collaborative development process that is compatible with any MLOps or DevOps tool.

Kitfile: A ModelKit is defined by a Kitfile - your AI/ML project's blueprint. It uses YAML to describe where to find each of the artifacts that will be packaged into the ModelKit. The Kitfile outlines what each part of the project is.
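As a rough illustration, a minimal Kitfile might look like the following. The paths, names, and manifest version shown are assumptions for this sketch; see the Kitfile reference docs for the authoritative schema:

```yaml
manifestVersion: "1.0"
package:
  name: my-classifier
  version: 1.0.0
  description: Example sentiment classifier
model:
  name: sentiment-model
  path: ./model.safetensors
  description: Serialized model weights
datasets:
  - name: training-data
    path: ./data/train.csv
code:
  - path: ./src
    description: Training and inference code
```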

Kit CLI: The Kit CLI not only enables users to create, manage, run, and deploy ModelKits; it also lets you pull only the pieces you need. Just need the serialized model for deployment? Use unpack --model. Only want the training datasets? Use unpack --datasets.
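For example (the ModelKit reference is a hypothetical placeholder; the flag names follow the text above):

```shell
# Unpack only the serialized model, e.g. for a deployment pipeline.
kit unpack registry.example.com/my-org/my-model:v1 --model

# Unpack only the datasets, e.g. for a data-validation job.
kit unpack registry.example.com/my-org/my-model:v1 --datasets
```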

Need Help?

Join KitOps community

For support, release updates, and general KitOps discussion, please join the KitOps Discord. Follow KitOps on X for daily updates.

If you need help, there are several ways to reach our community and maintainers, outlined in our support doc.

Reporting Issues and Suggesting Features

Your insights help KitOps evolve as an open standard for AI/ML. We deeply value the issues and feature requests we get from users in our community 💖. To contribute your thoughts, navigate to the Issues tab and hit the green New Issue button. Our templates guide you in providing the essential details we need to address your request effectively.

Joining the KitOps Contributors

We ❤️ our KitOps community and contributors. To learn more about the many ways you can contribute (you don't need to be a coder) and how to get started see our Contributor's Guide. Please read our Governance and our Code of Conduct before contributing.

A Community Built on Respect

At KitOps, inclusivity, empathy, and responsibility are at our core. Please read our Code of Conduct to understand the values guiding our community.

Roadmap

We share our roadmap openly so anyone in the community can provide feedback and ideas. Let us know what you'd like to see by pinging us on Discord or creating an issue.