KitOps

Tools for easing the handoff between AI/ML and App/SRE teams.

Standards-based packaging and versioning system for AI/ML projects.


Official Website Use Cases

What is KitOps?

KitOps is a packaging and versioning system for AI/ML projects that uses open standards so it works with the AI/ML, development, and DevOps tools you are already using, and can be stored in your enterprise registry. It's tamper-proof, signable, and auditable.

KitOps makes it easy for organizations to track, control, and audit access and changes to their AI project artifacts. It simplifies the handoffs between data scientists, application developers, and SREs working with LLMs and other AI/ML models. KitOps' ModelKits are OCI-compliant packages for models, their dependencies, configurations, codebases, features, hyperparameters, and any other documentation. ModelKits are portable, reproducible, and work with the tools you already use.

Teams and enterprises use KitOps as a gate between development and production deployment. This ensures that projects are complete, versioned, and immutable before they are deployed and makes rollbacks and other production operations simpler and safer.

Use KitOps to speed up and de-risk all types of AI projects, from small analysis models to large language models, including fine-tuning and RAG.

🎉 New

Get the most out of KitOps' ModelKits by using them with the Jozu Hub repository (don't worry - ModelKits are still compatible with any OCI registry).

Features

  • 🎁 Unified packaging: A ModelKit package includes models, datasets, configurations, and code. Add as much or as little as your project needs.
  • 🏭 Versioning: Each ModelKit is tagged so everyone knows which dataset and model work together.
  • 🤖 Automation: Pack or unpack a ModelKit locally or as part of your CI/CD workflow for testing, integration, or deployment.
  • 🪛 LLM fine-tuning: Use KitOps to fine-tune a large language model using LoRA.
  • 🎯 RAG pipelines: Create a RAG pipeline for tailoring an LLM with KitOps.
  • 🔒 Tamper-proofing: Each ModelKit package includes a SHA digest for itself and for every artifact it holds.
  • 📝 Artifact signing: ModelKits and their assets can be signed so you can be confident of their provenance.
  • 🌈 Standards-based: Store ModelKits in any OCI 1.1-compliant container or artifact registry.
  • 🥧 Simple syntax: Kitfiles are easy to write and read, using a familiar YAML syntax.
  • 🏃‍♂️‍➡️ Run locally: Kit's Dev Mode lets you run an LLM locally, configure it, and prompt/chat with it instantly.
  • 🐳 Deploy containers: Generate a Docker container as part of your kit unpack (coming soon).
  • 🚢 Kubernetes-ready: Generate a Kubernetes / KServe deployment config as part of your kit unpack (coming soon).
  • 🩰 Flexible: Store key-value pairs, or any YAML-compatible JSON data, in your Kitfile. Use it to keep features, hyperparameters, links to MLOps tool experiments, or validation output...whatever you want!
  • 🤗 Universal: ModelKits can be used with any AI, ML, or LLM project - even multi-modal models.

See KitOps in Action

https://github.com/jozu-ai/kitops/assets/4766570/05ae1362-afd3-4e78-bfce-e982c17f8df2

What is in the box?

ModelKit: At the heart of KitOps is the ModelKit, an OCI-compliant packaging format for sharing all AI project artifacts: datasets, code, configurations, and models. By standardizing the way these components are packaged, versioned, and shared, ModelKits facilitate a more streamlined and collaborative development process that is compatible with any MLOps or DevOps tool.

Kitfile: A ModelKit is defined by a Kitfile - your AI/ML project's blueprint. It uses YAML to describe where to find each of the artifacts that will be packaged into the ModelKit. Reading the Kitfile gives you a quick understanding of what's involved in each AI project.
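As a sketch, a minimal Kitfile might look something like the following. The paths and names are placeholders, and the fields shown here are illustrative; check the Kitfile reference documentation for the exact schema your Kit version supports:

```yaml
manifestVersion: "1.0"

package:
  name: my-model            # placeholder project name
  version: 1.0.0
  authors: ["Jane Doe"]     # placeholder author

model:
  path: ./models/model.safetensors   # serialized model to package
  description: Example fine-tuned classifier

datasets:
  - name: training
    path: ./data/train.csv           # training data to version alongside the model

code:
  - path: ./src                      # supporting code for the project
```

Because the Kitfile is plain YAML, it can live in your repository next to your code and be reviewed and versioned like any other source file.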

Kit CLI: The Kit CLI not only enables users to create, manage, run, and deploy ModelKits, it also lets you pull only the pieces you need. Just need the serialized model for deployment? Use unpack --model. Only want the training datasets? Use unpack --datasets.
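A typical flow with the CLI might look like this sketch (the registry and repository names are placeholders; consult the Kit CLI reference for the full set of flags):

```shell
# Pack the project described by the Kitfile in the current directory
kit pack . -t registry.example.com/my-org/my-model:v1

# Push the ModelKit to your OCI registry
kit push registry.example.com/my-org/my-model:v1

# Later, pull only the pieces you need
kit unpack registry.example.com/my-org/my-model:v1 --model     # just the model
kit unpack registry.example.com/my-org/my-model:v1 --datasets  # just the datasets
```

Pulling only the artifacts a consumer needs keeps deployments small: an SRE deploying the model never has to download the training data.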

🚀 Try Kit in under 15 Minutes

  1. Install the CLI for your platform.
  2. Follow the Quick Start to learn to pack, unpack, and share a ModelKit.

For those who prefer to build from source, follow these steps to get the latest version from our repository.

✨ What's New? 😍

We've been busy and shipping quickly!

  • 📝 Add any arbitrary metadata to your Kitfile - hyperparameters, test results, links to resources, etc...
  • 🔼 Automatic chunking of uploads to registries
  • 🪵 New log-levels and per-request trace logging
  • 🏎️ Reduced ModelKit packing time by >97%
  • 🎯 Use KitOps to do LLM fine-tuning or package a RAG pipeline
  • 🤓 Read Kitfile from stdin
  • 👩‍💻 Use Kit Dev mode to run an LLM instantly (no GPUs or internet required)

You can see all the gory details in our release changelogs.

Your Voice Matters

Need Help?

If you need help, there are several ways to reach our community and maintainers, as outlined in our support doc.

Reporting Issues and Suggesting Features

Your insights help Kit evolve as an open standard for AI/ML. We deeply value the issues and feature requests we get from users in our community 💖. To contribute your thoughts, navigate to the Issues tab and hit the green New Issue button. Our templates guide you in providing the essential details needed to address your request effectively.

Joining the KitOps Contributors

We ❤️ our Kit community and contributors. To learn more about the many ways you can contribute (you don't need to be a coder) and how to get started see our Contributor's Guide. Please read our Governance and our Code of Conduct before contributing.

A Community Built on Respect

At KitOps, inclusivity, empathy, and responsibility are at our core. Please read our Code of Conduct to understand the values guiding our community.

Join KitOps community

For support, release updates, and general KitOps discussion, please join the KitOps Discord. Follow KitOps on X for daily updates.

Roadmap

We share our roadmap openly so anyone in the community can provide feedback and ideas. Let us know what you'd like to see by pinging us on Discord or creating an issue.
