
Changelog

Version history and release notes for musegpt.

View the raw changelog on GitHub.

The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.

0.1.0 - 2025-02-14

Added

  • Project skeleton with CMake 3.20+ build system and Google Test via FetchContent
  • Three-thread architecture spec: audio (real-time, lock-free), UI (non-blocking), background (owns I/O)
  • IPC model spec: ports and adapters pattern with typed commands and events
  • Inference backend spec: OpenAI-compatible API translation, streaming and structured response modes
  • Configuration model spec: four-source merge (defaults < file < env < UI) with TOML config file
  • Per-request metrics: latency, time-to-first-token, tokens/sec, token count
  • Deterministic inference support: temperature=0, seed parameter
  • Audio pipeline: whisper.cpp transcription chaining into LLM chat
  • 42 test files defining the full contract (all tests fail; no implementation yet)
  • Test coverage for 9 cloud inference providers: OpenAI, Anthropic, Gemini, Groq, OpenRouter, Together AI, Fireworks AI, Mistral AI, DeepSeek
  • Test coverage for local backends: Ollama, llama.cpp, vLLM, whisper.cpp
  • Anthropic Messages API adapter tests (non-OpenAI-compatible: /v1/messages endpoint, x-api-key header, content_block_delta streaming events)
  • Backend config matrix: parameterized tests for host, port, model, and protocol combinations
  • GitHub Actions CI with 5-compiler matrix: GCC, Clang, AppleClang, MSVC, ClangCL on Ubuntu/macOS/Windows
  • Documentation site at musegpt.org built with schemaflux
  • 18 site entities: 5 architecture docs + 13 backend provider pages
  • 3 taxonomy dimensions: category, component, tags
  • Built-in SEO: XML sitemap, robots.txt, JSON-LD structured data, Open Graph, RSS feeds, llms.txt
  • Client-side search with keyboard navigation
  • Mermaid sequence diagrams in architecture docs
  • AGENTS.md with agentic coding guidelines and implementation order
  • MIT license

Changed

  • Repositioned from local-only to universal inference adapter supporting both local and cloud backends
  • Replaced AGPL-3.0 license with MIT to maximize distribution