OpenFoundationModels


100% Apple Foundation Models β SDK Compatible Implementation

OpenFoundationModels is a complete open-source implementation of Apple's Foundation Models framework (iOS 26.0+/macOS 26.0+), providing 100% API compatibility while enabling integration with any LLM provider.

Why OpenFoundationModels?

Apple Foundation Models Limitations

Apple Foundation Models is an excellent framework, but has significant limitations:

  • Apple Intelligence Required: Only available on Apple Intelligence-enabled devices
  • Apple Platform Exclusive: Works only on iOS 26+, macOS 26+
  • Provider Locked: Only Apple-provided models supported
  • On-Device Only: No integration with external LLM services

OpenFoundationModels Value

OpenFoundationModels solves these limitations as an Apple-compatible alternative implementation:

// Apple Foundation Models (Apple ecosystem only)
#if canImport(FoundationModels)
import FoundationModels
let model = SystemLanguageModel.default
#else
// OpenFoundationModels with any provider
import OpenFoundationModels
import OpenFoundationModelsOpenAI // or MLX, Anthropic, etc.
let model = OpenAILanguageModel(apiKey: "your-key")
#endif
// 🎯 100% API Compatible - Same code works with any backend
let session = LanguageModelSession(model: model) {
    Instructions("You are a helpful assistant")
}

  • ✅ Apple Official API Compliant: Same API as Apple Foundation Models
  • ✅ Provider Freedom: OpenAI, Anthropic, MLX, Ollama, and more
  • ✅ Seamless Migration: Switch between Apple's model and third-party providers
  • ✅ Enterprise Ready: Integrates with existing infrastructure

Quick Start

Get started with OpenFoundationModels in minutes:

Try Sample Applications

# Clone and run sample chat applications
git clone https://github.com/1amageek/OpenFoundationModels-Samples.git
cd OpenFoundationModels-Samples
# Option 1: On-device chat (no setup required)
swift run foundation-chat
# Option 2: OpenAI-powered chat
export OPENAI_API_KEY="your_api_key_here"
swift run openai-chat

Basic Usage

import OpenFoundationModels
import OpenFoundationModelsOpenAI // Choose your provider
// Create a model from any provider
let model = OpenAILanguageModel(apiKey: "your-key")
// Apple-compatible API works with any backend
let session = LanguageModelSession(model: model) {
    Instructions("You are a helpful assistant")
}
let response = try await session.respond {
    Prompt("Hello, OpenFoundationModels!")
}
print(response.content)

With OpenAI

import OpenFoundationModels
import OpenFoundationModelsOpenAI
let provider = OpenAIProvider(apiKey: "your_key")
let session = LanguageModelSession(model: provider.gpt4o)
let response = try await session.respond {
    Prompt("Explain Swift concurrency")
}

Key Features

  • ✅ 100% Apple API Compatible - Same API as Apple Foundation Models
  • ✅ Transcript-Centric Design - Apple's official conversation management system
  • ✅ Provider Freedom - OpenAI, Anthropic, MLX, Ollama, or any LLM provider
  • ✅ Structured Generation - Type-safe data with @Generable macro
  • ✅ Real-time Streaming - Responsive UI with partial updates
  • ✅ Tool Calling - Let LLMs execute your functions
  • ✅ Production Ready - 328 tests passing, memory efficient, thread-safe

Installation

Swift Package Manager

// Package.swift
dependencies: [
    // Core framework
    .package(url: "https://github.com/1amageek/OpenFoundationModels.git", from: "1.0.0"),
    // Choose your provider(s):
    .package(url: "https://github.com/1amageek/OpenFoundationModels-OpenAI.git", from: "1.0.0"), // OpenAI
    .package(url: "https://github.com/1amageek/OpenFoundationModels-MLX.git", from: "1.0.0"), // Local MLX
]


Usage

1. Basic Text Generation

import OpenFoundationModels
import OpenFoundationModelsOpenAI // Or MLX, Anthropic, etc.
// Create model from your chosen provider
let model = OpenAILanguageModel(apiKey: "your-key")
// Check model availability
guard model.isAvailable else {
    print("Model not available")
    return
}
// Create session with instructions (Apple Official API)
// Instructions and tools are stored in Transcript as the first entry
let session = LanguageModelSession(model: model) {
    Instructions("You are a helpful assistant")
}
// Apple Official closure-based prompt
// Each prompt is added to the Transcript, maintaining full conversation history
let response = try await session.respond {
    Prompt("Tell me about Swift 6.2 new features")
}
print(response.content)
// Access the complete conversation history via Transcript
print("Total conversation entries: \(session.transcript.count)")

2. Type-Safe Structured Generation

// Define your data structure with validation rules
@Generable
struct ProductReview {
    @Guide(description: "Product name", .pattern("^[A-Za-z0-9\\s]+$"))
    let productName: String

    @Guide(description: "Rating from 1 to 5", .range(1...5))
    let rating: Int

    @Guide(description: "Review between 50-500 chars", .count(50...500))
    let comment: String

    @Guide(description: "Recommendation level", .anyOf(["Highly Recommend", "Recommend", "Neutral", "Not Recommend"]))
    let recommendation: String
}
// LLM generates validated, type-safe data
let response = try await session.respond(
    generating: ProductReview.self
) {
    Prompt("Generate a review for iPhone 15 Pro")
}
// Direct property access - no JSON parsing needed!
print("Product: \(response.content.productName)")
print("Rating: \(response.content.rating)/5")
print("Comment: \(response.content.comment)")

3. Streaming Responses

// Apple Official streaming API
let stream = session.streamResponse {
    Prompt("Explain the history of Swift programming language in detail")
}
for try await snapshot in stream {
    print(snapshot.content, terminator: "")
}
// The stream completes when generation is done
let finalResponse = try await stream.collect()
print("\n--- Generation Complete ---")

4. Stream Complex Data Structures

@Generable
struct BlogPost {
    let title: String
    let content: String
    let tags: [String]
}
let stream = session.streamResponse(
    generating: BlogPost.self
) {
    Prompt("Write a blog post about Swift Concurrency")
}
for try await partial in stream {
    // Partially generated values expose properties as they are produced
    if let title = partial.content.title {
        print("Title: \(title)")
    }
    if let body = partial.content.content {
        print("Progress: \(body.count) characters")
    }
    // Stream continues until completion
}
// Collect the final response when the stream ends
let finalArticle = try await stream.collect()
print("Article generation complete!")
print("Final article: \(finalArticle.content)")

5. Function Calling (Tools)

// Define a tool that LLMs can call
struct WeatherTool: Tool {
    static let name = "get_weather"
    static let description = "Get current weather for a city"

    // Type-safe arguments
    @Generable
    struct Arguments {
        @Guide(description: "City name")
        let city: String
    }

    func call(arguments: Arguments) async throws -> ToolOutput {
        // Your actual API call here
        let weather = try await weatherAPI.fetch(city: arguments.city)
        return ToolOutput("Weather in \(arguments.city): \(weather)°C")
    }
}
// LLM decides when to call tools
let session = LanguageModelSession(
    model: model, // Any LanguageModel provider
    tools: [WeatherTool()]
) {
    Instructions("You are a helpful assistant that can check weather")
}
let response = try await session.respond {
    Prompt("What's the weather in Tokyo and Paris?")
}
// LLM calls WeatherTool twice and combines results
// Output: "Tokyo is 22°C and sunny, while Paris is 15°C with clouds."
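Under the hood, the session is responsible for routing each model-requested tool call to the matching registered tool. The sketch below illustrates that dispatch step with simplified, self-contained stand-in types (these are illustrative only, not the framework's actual `Tool` or `Transcript` types):

```swift
// Simplified stand-ins for illustration only (not the framework's types)
struct ToolCallRequest {
    let name: String
    let argumentsJSON: String
}

struct RegisteredTool {
    let name: String
    let run: (String) -> String // argumentsJSON -> tool output text
}

// Dispatch each requested call to the tool registered under that name
func dispatch(_ calls: [ToolCallRequest], tools: [RegisteredTool]) -> [String] {
    let byName = Dictionary(uniqueKeysWithValues: tools.map { ($0.name, $0) })
    return calls.compactMap { call in
        // Unknown tool names are dropped here; a real session would surface an error
        byName[call.name]?.run(call.argumentsJSON)
    }
}

let weather = RegisteredTool(name: "get_weather") { args in
    "get_weather called with \(args)"
}
let outputs = dispatch(
    [ToolCallRequest(name: "get_weather", argumentsJSON: #"{"city":"Tokyo"}"#),
     ToolCallRequest(name: "get_weather", argumentsJSON: #"{"city":"Paris"}"#)],
    tools: [weather]
)
print(outputs.count) // prints "2"
```

In the real framework the tool outputs are appended to the Transcript as `toolOutput` entries and fed back to the model, which then composes the final answer.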

6. Generation Control & Response Format

// Fine-tune generation behavior
let options = GenerationOptions(
    sampling: .greedy, // Deterministic output
    // sampling: .random(top: 50, seed: 42), // Top-K sampling
    // sampling: .random(probabilityThreshold: 0.9, seed: nil), // Top-P sampling
    temperature: 0.7, // Creativity level (0.0-1.0)
    maximumResponseTokens: 500 // Length limit
)
// Apply custom instructions
let session = LanguageModelSession(model: model) {
    Instructions("""
    You are a Swift expert.
    Use modern Swift 6.2+ features.
    Include error handling in all examples.
    """)
}
let response = try await session.respond(options: options) {
    Prompt("Write a networking function")
}
// Response Format for structured output
// ResponseFormat is automatically set when using respond(generating:)
@Generable
struct CodeReview {
    let summary: String
    let issues: [Issue]
    let rating: Int
}
// When using respond(generating:), ResponseFormat is automatically included
let review = try await session.respond(generating: CodeReview.self) {
    Prompt("Review this Swift code: \(code)")
}
// Internally sets: responseFormat: Transcript.ResponseFormat(type: CodeReview.self)
// Or explicitly with schema
let schema = GenerationSchema(
    type: CodeReview.self,
    description: "Code review structure",
    properties: [...]
)
let schemaReview = try await session.respond(to: prompt, schema: schema)
// Internally sets: responseFormat: Transcript.ResponseFormat(schema: schema)

7. Use Any LLM Provider

import OpenFoundationModels
// Same API, different providers - all using Transcript-based interface
#if canImport(FoundationModels)
import FoundationModels
let model = SystemLanguageModel.default // Apple (on-device)
#else
import OpenFoundationModelsOpenAI
let model = OpenAILanguageModel(apiKey: key) // OpenAI
// import OpenFoundationModelsMLX
// let model = MLXLanguageModel() // MLX (local)
#endif
let session = LanguageModelSession(model: model)
// Write once, run with any provider
// Each provider receives the full Transcript and interprets it appropriately
let response = try await session.respond {
    Prompt("Explain quantum computing")
}
// All advanced features work with all providers
@Generable
struct Analysis {
    let summary: String
    let keyPoints: [String]
    let confidence: Double
}
let analysis = try await session.respond(
    generating: Analysis.self
) {
    Prompt("Analyze this code: \(codeSnippet)")
}
// Providers implement the simple LanguageModel protocol
// They receive Transcript and return responses - implementation details are provider-specific

Real-World Use Cases

🤖 AI Chatbots

// Build chatbots that work with any provider
let chatbot = LanguageModelSession(model: model) {
    Instructions("You are a helpful chatbot assistant")
}
let response = try await chatbot.respond {
    Prompt(userMessage)
}

📊 Data Extraction

// Extract structured data from unstructured text
@Generable
struct Invoice {
    let invoiceNumber: String
    let totalAmount: Double
    let items: [LineItem]
}
let invoice = try await session.respond(generating: Invoice.self) {
    Prompt("Extract invoice data from: \(pdfText)")
}

🔍 Content Analysis

// Analyze and categorize content
@Generable
struct ContentAnalysis {
    @Guide(description: "Sentiment", .anyOf(["positive", "neutral", "negative"]))
    let sentiment: String
    let topics: [String]
    let summary: String
}

🛠️ Code Generation

// Generate code with validation
@Generable
struct SwiftFunction {
    @Guide(description: "Valid Swift function signature")
    let signature: String
    let implementation: String
    let tests: [String]
}

Architecture

Transcript-Centric Design

OpenFoundationModels follows Apple's Foundation Models design philosophy where Transcript is the single source of truth for all conversation context:

// Transcript manages the complete conversation
public struct Transcript {
    public enum Entry {
        case instructions(Instructions) // System instructions & tool definitions
        case prompt(Prompt) // User input with optional ResponseFormat
        case response(Response) // Model output
        case toolCalls(ToolCalls) // Tool invocations
        case toolOutput(ToolOutput) // Tool results
    }

    // Prompt includes ResponseFormat for structured output
    public struct Prompt {
        var segments: [Segment]
        var options: GenerationOptions
        var responseFormat: ResponseFormat? // Optional structured output format
    }

    // ResponseFormat defines expected output structure
    public struct ResponseFormat {
        public init(schema: GenerationSchema) // From explicit schema
        public init<Content: Generable>(type: Content.Type) // From Generable type
        public var name: String { get }
    }
}
// LanguageModelSession manages Transcript
let session = LanguageModelSession(model: model, tools: tools) {
    Instructions("You are a helpful assistant")
    // Automatically added as first Transcript.Entry
}
// Each interaction updates the Transcript
let response = try await session.respond {
    Prompt("Hello") // Added to Transcript before sending to model
}
// Response is added to Transcript after generation
// Structured output with ResponseFormat
let structured = try await session.respond(generating: ProductReview.self) {
    Prompt("Review the iPhone 15")
    // ResponseFormat(type: ProductReview.self) automatically added to Prompt
}
// LanguageModel receives complete context
protocol LanguageModel {
    // Receives full Transcript with all history, instructions, tools, and ResponseFormat
    func generate(transcript: Transcript, options: GenerationOptions?) async throws -> Transcript.Entry
    func stream(transcript: Transcript, options: GenerationOptions?) -> AsyncThrowingStream<Transcript.Entry, Error>
}

This design ensures:

  • Stateless Models: LanguageModel implementations don't manage state
  • Complete Context: Every request includes full conversation history
  • Clear Responsibilities: Session manages Transcript, Model generates responses
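This split of responsibilities can be illustrated with a small, self-contained sketch. The types below are simplified local stand-ins for Transcript and LanguageModel, purely for illustration (not the framework's actual API):

```swift
// Simplified stand-ins (not the framework's actual Transcript/LanguageModel)
struct MiniTranscript {
    enum Entry {
        case prompt(String)
        case response(String)
    }
    var entries: [Entry] = []
}

protocol MiniLanguageModel {
    // Stateless contract: the full transcript arrives with every call
    func generate(transcript: MiniTranscript) -> MiniTranscript.Entry
}

// The model keeps no conversation state; output derives only from the input
struct EchoModel: MiniLanguageModel {
    func generate(transcript: MiniTranscript) -> MiniTranscript.Entry {
        if case let .prompt(text)? = transcript.entries.last {
            return .response("echo: \(text)")
        }
        return .response("echo: (empty transcript)")
    }
}

// The session-side code owns the transcript and appends both sides of the exchange
var transcript = MiniTranscript()
transcript.entries.append(.prompt("Hello"))
let reply = EchoModel().generate(transcript: transcript)
transcript.entries.append(reply)
if case let .response(text) = reply {
    print(text) // prints "echo: Hello"
}
```

Because the model derives its output entirely from the transcript it is handed, any conforming implementation can be swapped in without the session changing how it manages history.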

Development

Build

swift build

Format

swift-format --in-place --recursive Sources/ Tests/

Documentation

swift package generate-documentation

Ecosystem

OpenFoundationModels provides a complete ecosystem with core framework, provider integrations, and sample applications:

๐Ÿ—๏ธ Core Framework

  • OpenFoundationModels - Apple Foundation Models compatible core framework
  • 100% API compatibility with Apple's official specification
  • 328 tests passing with comprehensive coverage

🔗 Provider Integrations

  • OpenFoundationModels-OpenAI ✅ Complete

    • Full GPT model support (GPT-4o, GPT-4o Mini, GPT-4 Turbo, o1, o1-pro, o3, o3-pro, o4-mini)
    • Streaming and multimodal capabilities
    • Production-ready with rate limiting and error handling
  • OpenFoundationModels-MLX ✅ Complete

    • Local LLM inference using Apple MLX framework
    • Optimized for Apple Silicon (M1/M2/M3/M4)
    • No API key required - fully on-device

📱 Sample Applications

  • OpenFoundationModels-Samples ✅ Complete
    • foundation-chat: On-device chat using Apple's SystemLanguageModel
    • openai-chat: Cloud-based chat using OpenAI models
    • Interactive CLI applications with full streaming support

🔮 Planned Integrations

Provider adapters can be added for:

  • Anthropic (Claude 3 Haiku, Sonnet, Opus, etc.)
  • Google (Gemini Pro, Ultra, etc.)
  • Ollama (Local models via Ollama)
  • Azure OpenAI Service
  • AWS Bedrock
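A new provider adapter mostly amounts to mapping the transcript into the provider's request format and mapping the reply back. The shape of that adapter can be sketched with a simplified local protocol and a closure standing in for the provider's HTTP client (all names here are illustrative, not the framework's API):

```swift
// Simplified local protocol standing in for the framework's LanguageModel
protocol ProviderModel {
    func generate(prompt: String) -> String
}

// An adapter wraps any backend that can map a prompt to text
struct ClosureBackedModel: ProviderModel {
    // Stand-in for a provider SDK or URLSession request; a real adapter
    // would be async, throwing, and streaming
    let send: (String) -> String

    func generate(prompt: String) -> String {
        send(prompt)
    }
}

// Plug in a stub backend; a real adapter would call the provider's API
let model = ClosureBackedModel { prompt in "stub reply to: \(prompt)" }
print(model.generate(prompt: "hi")) // prints "stub reply to: hi"
```

Because the session only depends on the protocol, adding a provider (Anthropic, Gemini, Ollama, and so on) never requires changes to application code.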

Why Choose OpenFoundationModels?

For Developers

  • Zero Learning Curve: If you know Apple's API, you already know ours
  • Platform Freedom: Deploy anywhere - cloud, edge, mobile, embedded
  • Provider Flexibility: Switch LLMs without changing code
  • Type Safety: Catch errors at compile time, not runtime

For Businesses

  • Vendor Independence: No lock-in to Apple or any LLM provider
  • Cost Control: Use local models or choose the most cost-effective provider
  • Compliance Ready: Keep data on-premise with local models
  • Future Proof: Easy migration path when Apple's API goes public

Testing

# Run all 328 tests
swift test
# Test specific components
swift test --filter GenerableTests # @Generable macro
swift test --filter LanguageModelTests # Core functionality
swift test --filter StreamingTests # Async streaming

Contributing

We welcome contributions! See CONTRIBUTING.md for guidelines.

# Quick start
git clone https://github.com/1amageek/OpenFoundationModels.git
cd OpenFoundationModels
swift test # Verify setup
# Make your changes
git checkout -b feature/your-feature
# Submit PR

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

  • Apple for the Foundation Models framework design and API
  • The Swift community for excellent concurrency and macro tools
  • Contributors and early adopters

Related Projects

Official OpenFoundationModels Extensions

Community Swift AI Projects


Note: This is an independent open-source implementation and is not affiliated with Apple Inc. Apple, Foundation Models, and related trademarks are property of Apple Inc.
