
docs: add community provider-huggingface module#216

Open
michaeljabbour wants to merge 2 commits into microsoft:main from michaeljabbour:docs/add-provider-huggingface

Conversation

@michaeljabbour
Contributor

Adds the HuggingFace Inference API provider module to the community modules catalog.

Module Details

What it does

Integrates HuggingFace models into Amplifier via the OpenAI-compatible chat completions endpoint. Supports both the Serverless Inference API and dedicated Inference Endpoints.

Features:

  • Full Amplifier Provider protocol (structural typing)
  • Lazy AsyncInferenceClient initialization
  • Tool call support with missing-result repair
  • Event emission (llm:request/response at info/debug/raw levels)
  • Curated model list (Llama 3.x, Qwen 2.5, Mixtral, DeepSeek R1, Phi-3)
  • Configurable base_url for Inference Endpoints

Built following the same patterns as provider-anthropic, provider-openai, and provider-ollama.
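The "missing-result repair" listed above presumably patches conversations where an assistant tool call has no matching tool-result message, which OpenAI-compatible APIs reject. A hedged sketch of that idea (function name and placeholder text are assumptions, not the module's API):

```python
def repair_missing_tool_results(messages: list[dict]) -> list[dict]:
    """Insert placeholder results for tool calls that never got an answer."""
    # Collect the ids of tool calls that already have a result anywhere
    # in the conversation.
    answered = {m.get("tool_call_id") for m in messages if m.get("role") == "tool"}
    repaired = []
    for msg in messages:
        repaired.append(msg)
        if msg.get("role") == "assistant":
            for call in msg.get("tool_calls", []):
                if call["id"] not in answered:
                    # Placeholder keeps the message sequence valid for the API.
                    repaired.append({
                        "role": "tool",
                        "tool_call_id": call["id"],
                        "content": "[no result recorded]",
                    })
    return repaired
```

Conversations whose tool calls are all answered pass through unchanged.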

🤖 Generated with Amplifier

michaeljabbour and others added 2 commits December 19, 2025 12:38
New modules from @michaeljabbour:
- tool-memory: Persistent memory with FTS5, observation types, sessions
- context-memory: Progressive disclosure context injection
- hooks-memory-capture: Automatic observation capture from tool:post
- hooks-event-broadcast: Transport-agnostic event relay

Includes bundling recommendations and feature summary.

🤖 Generated with [Amplifier](https://github.com/microsoft/amplifier)

Co-Authored-By: Amplifier <240397093+microsoft-amplifier@users.noreply.github.com>