Conversation
- Rewrite root README: SDK quickstart first, CLI section minimized, E2E solution messaging, links to MS Learn docs
- Create samples/README.md: top-level overview of all sample languages
- Create samples/js/README.md: list all 12 JS samples with run instructions
- Create samples/python/README.md: list all 9 Python samples with run instructions
- Update samples/cs/README.md: add missing LiveAudioTranscription samples
- Update samples/rust/README.md: add 4 tutorial samples, table format

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Pull request overview
Updates and standardizes README documentation to better match the current Foundry Local product positioning, and adds/refreshes language-specific sample indexes and run instructions.
Changes:
- Reworks the root `README.md` messaging, quickstarts, and CLI section to align with the “ship on-device AI” narrative.
- Adds new README entrypoints for Python and JavaScript samples and refreshes Rust/C# sample listings.
- Introduces a `samples/README.md` hub that links to language sample directories and summarizes coverage.
Reviewed changes
Copilot reviewed 6 out of 6 changed files in this pull request and generated 10 comments.
Show a summary per file
| File | Description |
|---|---|
| README.md | Updates product overview, quickstarts, samples table, CLI info, and FAQs to align with current offering. |
| samples/README.md | Adds a language-indexed samples hub README. |
| samples/cs/README.md | Expands the C# sample list and trims some running guidance text. |
| samples/js/README.md | Adds a JavaScript samples index and “running a sample” instructions. |
| samples/python/README.md | Adds a Python samples index and “running a sample” instructions. |
| samples/rust/README.md | Refreshes Rust samples listing and adds “running a sample” plus a Windows WinML note. |
> If you developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:
>
> ```bash
> cd native-chat-completions
> pip install foundry-local-sdk-winl
> ```

The Windows install command uses `foundry-local-sdk-winl`, but elsewhere in this repo (README.md) the package name is `foundry-local-sdk-winml`. This inconsistency is likely a typo and will cause the documented command to fail. Please align the package name across docs (and verify the published package name).

Suggested change:

```diff
-If you developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:
+If you are developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:
 cd native-chat-completions
-pip install foundry-local-sdk-winl
+pip install foundry-local-sdk-winml
```
> If you developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:
>
> ```bash
> cd native-chat-completions
> npm install foundry-local-sdk-winl
> ```

The Windows install command uses `foundry-local-sdk-winl`, but the root README.md uses `foundry-local-sdk-winml`. This mismatch is likely a typo and will cause the documented command to fail. Please standardize on the correct package name throughout the repository.

Suggested change:

```diff
-If you developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:
+If you are developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:
 cd native-chat-completions
-npm install foundry-local-sdk-winl
+npm install foundry-local-sdk-winml
```
> 2. Navigate to a sample and install dependencies:
>
> If you developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:

Fix grammar: change 'If you developing' to 'If you are developing'.

Suggested change:

```diff
-If you developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:
+If you are developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:
```
> 2. Navigate to a sample and install dependencies:
>
> If you developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:

Fix grammar: change 'If you developing' to 'If you are developing'.

Suggested change:

```diff
-If you developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:
+If you are developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:
```
> ```python
> messages = [
>     {"role": "user", "content": "What is the golden ratio?"}
> ]
> response = client.complete_chat(messages):
> ```

The Python quickstart snippet has a syntax error: the trailing `:` on the `response = ...` line will raise a `SyntaxError`, and `print(...)` will not be correctly indented/executed as intended. Please correct the call line so the snippet runs as pasted.

Suggested change:

```diff
-response = client.complete_chat(messages):
+response = client.complete_chat(messages)
```
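The failure mode is easy to confirm with Python's built-in `compile()` — a quick sketch (only syntax is checked here; the undefined `client` and `messages` names would matter only at runtime):

```python
def compiles(src: str) -> bool:
    """Return True if `src` parses as valid Python syntax."""
    try:
        compile(src, "<snippet>", "exec")
        return True
    except SyntaxError:
        return False

# The line as published, with the stray trailing colon.
bad = 'response = client.complete_chat(messages):'
# The line from the suggested change.
good = 'response = client.complete_chat(messages)'

print(compiles(bad))   # False - SyntaxError at the trailing ':'
print(compiles(good))  # True - valid syntax; name resolution happens at runtime
```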
> || Sample | Description |
> |--------|-------------|
> | [native-chat-completions](native-chat-completions/) | Non-streaming and streaming chat completions using the native chat client. |
> | [audio-transcription-example](audio-transcription-example/) | Audio transcription (non-streaming and streaming) using the Whisper model. |
> | [foundry-local-webserver](foundry-local-webserver/) | Start a local OpenAI-compatible web server and call it with a standard HTTP client. |
> | [tool-calling-foundry-local](tool-calling-foundry-local/) | Tool calling with streaming responses, multi-turn conversation, and local tool execution. |

The samples table markup currently starts with `||` in the diff (`|| Sample | Description |`), which renders as an extra empty column in GitHub Markdown. Use a single leading `|` for the header/separator/rows to render a 2-column table correctly.
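As a minimal illustration of the fix (generic cell values, not the actual sample rows) — the header row and every data row should begin with a single `|`:

```markdown
<!-- Before: doubled leading pipe on the header row -->
|| Sample | Description |
|--------|-------------|

<!-- After: single leading pipe throughout -->
| Sample | Description |
|--------|-------------|
| a | b |
```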
> || Sample | Description |
> |--------|-------------|
> | [native-chat-completions](native-chat-completions/) | Initialize the SDK, start the local service, and run streaming chat completions. |
> | [audio-transcription](audio-transcription/) | Transcribe audio files using the Whisper model. |

The samples table markup currently starts with `||` in the diff, which renders as an unintended empty column in GitHub Markdown. Use a single leading `|` for the table header/separator/rows.
> || Sample | Description |
> |--------|-------------|
> | [native-chat-completions](native-chat-completions/) | Initialize the SDK, download a model, and run non-streaming and streaming chat completions. |
> | [audio-transcription-example](audio-transcription-example/) | Transcribe audio files using the Whisper model with streaming output. |

The samples table markup currently starts with `||` in the diff, which renders as an unintended empty column in GitHub Markdown. Use a single leading `|` for the table header/separator/rows.
> ```bash
> cd native-chat-completions
> cargo run
> ```
>
> >[!NOTE]

GitHub admonitions require a space after `>` (e.g., `> [!NOTE]`). As written (`>[!NOTE]`), it may not render as a note callout.

Suggested change:

```diff
->[!NOTE]
+> [!NOTE]
```
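For reference, the GitHub-flavored callout syntax with the required space — the content lines also carry the `> ` blockquote prefix:

```markdown
> [!NOTE]
> This renders as a highlighted note callout on GitHub.
```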
> If you are developing or shipping on **Windows**, you should update the sample's `Cargo.toml` file to include the WinML feature - this integrates with WinML to provide a greater breadth of hardware acceleration support.
>
> ```toml
> foundry-local-sdk = { features = ["winml"] }
> ```

This TOML snippet is likely incomplete/misleading for users copying it into Cargo.toml: it omits at least the dependency version (and may replace an existing dependency line). Consider rephrasing to explicitly show modifying the existing `foundry-local-sdk` dependency entry (e.g., adding `features = ["winml"]` alongside the existing version), or include a placeholder version to avoid invalid cargo manifests.

Suggested change:

```diff
-foundry-local-sdk = { features = ["winml"] }
+foundry-local-sdk = { version = "x.y.z", features = ["winml"] }
```
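A sketch of the full `[dependencies]` entry the note could show instead — the version string here is a placeholder, not the published crate version:

```toml
[dependencies]
# Replace "x.y.z" with the version already pinned in the sample's Cargo.toml.
foundry-local-sdk = { version = "x.y.z", features = ["winml"] }
```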
> | [tutorial-document-summarizer](tutorial-document-summarizer/) | Summarize documents with AI (tutorial). |
> | [tutorial-tool-calling](tutorial-tool-calling/) | Create a tool-calling assistant (tutorial). |
> | [tutorial-voice-to-text](tutorial-voice-to-text/) | Transcribe and summarize audio (tutorial). |
> | [live-audio-transcription-example](live-audio-transcription-example/) | Real-time microphone-to-text transcription using NAudio (Windows). |

Will this be ready by GA time? I assume it lands once we have the sample ready, but if the model is not ready, it may cause confusion.
Update README to better align the product offering.
Updated sample READMEs.