
Samuel100/update readmes#573

Draft
samuel100 wants to merge 2 commits into main from samuel100/update-readmes

Conversation

@samuel100
Contributor

Update README to better align the product offering.

Updated sample READMEs.

samuel100 and others added 2 commits April 1, 2026 15:24
- Rewrite root README: SDK quickstart first, CLI section minimized,
  E2E solution messaging, links to MS Learn docs
- Create samples/README.md: top-level overview of all sample languages
- Create samples/js/README.md: list all 12 JS samples with run instructions
- Create samples/python/README.md: list all 9 Python samples with run instructions
- Update samples/cs/README.md: add missing LiveAudioTranscription samples
- Update samples/rust/README.md: add 4 tutorial samples, table format

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Copilot AI review requested due to automatic review settings April 1, 2026 17:09
@vercel

vercel bot commented Apr 1, 2026

The latest updates on your projects.

| Project | Deployment | Actions | Updated (UTC) |
|---------|------------|---------|---------------|
| foundry-local | Ready | Preview, Comment | Apr 1, 2026 5:09pm |


@samuel100 samuel100 marked this pull request as draft April 1, 2026 17:09
Contributor

Copilot AI left a comment


Pull request overview

Updates and standardizes README documentation to better match the current Foundry Local product positioning, and adds/refreshes language-specific sample indexes and run instructions.

Changes:

  • Reworks the root README.md messaging, quickstarts, and CLI section to align with the “ship on-device AI” narrative.
  • Adds new README entrypoints for Python and JavaScript samples and refreshes Rust/C# sample listings.
  • Introduces a samples/README.md hub that links to language sample directories and summarizes coverage.

Reviewed changes

Copilot reviewed 6 out of 6 changed files in this pull request and generated 10 comments.

| File | Description |
|------|-------------|
| README.md | Updates product overview, quickstarts, samples table, CLI info, and FAQs to align with current offering. |
| samples/README.md | Adds a language-indexed samples hub README. |
| samples/cs/README.md | Expands the C# sample list and trims some running guidance text. |
| samples/js/README.md | Adds a JavaScript samples index and “running a sample” instructions. |
| samples/python/README.md | Adds a Python samples index and “running a sample” instructions. |
| samples/rust/README.md | Refreshes Rust samples listing and adds “running a sample” plus a Windows WinML note. |


Comment on lines +34 to +38
If you developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:

```bash
cd native-chat-completions
pip install foundry-local-sdk-winl
```

Copilot AI Apr 1, 2026


The Windows install command uses foundry-local-sdk-winl, but elsewhere in this repo (README.md) the package name is foundry-local-sdk-winml. This inconsistency is likely a typo and will cause the documented command to fail. Please align the package name across docs (and verify the published package name).

Suggested change (before):

If you developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:

```bash
cd native-chat-completions
pip install foundry-local-sdk-winl
```

Suggested change (after):

If you are developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:

```bash
cd native-chat-completions
pip install foundry-local-sdk-winml
```

Comment on lines +37 to +41
If you developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:

```bash
cd native-chat-completions
npm install foundry-local-sdk-winl
```

Copilot AI Apr 1, 2026


The Windows install command uses foundry-local-sdk-winl, but the root README.md uses foundry-local-sdk-winml. This mismatch is likely a typo and will cause the documented command to fail. Please standardize on the correct package name throughout the repository.

Suggested change (before):

If you developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:

```bash
cd native-chat-completions
npm install foundry-local-sdk-winl
```

Suggested change (after):

If you are developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:

```bash
cd native-chat-completions
npm install foundry-local-sdk-winml
```


2. Navigate to a sample and install dependencies:

If you developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:

Copilot AI Apr 1, 2026


Fix grammar: change 'If you developing' to 'If you are developing'.

Suggested change (before):

If you developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:

Suggested change (after):

If you are developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:


2. Navigate to a sample and install dependencies:

If you developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:

Copilot AI Apr 1, 2026


Fix grammar: change 'If you developing' to 'If you are developing'.

Suggested change (before):

If you developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:

Suggested change (after):

If you are developing or shipping on **Windows**, use the Windows version - it has the same API surface area but integrates with WinML for a greater breadth of hardware acceleration:

```python
messages = [
    {"role": "user", "content": "What is the golden ratio?"}
]
response = client.complete_chat(messages):
```

Copilot AI Apr 1, 2026


The Python quickstart snippet has a syntax error: the trailing : on the response = ... line will raise a SyntaxError, and print(...) will not be correctly indented/executed as intended. Please correct the call line so the snippet runs as pasted.

Suggested change (before):

```python
response = client.complete_chat(messages):
```

Suggested change (after):

```python
response = client.complete_chat(messages)
```

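For reference, the corrected snippet is valid as pasted. Below is a minimal runnable sketch of the fixed call pattern; `StubChatClient` and its `complete_chat` method are stand-ins for the Foundry Local SDK client (the real class and method names are assumptions here), so the snippet runs without the SDK installed:

```python
# Hedged sketch: a stub standing in for the Foundry Local client, to show
# the corrected call syntax (no trailing colon on the assignment line).
class StubChatClient:
    def complete_chat(self, messages):
        # The real SDK would send the messages to the local model; this stub
        # just returns a canned assistant reply.
        return {"role": "assistant",
                "content": f"received {len(messages)} message(s)"}

client = StubChatClient()
messages = [
    {"role": "user", "content": "What is the golden ratio?"}
]
response = client.complete_chat(messages)  # note: no trailing ':'
print(response["content"])
```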
Comment on lines +11 to +16
| Sample | Description |
|--------|-------------|
| [native-chat-completions](native-chat-completions/) | Non-streaming and streaming chat completions using the native chat client. |
| [audio-transcription-example](audio-transcription-example/) | Audio transcription (non-streaming and streaming) using the Whisper model. |
| [foundry-local-webserver](foundry-local-webserver/) | Start a local OpenAI-compatible web server and call it with a standard HTTP client. |
| [tool-calling-foundry-local](tool-calling-foundry-local/) | Tool calling with streaming responses, multi-turn conversation, and local tool execution. |

Copilot AI Apr 1, 2026


The samples table markup currently starts with || in the diff (|| Sample | Description |), which renders as an extra empty column in GitHub Markdown. Use a single leading | for the header/separator/rows to render a 2-column table correctly.

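To see why the doubled pipe adds a column, here is a simplified sketch of GitHub-style table-row parsing (this is an illustrative approximation, not GitHub's actual parser): at most one leading and one trailing pipe act as row delimiters, and every remaining `|` separates cells, so `||` yields a leading empty cell:

```python
def cells(row: str) -> list[str]:
    # Simplified GFM row parsing: strip at most ONE leading and ONE
    # trailing pipe, then split the remainder on '|'.
    r = row.strip()
    if r.startswith("|"):
        r = r[1:]
    if r.endswith("|"):
        r = r[:-1]
    return [c.strip() for c in r.split("|")]

print(cells("|| Sample | Description |"))  # ['', 'Sample', 'Description'] -> 3 columns
print(cells("| Sample | Description |"))   # ['Sample', 'Description'] -> 2 columns
```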
Comment on lines +11 to +14
| Sample | Description |
|--------|-------------|
| [native-chat-completions](native-chat-completions/) | Initialize the SDK, start the local service, and run streaming chat completions. |
| [audio-transcription](audio-transcription/) | Transcribe audio files using the Whisper model. |

Copilot AI Apr 1, 2026


The samples table markup currently starts with || in the diff, which renders as an unintended empty column in GitHub Markdown. Use a single leading | for the table header/separator/rows.

Comment on lines +11 to +14
| Sample | Description |
|--------|-------------|
| [native-chat-completions](native-chat-completions/) | Initialize the SDK, download a model, and run non-streaming and streaming chat completions. |
| [audio-transcription-example](audio-transcription-example/) | Transcribe audio files using the Whisper model with streaming output. |

Copilot AI Apr 1, 2026


The samples table markup currently starts with || in the diff, which renders as an unintended empty column in GitHub Markdown. Use a single leading | for the table header/separator/rows.

```bash
cd native-chat-completions
cargo run
```
>[!NOTE]

Copilot AI Apr 1, 2026


GitHub admonitions require a space after > (e.g., > [!NOTE]). As written (>[!NOTE]), it may not render as a note callout.

Suggested change (before): `>[!NOTE]`

Suggested change (after): `> [!NOTE]`

> If you are developing or shipping on **Windows**, you should update the sample's `Cargo.toml` file to include the WinML feature - this integrates with WinML to provide a greater breadth of hardware acceleration support.
>
> ```toml
> foundry-local-sdk = { features = ["winml"] }
> ```

Copilot AI Apr 1, 2026


This TOML snippet is likely incomplete/misleading for users copying it into Cargo.toml: it omits at least the dependency version (and may replace an existing dependency line). Consider rephrasing to explicitly show modifying the existing foundry-local-sdk dependency entry (e.g., adding features = [\"winml\"] alongside the existing version), or include a placeholder version to avoid invalid cargo manifests.

Suggested change (before): `foundry-local-sdk = { features = ["winml"] }`

Suggested change (after): `foundry-local-sdk = { version = "x.y.z", features = ["winml"] }`

| [tutorial-document-summarizer](tutorial-document-summarizer/) | Summarize documents with AI (tutorial). |
| [tutorial-tool-calling](tutorial-tool-calling/) | Create a tool-calling assistant (tutorial). |
| [tutorial-voice-to-text](tutorial-voice-to-text/) | Transcribe and summarize audio (tutorial). |
| [live-audio-transcription-example](live-audio-transcription-example/) | Real-time microphone-to-text transcription using NAudio (Windows). |
Collaborator


Will this be ready by GA time? I assume it will be once we have the sample ready, but if the model is not ready, it may cause confusion.

Collaborator

@metang left a comment


some comment
