Conversation
Implement safeReadJson function to complement the existing safeWriteJson functionality:

- Uses stream-json for efficient processing of large JSON files
- Supports both full object reading and selective path extraction
- Provides file locking to prevent concurrent access
- Includes comprehensive error handling
- Adds complete test coverage
- Passes through all exceptions

This enables efficient and safe JSON reading operations throughout the codebase.

Signed-off-by: Eric Wheeler <roo-code@z.ewheeler.org>
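For context, a minimal sketch of how such a reader can be assembled with stream-json is below. The lock helper (proper-lockfile here) and every name in this snippet are assumptions standing in for the PR's actual `safeReadJson` and `_acquireLock` code, not a copy of it:

```ts
import { createReadStream } from "fs"
import { chain } from "stream-chain"
import { parser } from "stream-json"
import Assembler from "stream-json/Assembler"
import * as lockfile from "proper-lockfile"

// Sketch only: streams the file through a JSON token parser and rebuilds the value,
// so the raw text is never held in memory as one large string.
export async function safeReadJsonSketch(filePath: string): Promise<unknown> {
    const release = await lockfile.lock(filePath, { retries: 3 }) // block concurrent writers
    try {
        return await new Promise((resolve, reject) => {
            const tokens = chain([createReadStream(filePath), parser()])
            const assembler = Assembler.connectTo(tokens)
            assembler.on("done", (asm: Assembler) => resolve(asm.current))
            tokens.on("error", reject) // pass every exception through to the caller
        })
    } finally {
        await release()
    }
}
```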
Replace manual file reading and JSON parsing with the safer safeReadJson utility across multiple files in the codebase. This change:

- Provides atomic file access with proper locking to prevent race conditions
- Streams file contents efficiently for better memory usage
- Improves error handling consistency
- Reduces code duplication

Fixes: RooCodeInc#5331

Signed-off-by: Eric Wheeler <roo-code@z.ewheeler.org>
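A typical call-site conversion looks roughly like this (the path, function name, and import path are placeholders, not lines from this PR):

```ts
import { safeReadJson } from "../utils/safeReadJson"

// Hypothetical call site: settingsPath stands in for any JSON file that was previously
// read with fs.readFile + JSON.parse.
async function loadSettings(settingsPath: string): Promise<unknown> {
    // before: JSON.parse(await fs.readFile(settingsPath, "utf8"))
    return safeReadJson(settingsPath) // locked, streamed read; exceptions pass through
}
```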
Updated test files to properly mock and use safeReadJson/safeWriteJson:

- Added proper imports for safeReadJson from the safeWriteJson module
- Updated mock implementations to mock both functions correctly
- Replaced direct fs operations with calls to the safe functions
- Updated assertions to match the new behavior

This fixes all failing tests after the conversion to safeReadJson.

Signed-off-by: Eric Wheeler <roo-code@z.ewheeler.org>
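A representative Vitest setup for these tests might look like the sketch below; the module path, fixture value, and default behavior are illustrative assumptions rather than the PR's actual test code:

```ts
import { beforeEach, vi } from "vitest"
import { safeReadJson, safeWriteJson } from "../safeWriteJson"

// Replace both helpers with mocks for the whole suite.
vi.mock("../safeWriteJson", () => ({
    safeReadJson: vi.fn(),
    safeWriteJson: vi.fn(),
}))

beforeEach(() => {
    vi.mocked(safeReadJson).mockReset()
    vi.mocked(safeReadJson).mockResolvedValue({ version: 1 }) // placeholder fixture
    vi.mocked(safeWriteJson).mockReset()
    vi.mocked(safeWriteJson).mockResolvedValue(undefined)
})
```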
Pull Request Overview
This PR restores commits from a deleted branch (PR #5332) and implements the use of safe JSON file reading across the codebase. The main change introduces a new safeReadJson utility function to replace unsafe file reading patterns and ensures consistent error handling throughout the application.
- Introduction of `safeReadJson` utility for atomic JSON file reading with proper locking
- Refactoring of the `_acquireLock` function from `safeWriteJson` to be reusable
- Comprehensive replacement of unsafe `fs.readFile` + `JSON.parse` patterns across the codebase
Reviewed Changes
Copilot reviewed 23 out of 23 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| src/utils/safeWriteJson.ts | Extracts _acquireLock function for reuse and refactors existing code |
| src/utils/safeReadJson.ts | New utility providing atomic JSON reading with streaming and locking |
| src/utils/tests/safeReadJson.spec.ts | Comprehensive test suite for the new utility |
| .roo/rules/use-safeReadJson.md | Documentation rule requiring use of safeReadJson |
| Multiple files | Replacement of unsafe JSON reading patterns with safeReadJson |
```ts
// Reset safeReadJson to reject with ENOENT by default (no MDM config)
vi.mocked(safeReadJson).mockClear()
vi.mocked(safeReadJson).mockRejectedValue({ code: "ENOENT" })
```
The mock is rejecting with a plain object `{ code: "ENOENT" }` instead of an `Error` object. This should be `mockRejectedValue(Object.assign(new Error("ENOENT"), { code: "ENOENT" }))` to properly simulate a file system error.
```diff
- vi.mocked(safeReadJson).mockRejectedValue({ code: "ENOENT" })
+ vi.mocked(safeReadJson).mockRejectedValue(Object.assign(new Error("ENOENT"), { code: "ENOENT" }))
```
```diff
  try {
-     const cacheData = await vscode.workspace.fs.readFile(this.cachePath)
-     this.fileHashes = JSON.parse(cacheData.toString())
+     this.fileHashes = await safeReadJson(this.cachePath.fsPath)
```
This code expects `this.cachePath` to have a `fsPath` property (suggesting it's a `vscode.Uri`), but `safeReadJson` expects a string path. Consider using `this.cachePath.fsPath` only if `this.cachePath` is indeed a `Uri`, or use `this.cachePath` directly if it's already a string.
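One way to satisfy both cases, sketched with a hypothetical helper (not code from this PR):

```ts
import type { Uri } from "vscode"

// Normalize either a vscode.Uri or a plain string into the filesystem path string
// that safeReadJson expects.
function toFsPath(p: Uri | string): string {
    return typeof p === "string" ? p : p.fsPath
}

// e.g. this.fileHashes = await safeReadJson(toFsPath(this.cachePath))
```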
Thank you for restoring PR #5332! I've reviewed the implementation of safeReadJson and found it addresses the memory usage issues well. However, there are some critical issues that need attention before merging.
Critical Issues:
- The PR has merge conflicts that must be resolved
- The test mock issue noted by Copilot is still present
Additional findings:
- Documentation inconsistency with the jsonPath parameter
- Inconsistent error handling patterns across files
- An unused import that should be cleaned up
```ts
// Reset safeReadJson to reject with ENOENT by default (no MDM config)
vi.mocked(safeReadJson).mockClear()
vi.mocked(safeReadJson).mockRejectedValue({ code: "ENOENT" })
```
This issue from Copilot's review is still valid. The mock is rejecting with a plain object `{ code: "ENOENT" }` instead of an `Error` object. Consider:
```diff
- vi.mocked(safeReadJson).mockRejectedValue({ code: "ENOENT" })
+ vi.mocked(safeReadJson).mockRejectedValue(Object.assign(new Error("ENOENT"), { code: "ENOENT" }))
```
```diff
@@ -0,0 +1,33 @@
+# JSON File Reading Must Be Safe and Atomic
+
+- You MUST use `safeReadJson(filePath: string, jsonPath?: string | string[]): Promise<any>` from `src/utils/safeReadJson.ts` to read JSON files
```
The documentation mentions a `jsonPath?: string | string[]` parameter, but the actual implementation of `safeReadJson` doesn't support it. Either update the documentation to match the implementation or add `jsonPath` support to the function.
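For reference, selective extraction is what stream-json's `Pick` filter is built for; a rough sketch of what `jsonPath` support could look like (illustrative only, not the current implementation):

```ts
import { createReadStream } from "fs"
import { chain } from "stream-chain"
import { parser } from "stream-json"
import { pick } from "stream-json/filters/Pick"
import { streamValues } from "stream-json/streamers/StreamValues"

// `jsonPath` here uses stream-json's dotted-path filter syntax, e.g. "settings.theme".
function readJsonAtPath(filePath: string, jsonPath: string): Promise<unknown> {
    return new Promise((resolve, reject) => {
        const pipeline = chain([
            createReadStream(filePath),
            parser(),
            pick({ filter: jsonPath }), // keep only tokens under the selected subtree
            streamValues(), // reassemble those tokens into a JS value
        ])
        let result: unknown
        pipeline.on("data", (data: { value: unknown }) => (result = data.value))
        pipeline.on("end", () => resolve(result))
        pipeline.on("error", reject)
    })
}
```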
```ts
import * as fsSync from "fs"
import * as path from "path"
import * as Parser from "stream-json/Parser"
import * as Pick from "stream-json/filters/Pick"
```
The `Pick` import from stream-json is not used anywhere in this file. Consider removing it:
```diff
  import * as Parser from "stream-json/Parser"
- import * as Pick from "stream-json/filters/Pick"
  import * as StreamValues from "stream-json/streamers/StreamValues"
```
```ts
const content = await fs.readFile(configPath, "utf-8")
const config = JSON.parse(content)
// This is a read-modify-write-operation, but we cannot
// use safeWriteJson because it does not (yet) support pretty printing.
```
Multiple comments mention "cannot use safeWriteJson because it does not (yet) support pretty printing". Should this be tracked as a feature request? Pretty printing support would eliminate these read-modify-write operations that bypass the safe write mechanism.
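For reference, a pretty-printing write on top of a temp-file-and-rename swap could look roughly like this; it is a sketch of the requested feature under assumed names, not the existing `safeWriteJson`:

```ts
import { promises as fs } from "fs"

// Hypothetical helper: pretty-print, write to a temp file, then atomically rename it
// over the target so readers never observe a half-written file. Locking is omitted.
async function writeJsonPretty(filePath: string, data: unknown, indent = 2): Promise<void> {
    const tmpPath = `${filePath}.${Date.now()}.tmp`
    await fs.writeFile(tmpPath, JSON.stringify(data, null, indent), "utf8")
    await fs.rename(tmpPath, filePath)
}
```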
```diff
- const exists = await fileExistsAtPath(filePath)
- return exists ? JSON.parse(await fs.readFile(filePath, "utf8")) : undefined
+ try {
+     return await safeReadJson(filePath)
```
Is it intentional to silently return `undefined` for all errors, not just `ENOENT`? This could hide actual issues like permission errors or corrupted JSON. Consider:
```diff
- return await safeReadJson(filePath)
+ try {
+     return await safeReadJson(filePath)
+ } catch (error: any) {
+     if (error.code === "ENOENT") {
+         return undefined
+     }
+     throw error
+ }
```
Restores commits from #5332 after the source branch was deleted. Preserves original commit authorship via refs/pull/5332 fetch. Supersedes #5332. Original PR: #5332
Important
Restores changes from a deleted branch, introducing `safeReadJson` for atomic JSON file reading and updating the codebase and tests to use this utility.

- Adds `safeReadJson` in `safeReadJson.ts` for atomic and safe JSON file reading using streams and locks.
- Uses `safeReadJson` in `modelCache.ts`, `modelEndpointCache.ts`, `importExport.ts`, `FileContextTracker.ts`, `apiMessages.ts`, `taskMessages.ts`, `ClineProvider.ts`, `extract-text.ts`, `MarketplaceManager.ts`, `SimpleInstaller.ts`, `McpHub.ts`, `MdmService.ts`, and `autoImportSettings.spec.ts`.
- Updates `cache-manager.spec.ts`, `SimpleInstaller.spec.ts`, `McpHub.spec.ts`, `MdmService.spec.ts`, `autoImportSettings.spec.ts`, and `safeReadJson.spec.ts` to mock and test `safeReadJson`.
- Tests `safeReadJson` in `safeReadJson.spec.ts` to cover various scenarios, including errors and edge cases.
- Adds `.roo/rules/use-safeReadJson.md` to enforce the use of `safeReadJson` for JSON file reading.

This description was created automatically for b0ec3f4 and will update as commits are pushed.