fix 16 kB alignment issue returns with computer visio #1010
Daniel-ADFA merged 5 commits into stage from
Conversation
Walkthrough

Two Gradle build configuration files were updated: JNI packaging behavior was reconfigured in the Android build, and the TensorFlow Lite dependencies were replaced with their LiteRT equivalents, while ML Kit Text Recognition was updated to a newer patch version.
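As a rough sketch of what the changes described above could look like in the module's `build.gradle.kts` (the artifact versions and the exact packaging setting are assumptions inferred from context, not taken from the PR diff):

```kotlin
android {
    packaging {
        jniLibs {
            // Uncompressed, page-aligned .so files are required on
            // 16 KB page-size devices; legacy (compressed) packaging
            // breaks that alignment. (Setting assumed from context.)
            useLegacyPackaging = false
        }
    }
}

dependencies {
    // Replaced: legacy TensorFlow Lite artifacts, whose prebuilt native
    // libraries are not 16 KB page-aligned.
    // implementation("org.tensorflow:tensorflow-lite:2.14.0")        // version hypothetical
    // implementation("org.tensorflow:tensorflow-lite-support:0.4.4") // version hypothetical

    // Added: LiteRT equivalents, which ship 16 KB-aligned native libraries.
    implementation("com.google.ai.edge.litert:litert:1.0.1")          // version assumed
    implementation("com.google.ai.edge.litert:litert-support:1.0.1")  // version assumed

    // ML Kit Text Recognition bumped to a newer patch release (version assumed).
    implementation("com.google.mlkit:text-recognition:16.0.1")
}
```

The packaging and dependency changes go together: swapping to LiteRT only resolves the 16 kB alignment warning if the `.so` files are also packaged uncompressed so the aligned layout survives into the APK.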
Estimated code review effort: 🎯 1 (Trivial) | ⏱️ ~3 minutes
Pre-merge checks: 1 passed, 1 warning, 1 inconclusive.
Actionable comments posted: 2
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@cv-image-to-xml/build.gradle.kts`:
- Around lines 41-45: YoloModelSource.kt still imports and constructs legacy
TensorFlow Lite types (Interpreter, TensorImage, TensorBuffer, ImageProcessor,
etc.), breaking compatibility with the newly added LiteRT libraries. Update
YoloModelSource.kt to replace those usages with the LiteRT equivalents from
com.google.ai.edge.litert (the LiteRT Interpreter/client APIs and the
TensorImage/TensorBuffer and image-preprocessing replacements LiteRT provides),
remove or replace the org.tensorflow.lite.* imports, and map the model-loading,
invocation, and tensor-handling code (wherever functions or constructors
reference Interpreter, ImageProcessor, TensorImage, or TensorBuffer) to the
corresponding LiteRT APIs so that inference and preprocessing use the LiteRT
classes and methods consistently.
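A hedged sketch of what the migration in YoloModelSource.kt might involve. The file's actual contents are not shown in this review, and the exact package layout depends on the resolved litert artifact version, so treat all of the following as assumptions to verify:

```kotlin
// YoloModelSource.kt (sketch; the real file is not shown in this review).
//
// With the LiteRT 1.x Maven artifacts (com.google.ai.edge.litert:litert and
// litert-support), the runtime classes have historically still been published
// under the org.tensorflow.lite package, so imports like these may continue
// to resolve after the dependency-coordinate swap — verify against the
// artifacts Gradle actually resolves:
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.support.image.ImageProcessor
import org.tensorflow.lite.support.image.TensorImage
import org.tensorflow.lite.support.tensorbuffer.TensorBuffer

// If the project instead adopts the newer LiteRT-native Kotlin API, the
// org.tensorflow.lite.* imports would be replaced with classes from the
// com.google.ai.edge.litert package, and the interpreter construction,
// invocation, and tensor-handling calls would need to be rewritten against
// that API. Consult the LiteRT documentation for the exact class names
// before rewriting the inference path.
```

If the litert artifacts do re-export the legacy package names, the Kotlin source may compile unchanged and only the Gradle coordinates need to move; the review comment's concern applies in full only when the LiteRT-native API is adopted.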
In `@gradle.properties`:
- Around lines 34-35: the Gradle property android.enableR8.fullMode is
ineffective because proguard-rules.pro contains -dontshrink, which disables
shrinking. Either remove android.enableR8.fullMode from gradle.properties if
minification is intentionally off, or remove the -dontshrink entry from
proguard-rules.pro and run release builds (using the existing keep rules) to
verify that minification with full mode succeeds. Update whichever file carries
the chosen change and re-run a release build to confirm there are no
missing-keep issues.
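A minimal sketch of the second option (re-enabling shrinking so full-mode R8 takes effect). The keep rule shown is hypothetical, not taken from the project's actual proguard-rules.pro:

```
# proguard-rules.pro (sketch)

# -dontshrink        <- remove this line so R8 shrinking takes effect

# Existing keep rules stay in place, e.g. (hypothetical):
# -keep class com.example.yolo.** { *; }
```

With -dontshrink gone, android.enableR8.fullMode=true in gradle.properties applies full-mode optimizations to release builds; run a release build afterwards and watch for missing-keep failures at runtime.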