Problem with large files #16

@cyrillef

Description

We got an Out of Memory exception while uploading a large file (2.7GB).
We also tried using the objectAPI.uploadChunk with the same result.
Playing with the Xmx had no effect.
Increasing the instance memory to 16GB didn’t either.
Any ideas?

Here’s the command line:
java -jar autodesk-console.jar -Xmx16000m …

Here’s the stack:
Exception in thread "main" java.lang.OutOfMemoryError: Required array size too large
at java.nio.file.Files.readAllBytes(Files.java:3156)
at com.autodesk.client.ApiClient.getAPIResponse(ApiClient.java:476)
at com.autodesk.client.ApiClient.invokeAPI(ApiClient.java:555)
at com.autodesk.client.api.ObjectsApi.uploadObject(ObjectsApi.java:696)
at com.datumate.autodesk.application.service.handlers.impl.AutodeskBucketHandlerImpl.uploadFile(AutodeskBucketHandlerImpl.java:103)
at com.datumate.autodesk.application.service.UploadTextureModelFlow.uploadFilesToAutodeskS3AndGetUrn(UploadTextureModelFlow.java:98)
at com.datumate.autodesk.application.service.UploadFileFlow.downloadFileFromDatumateS3AndUploadToAutodeskS3(UploadFileFlow.java:19)
at com.datumate.autodesk.application.impl.Menu.handleUserSelection(Menu.java:102)
at com.datumate.autodesk.application.impl.Menu.operate(Menu.java:37)
at com.datumate.autodesk.application.impl.AutodeskApplication.start(AutodeskApplication.java:19)
at com.datumate.autodesk.application.App.main(App.java:22)

The exception comes from java.nio.file.Files.readAllBytes, which the AD API calls internally. readAllBytes loads the entire file into a single byte array, so it is not recommended for large files (that is exactly where the exception is thrown). Streaming the file through the Java InputStream API is recommended instead.
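For reference, here is a minimal sketch of the streaming approach: the file is read through an InputStream in fixed-size chunks, so only one chunk is ever in memory, and a Content-Range header value is computed for each chunk. The actual upload call (e.g. objectsApi.uploadChunk) is deliberately left as a comment, since its exact signature is not shown here; only standard-library code is used.

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class ChunkedUploadSketch {

    // Reads the file through an InputStream in fixed-size chunks and returns
    // the Content-Range header value that would accompany each chunk.
    // Only `chunkSize` bytes are held in memory at a time, unlike
    // Files.readAllBytes, which allocates one array for the whole file.
    static List<String> contentRanges(Path file, int chunkSize) throws IOException {
        long total = Files.size(file);
        List<String> ranges = new ArrayList<>();
        try (InputStream in = Files.newInputStream(file)) {
            byte[] buffer = new byte[chunkSize];
            long offset = 0;
            int read;
            while ((read = in.readNBytes(buffer, 0, chunkSize)) > 0) {
                // A real client would send buffer[0..read) here with this
                // header, e.g. via the OSS resumable-upload endpoint
                // (objectsApi.uploadChunk); call omitted on purpose.
                ranges.add("bytes " + offset + "-" + (offset + read - 1) + "/" + total);
                offset += read;
            }
        }
        return ranges;
    }

    public static void main(String[] args) throws IOException {
        // Tiny demo file: 12 bytes split into 5-byte chunks -> 3 chunks.
        Path tmp = Files.createTempFile("demo", ".bin");
        Files.write(tmp, new byte[12]);
        List<String> ranges = contentRanges(tmp, 5);
        System.out.println(ranges);
        Files.delete(tmp);
    }
}
```

The same loop works unchanged for a 2.7GB file, because memory use is bounded by the chunk size rather than the file size.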
