Chris Lu 134fd6a1ae fix: S3 remote storage cold-cache read fails with 'size reported but no content available' (#7817)
fix: S3 remote storage cold-cache read fails with 'size reported but no content available' (#7815)

When the initial caching attempt for a remote-only entry times out or fails,
streamFromVolumeServers() now detects this condition and retries caching
synchronously before streaming, mirroring how the filer server handles
remote-only entries.

Changes:
- Modified streamFromVolumeServers() to check entry.IsInRemoteOnly()
  before treating missing chunks as a data integrity error
- Added doCacheRemoteObject() as the core caching function (calls filer gRPC)
- Added buildRemoteObjectPath() helper to reduce code duplication
- Refactored cacheRemoteObjectWithDedup() and cacheRemoteObjectForStreaming()
  to reuse the shared functions
- Added integration tests for remote storage scenarios

Fixes https://github.com/seaweedfs/seaweedfs/issues/7815
2025-12-18 21:19:44 -08:00