
Add TODO documenting large file SSE-S3 copy limitation

The streaming copy approach encrypts the entire stream with a single IV
but stores the data in chunks with per-chunk IVs, so decryption fails
for large multi-chunk files. Small inline files work correctly.

This is a known architectural issue that needs separate work to fix.
pull/7598/head · chrislu, 2 days ago · parent commit 88f301a4ce
1 changed file with 8 additions: test/s3/sse/s3_sse_integration_test.go
@@ -2082,6 +2082,14 @@ func TestCopyToBucketDefaultEncryptedRegression(t *testing.T) {
 		require.NoError(t, err, "Failed to read object")
 		assertDataEqual(t, testData, data, "Data mismatch")
 	})
+	// TODO: Large file SSE-S3 copy has a known issue with streaming encryption
+	// The streaming copy encrypts the entire stream with a single IV, but stores
+	// data in multiple chunks with calculated per-chunk IVs. This causes decryption
+	// to fail because each chunk tries to decrypt with its per-chunk IV, but the
+	// data was encrypted with the base IV. This needs architectural changes to fix:
+	// either use chunk-by-chunk encryption like SSE-C/SSE-KMS, or store a single IV.
+	// For now, small inline files work correctly (the original #7562 bug fix).
 }
 // REGRESSION TESTS FOR CRITICAL BUGS FIXED
