"./operations.test -test.v -test.timeout 1h0m0s -remote TestS3R2: -verbose -test.run '^(TestCopyURL|TestDirMove|TestMoveFileBackupDir|TestMoveFileWithIgnoreExisting|TestRcatSize)$|^TestRcat$/^(withChecksum=false,ignoreChecksum=false|withChecksum=false,ignoreChecksum=true|withChecksum=true,ignoreChecksum=false|withChecksum=true,ignoreChecksum=true)$|^TestRcatMetadata$/^ViaDisk$'" - Starting (try 2/5) 2026/01/02 03:52:04 DEBUG : Creating backend with remote "TestS3R2:rclone-test-bomevoq4nipe" 2026/01/02 03:52:04 DEBUG : Using config file from "/home/rclone/.rclone.conf" 2026/01/02 03:52:04 DEBUG : Creating backend with remote "/tmp/rclone4292539765" === RUN TestCopyURL run.go:185: Remote "S3 bucket rclone-test-bomevoq4nipe", Local "Local file system at /tmp/rclone4292539765", Modify Window "1ns" 2026/01/02 03:52:05 INFO : S3 bucket rclone-test-bomevoq4nipe: Bucket "rclone-test-bomevoq4nipe" created with ACL "" 2026/01/02 03:52:05 DEBUG : filename.txt: File name found in url 2026/01/02 03:52:05 DEBUG : headerfilename.txt: filename found in Content-Disposition header. --- PASS: TestCopyURL (2.64s) === RUN TestMoveFileWithIgnoreExisting run.go:185: Remote "S3 bucket rclone-test-bomevoq4nipe", Local "Local file system at /tmp/rclone4292539765", Modify Window "1ns" 2026/01/02 03:52:06 DEBUG : file1: Need to transfer - File not found at Destination 2026/01/02 03:52:07 DEBUG : file1: size = 14 OK 2026/01/02 03:52:07 DEBUG : file1: md5 = 0ef726ce9b1a7692357ff70dd321d595 OK 2026/01/02 03:52:07 INFO : file1: Copied (new) 2026/01/02 03:52:07 INFO : file1: Deleted 2026/01/02 03:52:07 DEBUG : file1: Destination exists, skipping 2026/01/02 03:52:07 DEBUG : file1: Not removing source file as destination file exists and --ignore-existing is set --- PASS: TestMoveFileWithIgnoreExisting (0.73s) === RUN TestMoveFileBackupDir run.go:185: Remote "S3 bucket rclone-test-bomevoq4nipe", Local "Local file system at /tmp/rclone4292539765", Modify Window "1ns" 2026/01/02 03:52:08 DEBUG : Creating backend with remote "TestS3R2:rclone-test-bomevoq4nipe/backup" 2026/01/02 03:52:08 DEBUG : dst/file1: size = 14 (Local file system at /tmp/rclone4292539765) 2026/01/02 03:52:08 DEBUG : dst/file1: size = 18 (S3 bucket rclone-test-bomevoq4nipe) 2026/01/02 03:52:08 DEBUG : dst/file1: Sizes differ 2026/01/02 03:52:08 DEBUG : dst/file1: size = 18 OK 2026/01/02 03:52:08 DEBUG : dst/file1: md5 = 05164b153084ba910184c26e561a7c18 OK 2026/01/02 03:52:08 INFO : dst/file1: Copied (server-side copy) 2026/01/02 03:52:08 INFO : dst/file1: Deleted 2026/01/02 03:52:08 DEBUG : dst/file1: size = 14 OK 2026/01/02 03:52:08 DEBUG : dst/file1: md5 = 0ef726ce9b1a7692357ff70dd321d595 OK 2026/01/02 03:52:08 INFO : dst/file1: Copied (new) 2026/01/02 03:52:08 INFO : dst/file1: Deleted --- PASS: TestMoveFileBackupDir (1.79s) === RUN TestDirMove run.go:185: Remote "S3 bucket rclone-test-bomevoq4nipe", Local "Local file system at /tmp/rclone4292539765", Modify Window "1ns" 2026/01/02 03:52:10 INFO : A1/B2: Making directory 2026/01/02 03:52:10 INFO : A1/B1/C3: Making directory fstest.go:250: Filtering empty directory "A1/B2" fstest.go:250: Filtering empty directory "A1/B1/C3" 2026/01/02 03:52:11 DEBUG : A2/B1/C1/four: size = 4 OK 2026/01/02 03:52:11 DEBUG : A1/B1/C1/four: md5 = 8cbad96aced40b3838dd9f07f6ef5772 OK 2026/01/02 03:52:11 INFO : A1/B1/C1/four: Copied (server-side copy) to: A2/B1/C1/four 2026/01/02 03:52:11 DEBUG : A2/one: size = 3 OK 2026/01/02 03:52:11 DEBUG : A1/one: md5 = f97c5d29941bfb1b2fdab0874906ab82 OK 2026/01/02 03:52:11 INFO : A1/one: 
=== RUN   TestDirMove
run.go:185: Remote "S3 bucket rclone-test-bomevoq4nipe", Local "Local file system at /tmp/rclone4292539765", Modify Window "1ns"
2026/01/02 03:52:10 INFO : A1/B2: Making directory
2026/01/02 03:52:10 INFO : A1/B1/C3: Making directory
fstest.go:250: Filtering empty directory "A1/B2"
fstest.go:250: Filtering empty directory "A1/B1/C3"
2026/01/02 03:52:11 DEBUG : A2/B1/C1/four: size = 4 OK
2026/01/02 03:52:11 DEBUG : A1/B1/C1/four: md5 = 8cbad96aced40b3838dd9f07f6ef5772 OK
2026/01/02 03:52:11 INFO : A1/B1/C1/four: Copied (server-side copy) to: A2/B1/C1/four
2026/01/02 03:52:11 DEBUG : A2/one: size = 3 OK
2026/01/02 03:52:11 DEBUG : A1/one: md5 = f97c5d29941bfb1b2fdab0874906ab82 OK
2026/01/02 03:52:11 INFO : A1/one: Copied (server-side copy) to: A2/one
2026/01/02 03:52:11 DEBUG : A2/B1/three: size = 5 OK
2026/01/02 03:52:11 DEBUG : A1/B1/three: md5 = 35d6d33467aae9a2e3dccb4b6b027878 OK
2026/01/02 03:52:11 INFO : A1/B1/three: Copied (server-side copy) to: A2/B1/three
2026/01/02 03:52:11 DEBUG : A2/two: size = 3 OK
2026/01/02 03:52:11 DEBUG : A1/two: md5 = b8a9f715dbb64fd5c56e7783c6820a61 OK
2026/01/02 03:52:11 INFO : A1/two: Copied (server-side copy) to: A2/two
2026/01/02 03:52:11 INFO : A1/one: Deleted
2026/01/02 03:52:11 INFO : A1/B1/three: Deleted
2026/01/02 03:52:11 INFO : A1/B1/C1/four: Deleted
2026/01/02 03:52:11 INFO : A1/two: Deleted
2026/01/02 03:52:11 DEBUG : A2/B1/C2/five: size = 4 OK
2026/01/02 03:52:11 DEBUG : A1/B1/C2/five: md5 = 30056e1cab7a61d256fc8edd970d14f5 OK
2026/01/02 03:52:11 INFO : A1/B1/C2/five: Copied (server-side copy) to: A2/B1/C2/five
2026/01/02 03:52:11 INFO : A1/B1/C2/five: Deleted
fstest.go:250: Filtering empty directory "A2/B2"
fstest.go:250: Filtering empty directory "A2/B1/C3"
2026/01/02 03:52:12 DEBUG : A3/B1/three: size = 5 OK
2026/01/02 03:52:12 DEBUG : A2/B1/three: md5 = 35d6d33467aae9a2e3dccb4b6b027878 OK
2026/01/02 03:52:12 INFO : A2/B1/three: Copied (server-side copy) to: A3/B1/three
2026/01/02 03:52:12 DEBUG : A3/two: size = 3 OK
2026/01/02 03:52:12 DEBUG : A2/two: md5 = b8a9f715dbb64fd5c56e7783c6820a61 OK
2026/01/02 03:52:12 INFO : A2/two: Copied (server-side copy) to: A3/two
2026/01/02 03:52:12 DEBUG : A3/B1/C1/four: size = 4 OK
2026/01/02 03:52:12 DEBUG : A2/B1/C1/four: md5 = 8cbad96aced40b3838dd9f07f6ef5772 OK
2026/01/02 03:52:12 INFO : A2/B1/C1/four: Copied (server-side copy) to: A3/B1/C1/four
2026/01/02 03:52:12 DEBUG : A3/B1/C2/five: size = 4 OK
2026/01/02 03:52:12 DEBUG : A2/B1/C2/five: md5 = 30056e1cab7a61d256fc8edd970d14f5 OK
2026/01/02 03:52:12 INFO : A2/B1/C2/five: Copied (server-side copy) to: A3/B1/C2/five
2026/01/02 03:52:12 DEBUG : A3/one: size = 3 OK
2026/01/02 03:52:12 DEBUG : A2/one: md5 = f97c5d29941bfb1b2fdab0874906ab82 OK
2026/01/02 03:52:12 INFO : A2/one: Copied (server-side copy) to: A3/one
2026/01/02 03:52:12 INFO : A2/B1/three: Deleted
2026/01/02 03:52:12 INFO : A2/two: Deleted
2026/01/02 03:52:12 INFO : A2/B1/C1/four: Deleted
2026/01/02 03:52:12 INFO : A2/one: Deleted
2026/01/02 03:52:12 INFO : A2/B1/C2/five: Deleted
fstest.go:250: Filtering empty directory "A3/B2"
fstest.go:250: Filtering empty directory "A3/B1/C3"
2026/01/02 03:52:12 INFO : S3 bucket rclone-test-bomevoq4nipe: Can't DirMove - falling back to file moves: can't move directory - incompatible remotes
2026/01/02 03:52:12 DEBUG : A4/two: size = 3 OK
2026/01/02 03:52:12 DEBUG : A3/two: md5 = b8a9f715dbb64fd5c56e7783c6820a61 OK
2026/01/02 03:52:12 INFO : A3/two: Copied (server-side copy) to: A4/two
2026/01/02 03:52:13 DEBUG : A4/B1/C2/five: size = 4 OK
2026/01/02 03:52:13 DEBUG : A3/B1/C2/five: md5 = 30056e1cab7a61d256fc8edd970d14f5 OK
2026/01/02 03:52:13 INFO : A3/B1/C2/five: Copied (server-side copy) to: A4/B1/C2/five
2026/01/02 03:52:13 DEBUG : A4/B1/C1/four: size = 4 OK
2026/01/02 03:52:13 DEBUG : A3/B1/C1/four: md5 = 8cbad96aced40b3838dd9f07f6ef5772 OK
2026/01/02 03:52:13 INFO : A3/B1/C1/four: Copied (server-side copy) to: A4/B1/C1/four
2026/01/02 03:52:13 DEBUG : A4/B1/three: size = 5 OK
2026/01/02 03:52:13 DEBUG : A3/B1/three: md5 = 35d6d33467aae9a2e3dccb4b6b027878 OK
2026/01/02 03:52:13 INFO : A3/B1/three: Copied (server-side copy) to: A4/B1/three
2026/01/02 03:52:13 DEBUG : A4/one: size = 3 OK
2026/01/02 03:52:13 DEBUG : A3/one: md5 = f97c5d29941bfb1b2fdab0874906ab82 OK
2026/01/02 03:52:13 INFO : A3/one: Copied (server-side copy) to: A4/one
2026/01/02 03:52:13 INFO : A3/two: Deleted
2026/01/02 03:52:13 INFO : A3/B1/C2/five: Deleted
2026/01/02 03:52:13 INFO : A3/B1/C1/four: Deleted
2026/01/02 03:52:13 INFO : A3/one: Deleted
2026/01/02 03:52:13 INFO : A3/B1/three: Deleted
fstest.go:250: Filtering empty directory "A4/B2"
fstest.go:250: Filtering empty directory "A4/B1/C3"
--- PASS: TestDirMove (4.60s)
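TestDirMove above shows the fallback path: this backend reports "Can't DirMove - falling back to file moves", so each object is server-side copied and the source deleted instead of the directory being renamed in one call. A minimal sketch of that optional-interface pattern, using hypothetical Mover/DirMover interfaces rather than rclone's real feature interfaces:

    package main

    import "fmt"

    // Illustrative-only interfaces; rclone's real optional features are richer.
    type Mover interface{ Move(src, dst string) error }
    type DirMover interface{ DirMove(src, dst string) error }

    // moveDir renames the directory in one call when the backend supports it,
    // otherwise it falls back to moving the listed files one by one.
    func moveDir(fs Mover, srcDir, dstDir string, files []string) error {
        if dm, ok := fs.(DirMover); ok {
            return dm.DirMove(srcDir, dstDir)
        }
        fmt.Println("Can't DirMove - falling back to file moves")
        for _, name := range files {
            if err := fs.Move(srcDir+"/"+name, dstDir+"/"+name); err != nil {
                return err
            }
        }
        return nil
    }

    // toyFs implements only Mover, so moveDir takes the fallback path.
    type toyFs struct{}

    func (toyFs) Move(src, dst string) error {
        fmt.Println("move", src, "->", dst)
        return nil
    }

    func main() {
        _ = moveDir(toyFs{}, "A3", "A4", []string{"one", "two", "B1/three"})
    }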
=== RUN   TestRcat
=== RUN   TestRcat/withChecksum=false,ignoreChecksum=false
run.go:185: Remote "S3 bucket rclone-test-bomevoq4nipe", Local "Local file system at /tmp/rclone4292539765", Modify Window "1ns"
2026/01/02 03:52:14 DEBUG : S3 bucket rclone-test-bomevoq4nipe: File to upload is small (34 bytes), uploading instead of streaming
2026/01/02 03:52:14 DEBUG : no_checksum_small_file_from_pipe: size = 34 OK
2026/01/02 03:52:14 DEBUG : no_checksum_small_file_from_pipe: md5 = 4c762bc6ec18f21b23fcae426c7863b5 OK
2026/01/02 03:52:14 DEBUG : no_checksum_small_file_from_pipe: Size and md5 of src and dst objects identical
2026/01/02 03:52:14 NOTICE: S3 bucket rclone-test-bomevoq4nipe: Streaming uploads using chunk size 5Mi will have maximum file size of 48.828Gi
2026/01/02 03:52:14 DEBUG : no_checksum_big_file_from_pipe: open chunk writer: started multipart upload: AKKANRnXLwiyGhbRSoJCsE88iIsOb68xcslLrYRTst0HdCIx98VUQHIr7aNf4U4CXWghCdJ3LJntTLTbhgclasLsFB0YH53T6HbQKi18WXtF0GzEvSkkWF-I9PlrYoMzYSE0Fj9Pd-z4zRccY7tuZkmusJD-DJjAiLhamcUJGOWcvY6FvKta09xLUQbjKJfg0wIJRf95TenDKIJytbQwiydJNqsvpRVUCS3g9RnvxVhcp7w2YQCF018s9_0LM9LjWzjRm1lkC05nkHoxgEnE9cxD54Y7W8wB7Vu6g3_EItuSj3D5EL9JoYlAdq_q8ubUDIYLOZ3JJYZvxSC0BN-awvY
2026/01/02 03:52:14 DEBUG : no_checksum_big_file_from_pipe: multipart upload: starting chunk 0 size 100.001Ki offset 0/off
2026/01/02 03:52:14 DEBUG : no_checksum_big_file_from_pipe: multipart upload wrote chunk 1 with 102401 bytes and etag "fffc7956ba9a7b58a63c01b6ce1ddc45"
2026/01/02 03:52:14 DEBUG : no_checksum_big_file_from_pipe: multipart upload "AKKANRnXLwiyGhbRSoJCsE88iIsOb68xcslLrYRTst0HdCIx98VUQHIr7aNf4U4CXWghCdJ3LJntTLTbhgclasLsFB0YH53T6HbQKi18WXtF0GzEvSkkWF-I9PlrYoMzYSE0Fj9Pd-z4zRccY7tuZkmusJD-DJjAiLhamcUJGOWcvY6FvKta09xLUQbjKJfg0wIJRf95TenDKIJytbQwiydJNqsvpRVUCS3g9RnvxVhcp7w2YQCF018s9_0LM9LjWzjRm1lkC05nkHoxgEnE9cxD54Y7W8wB7Vu6g3_EItuSj3D5EL9JoYlAdq_q8ubUDIYLOZ3JJYZvxSC0BN-awvY" finished
2026/01/02 03:52:14 DEBUG : no_checksum_big_file_from_pipe: size = 102401 OK
2026/01/02 03:52:14 DEBUG : no_checksum_big_file_from_pipe: Dst hash empty - aborting Src hash check
2026/01/02 03:52:14 DEBUG : no_checksum_big_file_from_pipe: Size of src and dst objects identical
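The NOTICE in the subtest above, "chunk size 5Mi will have maximum file size of 48.828Gi", matches the S3 limit of 10,000 parts per multipart upload multiplied out: 5 MiB × 10,000 = 48.828 GiB. A quick check of the arithmetic (the part limit is the only assumption here):

    package main

    import "fmt"

    func main() {
        const (
            chunkSize = 5 * 1024 * 1024 // 5Mi streaming chunk size from the NOTICE
            maxParts  = 10000           // S3 cap on parts in one multipart upload
        )
        maxBytes := int64(chunkSize) * maxParts
        fmt.Printf("max streamed file size: %.3fGi\n", float64(maxBytes)/(1<<30)) // 48.828Gi
    }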
=== RUN   TestRcat/withChecksum=true,ignoreChecksum=false
run.go:185: Remote "S3 bucket rclone-test-bomevoq4nipe", Local "Local file system at /tmp/rclone4292539765", Modify Window "1ns"
2026/01/02 03:52:15 DEBUG : S3 bucket rclone-test-bomevoq4nipe: File to upload is small (34 bytes), uploading instead of streaming
2026/01/02 03:52:15 DEBUG : with_checksum_small_file_from_pipe: size = 34 OK
2026/01/02 03:52:15 DEBUG : with_checksum_small_file_from_pipe: md5 = 4c762bc6ec18f21b23fcae426c7863b5 OK
2026/01/02 03:52:15 DEBUG : with_checksum_small_file_from_pipe: Size and md5 of src and dst objects identical
2026/01/02 03:52:15 DEBUG : with_checksum_big_file_from_pipe: open chunk writer: started multipart upload: AASa3-MPtKGfZU1eYRl-5plIBVlL_Abfr6roXmGg1tRoR2fKVh5IpJeAlMKk3dVwZ50xGkEIMS-MW6hYwGUB6sl50gTeGE79rrUMWldMJqxRWbt09ogTIR1kw_FFtWbhZ1vVRm5F3b1ukkggmbUAAQCckOZtWMrYtjR1rSPwAAAtewYUWWMTrZP6j_OYBs0Fc9ifkJrjsadkdy2brflG-HdwjsMrRTfr3SY1-9zeta9Cu2eqNV2K4N6T87zSKfayq6ump-5H3cyFMonQ8jbHtLbpgi-SF5N4QJW8MsTByuf2IzVnN2QYuM5Zyz4nDpiN0hFEwokDhFVTxCYPOWjm5xM
2026/01/02 03:52:15 DEBUG : with_checksum_big_file_from_pipe: multipart upload: starting chunk 0 size 100.001Ki offset 0/off
2026/01/02 03:52:15 DEBUG : with_checksum_big_file_from_pipe: multipart upload wrote chunk 1 with 102401 bytes and etag "fffc7956ba9a7b58a63c01b6ce1ddc45"
2026/01/02 03:52:15 DEBUG : with_checksum_big_file_from_pipe: multipart upload "AASa3-MPtKGfZU1eYRl-5plIBVlL_Abfr6roXmGg1tRoR2fKVh5IpJeAlMKk3dVwZ50xGkEIMS-MW6hYwGUB6sl50gTeGE79rrUMWldMJqxRWbt09ogTIR1kw_FFtWbhZ1vVRm5F3b1ukkggmbUAAQCckOZtWMrYtjR1rSPwAAAtewYUWWMTrZP6j_OYBs0Fc9ifkJrjsadkdy2brflG-HdwjsMrRTfr3SY1-9zeta9Cu2eqNV2K4N6T87zSKfayq6ump-5H3cyFMonQ8jbHtLbpgi-SF5N4QJW8MsTByuf2IzVnN2QYuM5Zyz4nDpiN0hFEwokDhFVTxCYPOWjm5xM" finished
2026/01/02 03:52:15 DEBUG : with_checksum_big_file_from_pipe: size = 102401 OK
2026/01/02 03:52:15 DEBUG : with_checksum_big_file_from_pipe: Dst hash empty - aborting Src hash check
2026/01/02 03:52:15 DEBUG : with_checksum_big_file_from_pipe: Size of src and dst objects identical
=== RUN   TestRcat/withChecksum=false,ignoreChecksum=true
run.go:185: Remote "S3 bucket rclone-test-bomevoq4nipe", Local "Local file system at /tmp/rclone4292539765", Modify Window "1ns"
2026/01/02 03:52:16 DEBUG : S3 bucket rclone-test-bomevoq4nipe: File to upload is small (34 bytes), uploading instead of streaming
2026/01/02 03:52:16 DEBUG : ignore_checksum_small_file_from_pipe: size = 34 OK
2026/01/02 03:52:16 DEBUG : ignore_checksum_small_file_from_pipe: Size and modification time the same (differ by 0s, within tolerance 1ns)
2026/01/02 03:52:16 DEBUG : ignore_checksum_big_file_from_pipe: open chunk writer: started multipart upload: AKKfDH_Jqpg9d2NUORhMx8ZX7qZ94ftc17HVf3Fj53u-lHdEQ4ftqssp_BHhA-S7xQCGhmdIs38b9jjzaY5UmY3DR0Iz6W9aJbIrDf00_lshXvLpalDYqBFpQrxBGYAgiPhyLMdxE_mMC3m5YWztS3uB_8ANGchWTHgtqP87OCd50PZuBUgoSr9Am9Dxssb6tPB0EFfovREy3F-cBYkFJmO2nHz3hOBztiUjUydZYyc-Ek-pNUJkpqjPrkktvOwXNhyr0rtg05O-GXBjWQdd_MP0JOYZQeu640ndq6IXinfOiZh5SbUpnFdGhmxAr_6wsWTSWtuKugMb9dvaQpYB5us
2026/01/02 03:52:16 DEBUG : ignore_checksum_big_file_from_pipe: multipart upload: starting chunk 0 size 100.001Ki offset 0/off
2026/01/02 03:52:17 DEBUG : ignore_checksum_big_file_from_pipe: multipart upload wrote chunk 1 with 102401 bytes and etag "fffc7956ba9a7b58a63c01b6ce1ddc45"
2026/01/02 03:52:17 DEBUG : ignore_checksum_big_file_from_pipe: multipart upload "AKKfDH_Jqpg9d2NUORhMx8ZX7qZ94ftc17HVf3Fj53u-lHdEQ4ftqssp_BHhA-S7xQCGhmdIs38b9jjzaY5UmY3DR0Iz6W9aJbIrDf00_lshXvLpalDYqBFpQrxBGYAgiPhyLMdxE_mMC3m5YWztS3uB_8ANGchWTHgtqP87OCd50PZuBUgoSr9Am9Dxssb6tPB0EFfovREy3F-cBYkFJmO2nHz3hOBztiUjUydZYyc-Ek-pNUJkpqjPrkktvOwXNhyr0rtg05O-GXBjWQdd_MP0JOYZQeu640ndq6IXinfOiZh5SbUpnFdGhmxAr_6wsWTSWtuKugMb9dvaQpYB5us" finished
2026/01/02 03:52:17 DEBUG : ignore_checksum_big_file_from_pipe: size = 102401 OK
2026/01/02 03:52:17 DEBUG : ignore_checksum_big_file_from_pipe: Size and modification time the same (differ by 0s, within tolerance 1ns)
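Each "wrote chunk ... and etag" line above pairs an uploaded part with the ETag the server returned for it; for a plainly uploaded S3 part that ETag is normally the hex MD5 of the part's bytes, which is what lets the writer sanity-check each chunk. A standard-library sketch of that per-part digest (the payload below is a stand-in, not the test's 102401-byte part):

    package main

    import (
        "crypto/md5"
        "encoding/hex"
        "fmt"
    )

    func main() {
        // Stand-in part data; the test streams 102401-byte parts.
        part := []byte("example part payload")

        // Hex MD5 of the part, the value an S3 part ETag is normally compared against.
        sum := md5.Sum(part)
        fmt.Println("local part md5:", hex.EncodeToString(sum[:]))
    }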
=== RUN   TestRcat/withChecksum=true,ignoreChecksum=true
run.go:185: Remote "S3 bucket rclone-test-bomevoq4nipe", Local "Local file system at /tmp/rclone4292539765", Modify Window "1ns"
2026/01/02 03:52:17 DEBUG : S3 bucket rclone-test-bomevoq4nipe: File to upload is small (34 bytes), uploading instead of streaming
2026/01/02 03:52:17 DEBUG : ignore_checksum_small_file_from_pipe: size = 34 OK
2026/01/02 03:52:17 DEBUG : ignore_checksum_small_file_from_pipe: Src hash empty - aborting Dst hash check
2026/01/02 03:52:17 DEBUG : ignore_checksum_small_file_from_pipe: Size of src and dst objects identical
2026/01/02 03:52:18 DEBUG : ignore_checksum_big_file_from_pipe: open chunk writer: started multipart upload: APA9F9Yxk23AD0ZIv4fDLpeFLIL5mkC2rzGU2th_yJT98BCPldwNYSjMOwWOADrDWLwTx6XdksLQQWswXwEF1H96Wa1WnoqDDmu7MZWG3XRRRcvciotSfFROUFJ3MUfE9jO9W_pXeWox41T1bIoON7M9qhTBu8rkbrv0lOP5rVEPGnRXPCd40zJY0Xjehgao5s2zyYcgVtnTJQrLQUaHEfTJq3A0suwcipe3zjdyNldmBwjl7Z1PYNzzmc67wqhYiTWArROvGzl3q02l53yqfk9tBuH8wp2BQhPR8-3siJiKJJ7Z3cOrFjB5UMYXFHlrWGnU5SLswj78_wxxNqC4Bdw
2026/01/02 03:52:18 DEBUG : ignore_checksum_big_file_from_pipe: multipart upload: starting chunk 0 size 100.001Ki offset 0/off
2026/01/02 03:52:18 DEBUG : ignore_checksum_big_file_from_pipe: multipart upload wrote chunk 1 with 102401 bytes and etag "fffc7956ba9a7b58a63c01b6ce1ddc45"
2026/01/02 03:52:18 DEBUG : ignore_checksum_big_file_from_pipe: multipart upload "APA9F9Yxk23AD0ZIv4fDLpeFLIL5mkC2rzGU2th_yJT98BCPldwNYSjMOwWOADrDWLwTx6XdksLQQWswXwEF1H96Wa1WnoqDDmu7MZWG3XRRRcvciotSfFROUFJ3MUfE9jO9W_pXeWox41T1bIoON7M9qhTBu8rkbrv0lOP5rVEPGnRXPCd40zJY0Xjehgao5s2zyYcgVtnTJQrLQUaHEfTJq3A0suwcipe3zjdyNldmBwjl7Z1PYNzzmc67wqhYiTWArROvGzl3q02l53yqfk9tBuH8wp2BQhPR8-3siJiKJJ7Z3cOrFjB5UMYXFHlrWGnU5SLswj78_wxxNqC4Bdw" finished
2026/01/02 03:52:18 DEBUG : ignore_checksum_big_file_from_pipe: size = 102401 OK
2026/01/02 03:52:18 DEBUG : ignore_checksum_big_file_from_pipe: Dst hash empty - aborting Src hash check
2026/01/02 03:52:18 DEBUG : ignore_checksum_big_file_from_pipe: Src hash empty - aborting Dst hash check
2026/01/02 03:52:18 DEBUG : ignore_checksum_big_file_from_pipe: Size of src and dst objects identical
--- PASS: TestRcat (4.81s)
--- PASS: TestRcat/withChecksum=false,ignoreChecksum=false (1.21s)
--- PASS: TestRcat/withChecksum=true,ignoreChecksum=false (1.22s)
--- PASS: TestRcat/withChecksum=false,ignoreChecksum=true (1.19s)
--- PASS: TestRcat/withChecksum=true,ignoreChecksum=true (1.19s)
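Rcat streams data of unknown length from a pipe, which is why the big-file cases above are cut into fixed-size chunks ("starting chunk 0 size 100.001Ki") and uploaded part by part. A minimal, rclone-agnostic sketch of that read loop:

    package main

    import (
        "bytes"
        "errors"
        "fmt"
        "io"
    )

    // uploadInChunks reads r to EOF in chunkSize pieces; each full or final short
    // read corresponds to one multipart part like the chunks logged above.
    func uploadInChunks(r io.Reader, chunkSize int) error {
        buf := make([]byte, chunkSize)
        for part := 1; ; part++ {
            n, err := io.ReadFull(r, buf)
            if n > 0 {
                fmt.Printf("wrote chunk %d with %d bytes\n", part, n)
            }
            if errors.Is(err, io.EOF) || errors.Is(err, io.ErrUnexpectedEOF) {
                return nil // stream exhausted
            }
            if err != nil {
                return err
            }
        }
    }

    func main() {
        // 102401 bytes, the size used by the test; the test's 100.001Ki chunk
        // size means it fits in a single part there.
        data := bytes.Repeat([]byte("x"), 102401)
        _ = uploadInChunks(bytes.NewReader(data), 100*1024)
    }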
=== RUN   TestRcatMetadata
run.go:185: Remote "S3 bucket rclone-test-bomevoq4nipe", Local "Local file system at /tmp/rclone4292539765", Modify Window "1ns"
=== RUN   TestRcatMetadata/ViaDisk
2026/01/02 03:52:19 DEBUG : rcat_metadata_uploadcutoff0: open chunk writer: started multipart upload: AFSyTa_B9GTNhDWtV5ihiTA_uOrGU1uW_zVd6Jwq4DzSplgRRnhZ9tmdCkghCem7WiPQGskdJFXkLwbIxUXRTotqPZznvKgUDYizZfy3_c6-SvmQ34trN-yB35fP5xz8w0L5n5QwSUkXvHNhaRtxdmi1U3wih6uJbrQ_yCWpap0mpp-IqG9WfevlDXuPJp4C92mjOroOkkqPZM3bf1aNzjn2ZwYhJn_WVyEr4-uf6iEuTYmCUNTN4iaTu_ye_R_g4baS3iZaWZj3SHFp_mFDBM2PpcHivsvefUH_VZ2q2OUaYkb62kHkT265xZBTUnWhCJPO-8pVJ_w62kHiy34Td2k
2026/01/02 03:52:19 DEBUG : rcat_metadata_uploadcutoff0: multipart upload: starting chunk 0 size 63 offset 0/off
2026/01/02 03:52:19 DEBUG : rcat_metadata_uploadcutoff0: multipart upload wrote chunk 1 with 63 bytes and etag "51ca8560f35b9b87370a65d72356b86c"
2026/01/02 03:52:19 DEBUG : rcat_metadata_uploadcutoff0: multipart upload "AFSyTa_B9GTNhDWtV5ihiTA_uOrGU1uW_zVd6Jwq4DzSplgRRnhZ9tmdCkghCem7WiPQGskdJFXkLwbIxUXRTotqPZznvKgUDYizZfy3_c6-SvmQ34trN-yB35fP5xz8w0L5n5QwSUkXvHNhaRtxdmi1U3wih6uJbrQ_yCWpap0mpp-IqG9WfevlDXuPJp4C92mjOroOkkqPZM3bf1aNzjn2ZwYhJn_WVyEr4-uf6iEuTYmCUNTN4iaTu_ye_R_g4baS3iZaWZj3SHFp_mFDBM2PpcHivsvefUH_VZ2q2OUaYkb62kHkT265xZBTUnWhCJPO-8pVJ_w62kHiy34Td2k" finished
2026/01/02 03:52:19 DEBUG : rcat_metadata_uploadcutoff0: size = 63 OK
2026/01/02 03:52:19 DEBUG : rcat_metadata_uploadcutoff0: Dst hash empty - aborting Src hash check
2026/01/02 03:52:19 DEBUG : rcat_metadata_uploadcutoff0: Size of src and dst objects identical
--- PASS: TestRcatMetadata (0.85s)
--- PASS: TestRcatMetadata/ViaDisk (0.74s)
=== RUN   TestRcatSize
run.go:185: Remote "S3 bucket rclone-test-bomevoq4nipe", Local "Local file system at /tmp/rclone4292539765", Modify Window "1ns"
2026/01/02 03:52:19 DEBUG : S3 bucket rclone-test-bomevoq4nipe: File to upload is small (60 bytes), uploading instead of streaming
2026/01/02 03:52:20 DEBUG : potato2: size = 60 OK
2026/01/02 03:52:20 DEBUG : potato2: md5 = d6548b156ea68a4e003e786df99eee76 OK
2026/01/02 03:52:20 DEBUG : potato2: Size and md5 of src and dst objects identical
--- PASS: TestRcatSize (0.89s)
PASS
2026/01/02 03:52:20 DEBUG : S3 bucket rclone-test-bomevoq4nipe: Purge remote
2026/01/02 03:52:20 DEBUG : S3 bucket rclone-test-bomevoq4nipe: bucket is versioned: false
2026/01/02 03:52:20 DEBUG : Waiting for deletions to finish
2026/01/02 03:52:21 INFO : S3 bucket rclone-test-bomevoq4nipe: Bucket "rclone-test-bomevoq4nipe" deleted
"./operations.test -test.v -test.timeout 1h0m0s -remote TestS3R2: -verbose -test.run '^(TestCopyURL|TestDirMove|TestMoveFileBackupDir|TestMoveFileWithIgnoreExisting|TestRcatSize)$|^TestRcat$/^(withChecksum=false,ignoreChecksum=false|withChecksum=false,ignoreChecksum=true|withChecksum=true,ignoreChecksum=false|withChecksum=true,ignoreChecksum=true)$|^TestRcatMetadata$/^ViaDisk$'" - Finished OK in 17.037644905s (try 2/5)