"./operations.test -test.v -test.timeout 1h0m0s -remote TestS3GCS: -verbose -fast-list" - Starting (try 1/5) 2026/02/15 01:21:44 DEBUG : Creating backend with remote "TestS3GCS:rclone-test-daxariv2nomu" 2026/02/15 01:21:44 DEBUG : Using config file from "/home/rclone/.rclone.conf" 2026/02/15 01:21:44 DEBUG : Creating backend with remote "/tmp/rclone2360491256" === RUN TestDoMultiThreadCopy --- PASS: TestDoMultiThreadCopy (0.00s) === RUN TestMultithreadCalculateNumChunks === RUN TestMultithreadCalculateNumChunks/{size:1_chunkSize:65536_wantNumChunks:1} === RUN TestMultithreadCalculateNumChunks/{size:1048576_chunkSize:1_wantNumChunks:1048576} === RUN TestMultithreadCalculateNumChunks/{size:1048576_chunkSize:2_wantNumChunks:524288} === RUN TestMultithreadCalculateNumChunks/{size:1048577_chunkSize:2_wantNumChunks:524289} === RUN TestMultithreadCalculateNumChunks/{size:1048575_chunkSize:2_wantNumChunks:524288} --- PASS: TestMultithreadCalculateNumChunks (0.00s) --- PASS: TestMultithreadCalculateNumChunks/{size:1_chunkSize:65536_wantNumChunks:1} (0.00s) --- PASS: TestMultithreadCalculateNumChunks/{size:1048576_chunkSize:1_wantNumChunks:1048576} (0.00s) --- PASS: TestMultithreadCalculateNumChunks/{size:1048576_chunkSize:2_wantNumChunks:524288} (0.00s) --- PASS: TestMultithreadCalculateNumChunks/{size:1048577_chunkSize:2_wantNumChunks:524289} (0.00s) --- PASS: TestMultithreadCalculateNumChunks/{size:1048575_chunkSize:2_wantNumChunks:524288} (0.00s) === RUN TestMultithreadCopy run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:21:45 INFO : S3 bucket rclone-test-daxariv2nomu: Bucket "rclone-test-daxariv2nomu" created with ACL "" 2026/02/15 01:21:46 DEBUG : chunksize-probe: open chunk writer: started multipart upload: ABPnzm4pSiPtCFeyOsLgyByL-bS3Ro3EvCArGdTfLv-Hw5uN33pA5FzLo-uOeYt2x-t0wII 2026/02/15 01:21:46 DEBUG : chunksize-probe: multipart upload "ABPnzm4pSiPtCFeyOsLgyByL-bS3Ro3EvCArGdTfLv-Hw5uN33pA5FzLo-uOeYt2x-t0wII" aborted === RUN TestMultithreadCopy/upload=false,size=10485759,streams=2 2026/02/15 01:21:49 DEBUG : test-multithread-copy-false-10485759-2: multi-thread copy: disabling buffering because destination uses OpenWriterAt 2026/02/15 01:21:49 DEBUG : test-multithread-copy-false-10485759-2: multi-thread copy: write buffer set to 131072 2026/02/15 01:21:49 DEBUG : test-multithread-copy-false-10485759-2: multi-thread copy: chunk size 64Mi was bigger than source file size 10.000Mi 2026/02/15 01:21:49 DEBUG : test-multithread-copy-false-10485759-2: multi-thread copy: using backend concurrency of 4 instead of --multi-thread-streams 2 2026/02/15 01:21:49 DEBUG : test-multithread-copy-false-10485759-2: multi-thread copy: number of streams 4 was bigger than number of chunks 1 2026/02/15 01:21:49 DEBUG : test-multithread-copy-false-10485759-2: Starting multi-thread copy with 1 chunks of size 10.000Mi with 1 parallel streams 2026/02/15 01:21:49 DEBUG : test-multithread-copy-false-10485759-2: multi-thread copy: chunk 1/1 (0-10485759) size 10.000Mi starting 2026/02/15 01:21:50 DEBUG : test-multithread-copy-false-10485759-2: writing chunk 0 2026/02/15 01:21:51 DEBUG : test-multithread-copy-false-10485759-2: multi-thread copy: chunk 1/1 (0-10485759) size 10.000Mi finished 2026/02/15 01:21:51 DEBUG : test-multithread-copy-false-10485759-2: Finished multi-thread copy with 1 parts of size 10.000Mi === RUN TestMultithreadCopy/upload=false,size=10485760,streams=2 2026/02/15 01:21:56 DEBUG : 
test-multithread-copy-false-10485760-2: multi-thread copy: disabling buffering because destination uses OpenWriterAt 2026/02/15 01:21:56 DEBUG : test-multithread-copy-false-10485760-2: multi-thread copy: write buffer set to 131072 2026/02/15 01:21:56 DEBUG : test-multithread-copy-false-10485760-2: multi-thread copy: chunk size 64Mi was bigger than source file size 10Mi 2026/02/15 01:21:56 DEBUG : test-multithread-copy-false-10485760-2: multi-thread copy: using backend concurrency of 4 instead of --multi-thread-streams 2 2026/02/15 01:21:56 DEBUG : test-multithread-copy-false-10485760-2: multi-thread copy: number of streams 4 was bigger than number of chunks 1 2026/02/15 01:21:56 DEBUG : test-multithread-copy-false-10485760-2: Starting multi-thread copy with 1 chunks of size 10Mi with 1 parallel streams 2026/02/15 01:21:56 DEBUG : test-multithread-copy-false-10485760-2: multi-thread copy: chunk 1/1 (0-10485760) size 10Mi starting 2026/02/15 01:21:56 DEBUG : test-multithread-copy-false-10485760-2: writing chunk 0 2026/02/15 01:21:57 DEBUG : test-multithread-copy-false-10485760-2: multi-thread copy: chunk 1/1 (0-10485760) size 10Mi finished 2026/02/15 01:21:57 DEBUG : test-multithread-copy-false-10485760-2: Finished multi-thread copy with 1 parts of size 10Mi === RUN TestMultithreadCopy/upload=false,size=10485761,streams=2 2026/02/15 01:22:03 DEBUG : test-multithread-copy-false-10485761-2: multi-thread copy: disabling buffering because destination uses OpenWriterAt 2026/02/15 01:22:03 DEBUG : test-multithread-copy-false-10485761-2: multi-thread copy: write buffer set to 131072 2026/02/15 01:22:03 DEBUG : test-multithread-copy-false-10485761-2: multi-thread copy: chunk size 64Mi was bigger than source file size 10.000Mi 2026/02/15 01:22:03 DEBUG : test-multithread-copy-false-10485761-2: multi-thread copy: using backend concurrency of 4 instead of --multi-thread-streams 2 2026/02/15 01:22:03 DEBUG : test-multithread-copy-false-10485761-2: multi-thread copy: number of streams 4 was bigger than number of chunks 1 2026/02/15 01:22:03 DEBUG : test-multithread-copy-false-10485761-2: Starting multi-thread copy with 1 chunks of size 10.000Mi with 1 parallel streams 2026/02/15 01:22:03 DEBUG : test-multithread-copy-false-10485761-2: multi-thread copy: chunk 1/1 (0-10485761) size 10.000Mi starting 2026/02/15 01:22:03 DEBUG : test-multithread-copy-false-10485761-2: writing chunk 0 2026/02/15 01:22:04 DEBUG : test-multithread-copy-false-10485761-2: multi-thread copy: chunk 1/1 (0-10485761) size 10.000Mi finished 2026/02/15 01:22:04 DEBUG : test-multithread-copy-false-10485761-2: Finished multi-thread copy with 1 parts of size 10.000Mi === RUN TestMultithreadCopy/upload=true,size=10485759,streams=2 2026/02/15 01:22:07 DEBUG : test-multithread-copy-true-10485759-2: multi-thread copy: disabling buffering because source is local disk 2026/02/15 01:22:07 DEBUG : test-multithread-copy-true-10485759-2: open chunk writer: started multipart upload: ABPnzm63xI9gkbN62ucDDXwZnYQjaiBoXUvG9KEn4TwWqWkSJ7EfY-mjzlVHhMn7aDDT4n8 2026/02/15 01:22:07 DEBUG : test-multithread-copy-true-10485759-2: multi-thread copy: using backend concurrency of 4 instead of --multi-thread-streams 2 2026/02/15 01:22:07 DEBUG : test-multithread-copy-true-10485759-2: multi-thread copy: number of streams 4 was bigger than number of chunks 2 2026/02/15 01:22:07 DEBUG : test-multithread-copy-true-10485759-2: Starting multi-thread copy with 2 chunks of size 5Mi with 2 parallel streams 2026/02/15 01:22:07 DEBUG : test-multithread-copy-true-10485759-2: 
multi-thread copy: chunk 2/2 (5242880-10485759) size 5.000Mi starting 2026/02/15 01:22:07 DEBUG : test-multithread-copy-true-10485759-2: multi-thread copy: chunk 1/2 (0-5242880) size 5Mi starting 2026/02/15 01:22:07 DEBUG : test-multithread-copy-true-10485759-2: Seek from 5242879 to 0 2026/02/15 01:22:07 DEBUG : test-multithread-copy-true-10485759-2: Seek from 5242880 to 0 2026/02/15 01:22:09 DEBUG : test-multithread-copy-true-10485759-2: multipart upload wrote chunk 1 with 5242880 bytes and etag "42d04de7728868d6a9e89a5baf99cbf2" 2026/02/15 01:22:09 DEBUG : test-multithread-copy-true-10485759-2: multi-thread copy: chunk 1/2 (0-5242880) size 5Mi finished 2026/02/15 01:22:09 DEBUG : test-multithread-copy-true-10485759-2: multipart upload wrote chunk 2 with 5242879 bytes and etag "7fae2a409128828ec51b5a7f7d668daa" 2026/02/15 01:22:09 DEBUG : test-multithread-copy-true-10485759-2: multi-thread copy: chunk 2/2 (5242880-10485759) size 5.000Mi finished 2026/02/15 01:22:10 DEBUG : test-multithread-copy-true-10485759-2: multipart upload "ABPnzm63xI9gkbN62ucDDXwZnYQjaiBoXUvG9KEn4TwWqWkSJ7EfY-mjzlVHhMn7aDDT4n8" finished 2026/02/15 01:22:10 DEBUG : test-multithread-copy-true-10485759-2: Finished multi-thread copy with 2 parts of size 5Mi === RUN TestMultithreadCopy/upload=true,size=10485760,streams=2 2026/02/15 01:22:13 DEBUG : test-multithread-copy-true-10485760-2: multi-thread copy: disabling buffering because source is local disk 2026/02/15 01:22:13 DEBUG : test-multithread-copy-true-10485760-2: open chunk writer: started multipart upload: ABPnzm54KN20J6kckDuyzPuKp-fUi-GqdMbjG4F0OQsGstICWGeCb27MUDvPws-D4st_NEM 2026/02/15 01:22:13 DEBUG : test-multithread-copy-true-10485760-2: multi-thread copy: using backend concurrency of 4 instead of --multi-thread-streams 2 2026/02/15 01:22:13 DEBUG : test-multithread-copy-true-10485760-2: multi-thread copy: number of streams 4 was bigger than number of chunks 2 2026/02/15 01:22:13 DEBUG : test-multithread-copy-true-10485760-2: Starting multi-thread copy with 2 chunks of size 5Mi with 2 parallel streams 2026/02/15 01:22:13 DEBUG : test-multithread-copy-true-10485760-2: multi-thread copy: chunk 2/2 (5242880-10485760) size 5Mi starting 2026/02/15 01:22:13 DEBUG : test-multithread-copy-true-10485760-2: multi-thread copy: chunk 1/2 (0-5242880) size 5Mi starting 2026/02/15 01:22:13 DEBUG : test-multithread-copy-true-10485760-2: Seek from 5242880 to 0 2026/02/15 01:22:13 DEBUG : test-multithread-copy-true-10485760-2: Seek from 5242880 to 0 2026/02/15 01:22:15 DEBUG : test-multithread-copy-true-10485760-2: multipart upload wrote chunk 1 with 5242880 bytes and etag "5b413f9ead6f362f300336a51a58fb50" 2026/02/15 01:22:15 DEBUG : test-multithread-copy-true-10485760-2: multi-thread copy: chunk 1/2 (0-5242880) size 5Mi finished 2026/02/15 01:22:15 DEBUG : test-multithread-copy-true-10485760-2: multipart upload wrote chunk 2 with 5242880 bytes and etag "c2123959d92ae5211f12188dd0b779d9" 2026/02/15 01:22:15 DEBUG : test-multithread-copy-true-10485760-2: multi-thread copy: chunk 2/2 (5242880-10485760) size 5Mi finished 2026/02/15 01:22:16 DEBUG : test-multithread-copy-true-10485760-2: multipart upload "ABPnzm54KN20J6kckDuyzPuKp-fUi-GqdMbjG4F0OQsGstICWGeCb27MUDvPws-D4st_NEM" finished 2026/02/15 01:22:16 DEBUG : test-multithread-copy-true-10485760-2: Finished multi-thread copy with 2 parts of size 5Mi === RUN TestMultithreadCopy/upload=true,size=10485761,streams=2 2026/02/15 01:22:19 DEBUG : test-multithread-copy-true-10485761-2: multi-thread copy: disabling buffering because 
source is local disk 2026/02/15 01:22:19 DEBUG : test-multithread-copy-true-10485761-2: open chunk writer: started multipart upload: ABPnzm6j0EVrWVInGMFlKLybNdqhuRsWHVnD5pK813Dom1hoiabF6mkKG7A0fvXlhqNNOIY 2026/02/15 01:22:19 DEBUG : test-multithread-copy-true-10485761-2: multi-thread copy: using backend concurrency of 4 instead of --multi-thread-streams 2 2026/02/15 01:22:19 DEBUG : test-multithread-copy-true-10485761-2: multi-thread copy: number of streams 4 was bigger than number of chunks 3 2026/02/15 01:22:19 DEBUG : test-multithread-copy-true-10485761-2: Starting multi-thread copy with 3 chunks of size 5Mi with 3 parallel streams 2026/02/15 01:22:19 DEBUG : test-multithread-copy-true-10485761-2: multi-thread copy: chunk 3/3 (10485760-10485761) size 1 starting 2026/02/15 01:22:19 DEBUG : test-multithread-copy-true-10485761-2: multi-thread copy: chunk 2/3 (5242880-10485760) size 5Mi starting 2026/02/15 01:22:19 DEBUG : test-multithread-copy-true-10485761-2: multi-thread copy: chunk 1/3 (0-5242880) size 5Mi starting 2026/02/15 01:22:19 DEBUG : test-multithread-copy-true-10485761-2: Seek from 1 to 0 2026/02/15 01:22:19 DEBUG : test-multithread-copy-true-10485761-2: Seek from 5242880 to 0 2026/02/15 01:22:19 DEBUG : test-multithread-copy-true-10485761-2: Seek from 5242880 to 0 2026/02/15 01:22:20 DEBUG : test-multithread-copy-true-10485761-2: multipart upload wrote chunk 3 with 1 bytes and etag "7694f4a66316e53c8cdd9d9954bd611d" 2026/02/15 01:22:20 DEBUG : test-multithread-copy-true-10485761-2: multi-thread copy: chunk 3/3 (10485760-10485761) size 1 finished 2026/02/15 01:22:21 DEBUG : test-multithread-copy-true-10485761-2: multipart upload wrote chunk 2 with 5242880 bytes and etag "9f29fba54c8f7f6c8db4dee37739a48f" 2026/02/15 01:22:21 DEBUG : test-multithread-copy-true-10485761-2: multi-thread copy: chunk 2/3 (5242880-10485760) size 5Mi finished 2026/02/15 01:22:21 DEBUG : test-multithread-copy-true-10485761-2: multipart upload wrote chunk 1 with 5242880 bytes and etag "1b88e79a8beb2d262bd8920519061ac2" 2026/02/15 01:22:21 DEBUG : test-multithread-copy-true-10485761-2: multi-thread copy: chunk 1/3 (0-5242880) size 5Mi finished 2026/02/15 01:22:21 DEBUG : test-multithread-copy-true-10485761-2: multipart upload "ABPnzm6j0EVrWVInGMFlKLybNdqhuRsWHVnD5pK813Dom1hoiabF6mkKG7A0fvXlhqNNOIY" finished 2026/02/15 01:22:22 DEBUG : test-multithread-copy-true-10485761-2: Finished multi-thread copy with 3 parts of size 5Mi --- PASS: TestMultithreadCopy (39.55s) --- PASS: TestMultithreadCopy/upload=false,size=10485759,streams=2 (6.49s) --- PASS: TestMultithreadCopy/upload=false,size=10485760,streams=2 (6.60s) --- PASS: TestMultithreadCopy/upload=false,size=10485761,streams=2 (6.60s) --- PASS: TestMultithreadCopy/upload=true,size=10485759,streams=2 (6.00s) --- PASS: TestMultithreadCopy/upload=true,size=10485760,streams=2 (5.70s) --- PASS: TestMultithreadCopy/upload=true,size=10485761,streams=2 (5.70s) === RUN TestMultithreadCopyAbort run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:22:24 DEBUG : chunksize-probe: open chunk writer: started multipart upload: ABPnzm5e_JBc7jKvfx_bgMAT1XpwrqMmOP-mDbXpM4GGxWqnrQ1WPyFE2WZ3QlJh206_Cjk 2026/02/15 01:22:24 DEBUG : chunksize-probe: multipart upload "ABPnzm5e_JBc7jKvfx_bgMAT1XpwrqMmOP-mDbXpM4GGxWqnrQ1WPyFE2WZ3QlJh206_Cjk" aborted 2026/02/15 01:22:26 DEBUG : test-multithread-abort: multi-thread copy: disabling buffering because source is local disk 2026/02/15 01:22:27 DEBUG : 
test-multithread-abort: open chunk writer: started multipart upload: ABPnzm5ybL6URZDGCoEL8UiOn3gUA_9JP4TVlcPNy-IARIBH7la34xFi3f0NywEpTPeNIFI 2026/02/15 01:22:27 DEBUG : test-multithread-abort: multi-thread copy: using backend concurrency of 4 instead of --multi-thread-streams 1 2026/02/15 01:22:27 DEBUG : test-multithread-abort: multi-thread copy: number of streams 4 was bigger than number of chunks 3 2026/02/15 01:22:27 DEBUG : test-multithread-abort: Starting multi-thread copy with 3 chunks of size 5Mi with 3 parallel streams 2026/02/15 01:22:27 DEBUG : test-multithread-abort: multi-thread copy: chunk 3/3 (10485760-10485761) size 1 starting 2026/02/15 01:22:27 DEBUG : Open with options = [RangeOption(10485760,10485760)] 2026/02/15 01:22:27 DEBUG : test-multithread-abort: multi-thread copy: chunk 1/3 (0-5242880) size 5Mi starting 2026/02/15 01:22:27 DEBUG : Open with options = [RangeOption(0,5242879)] 2026/02/15 01:22:27 DEBUG : test-multithread-abort: multi-thread copy: chunk 2/3 (5242880-10485760) size 5Mi starting 2026/02/15 01:22:27 DEBUG : Open with options = [RangeOption(5242880,10485759)] 2026/02/15 01:22:27 DEBUG : test-multithread-abort: Seek from 5242880 to 0 2026/02/15 01:22:27 DEBUG : Open with options = [RangeOption(0,5242879)] 2026/02/15 01:22:27 DEBUG : test-multithread-abort: Seek from 5242880 to 0 2026/02/15 01:22:27 DEBUG : Open with options = [RangeOption(5242880,10485759)] 2026/02/15 01:22:28 DEBUG : test-multithread-abort: multipart upload wrote chunk 2 with 5242880 bytes and etag "1f02878f5d3acbe7c04486a7f8c5fcd4" 2026/02/15 01:22:28 DEBUG : test-multithread-abort: multi-thread copy: chunk 2/3 (5242880-10485760) size 5Mi finished 2026/02/15 01:22:28 DEBUG : test-multithread-abort: multipart upload wrote chunk 1 with 5242880 bytes and etag "5895274993bde7fa1fef6dff0a212b6f" 2026/02/15 01:22:28 DEBUG : test-multithread-abort: multi-thread copy: chunk 1/3 (0-5242880) size 5Mi finished 2026/02/15 01:22:28 DEBUG : Returning error reader 2026/02/15 01:22:28 DEBUG : BOOM: simulated read failure 2026/02/15 01:22:28 DEBUG : test-multithread-abort: Reopening on read failure after offset 0 bytes: retry 1/10: BOOM: simulated read failure 2026/02/15 01:22:28 DEBUG : Open with options = [RangeOption(10485760,10485760)] 2026/02/15 01:22:29 DEBUG : Returning error reader 2026/02/15 01:22:29 DEBUG : BOOM: simulated read failure 2026/02/15 01:22:29 DEBUG : test-multithread-abort: Reopening on read failure after offset 0 bytes: retry 2/10: BOOM: simulated read failure 2026/02/15 01:22:29 DEBUG : Open with options = [RangeOption(10485760,10485760)] 2026/02/15 01:22:30 DEBUG : Returning error reader 2026/02/15 01:22:30 DEBUG : BOOM: simulated read failure 2026/02/15 01:22:30 DEBUG : test-multithread-abort: Reopening on read failure after offset 0 bytes: retry 3/10: BOOM: simulated read failure 2026/02/15 01:22:30 DEBUG : Open with options = [RangeOption(10485760,10485760)] 2026/02/15 01:22:31 DEBUG : Returning error reader 2026/02/15 01:22:31 DEBUG : BOOM: simulated read failure 2026/02/15 01:22:31 DEBUG : test-multithread-abort: Reopening on read failure after offset 0 bytes: retry 4/10: BOOM: simulated read failure 2026/02/15 01:22:31 DEBUG : Open with options = [RangeOption(10485760,10485760)] 2026/02/15 01:22:32 DEBUG : Returning error reader 2026/02/15 01:22:32 DEBUG : BOOM: simulated read failure 2026/02/15 01:22:32 DEBUG : test-multithread-abort: Reopening on read failure after offset 0 bytes: retry 5/10: BOOM: simulated read failure 2026/02/15 01:22:32 DEBUG : Open with options = 
[RangeOption(10485760,10485760)] 2026/02/15 01:22:33 DEBUG : Returning error reader 2026/02/15 01:22:33 DEBUG : BOOM: simulated read failure 2026/02/15 01:22:33 DEBUG : test-multithread-abort: Reopening on read failure after offset 0 bytes: retry 6/10: BOOM: simulated read failure 2026/02/15 01:22:33 DEBUG : Open with options = [RangeOption(10485760,10485760)] 2026/02/15 01:22:34 DEBUG : Returning error reader 2026/02/15 01:22:34 DEBUG : BOOM: simulated read failure 2026/02/15 01:22:34 DEBUG : test-multithread-abort: Reopening on read failure after offset 0 bytes: retry 7/10: BOOM: simulated read failure 2026/02/15 01:22:34 DEBUG : Open with options = [RangeOption(10485760,10485760)] 2026/02/15 01:22:35 DEBUG : Returning error reader 2026/02/15 01:22:35 DEBUG : BOOM: simulated read failure 2026/02/15 01:22:35 DEBUG : test-multithread-abort: Reopening on read failure after offset 0 bytes: retry 8/10: BOOM: simulated read failure 2026/02/15 01:22:35 DEBUG : Open with options = [RangeOption(10485760,10485760)] 2026/02/15 01:22:36 DEBUG : Returning error reader 2026/02/15 01:22:36 DEBUG : BOOM: simulated read failure 2026/02/15 01:22:36 DEBUG : test-multithread-abort: Reopening on read failure after offset 0 bytes: retry 9/10: BOOM: simulated read failure 2026/02/15 01:22:36 DEBUG : Open with options = [RangeOption(10485760,10485760)] 2026/02/15 01:22:37 DEBUG : Returning error reader 2026/02/15 01:22:37 DEBUG : BOOM: simulated read failure 2026/02/15 01:22:37 DEBUG : test-multithread-abort: Reopening on read failure after offset 0 bytes: retry 10/10: BOOM: simulated read failure 2026/02/15 01:22:37 DEBUG : test-multithread-abort: Reopen failed after offset 0 bytes read: failed to reopen: too many retries 2026/02/15 01:22:37 DEBUG : test-multithread-abort: multi-thread copy: chunk 3/3 failed: multi-thread copy: failed to write chunk: BOOM: simulated read failure 2026/02/15 01:22:37 DEBUG : test-multithread-abort: multi-thread copy: cancelling transfer on exit 2026/02/15 01:22:38 DEBUG : test-multithread-abort: multipart upload "ABPnzm5ybL6URZDGCoEL8UiOn3gUA_9JP4TVlcPNy-IARIBH7la34xFi3f0NywEpTPeNIFI" aborted --- PASS: TestMultithreadCopyAbort (15.60s) === RUN TestSizeDiffers 2026/02/15 01:22:39 DEBUG : a: size = 0 OK 2026/02/15 01:22:39 DEBUG : a: size = 1 (memory) 2026/02/15 01:22:39 DEBUG : a: size = 2 (memory) --- PASS: TestSizeDiffers (0.00s) === RUN TestReOpen === RUN TestReOpen/Normal === RUN TestReOpen/Normal/Basics 2026/02/15 01:22:39 DEBUG : potato: Seek from 10 to 0 === RUN TestReOpen/Normal/ErrorAtStart === RUN TestReOpen/Normal/WithErrors 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 1/10: test error 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 2/10: test error 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 6 bytes: retry 3/10: test error === RUN TestReOpen/Normal/TooManyErrors 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 1/3: test error 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 2/3: test error 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 6 bytes: retry 3/3: test error 2026/02/15 01:22:39 DEBUG : potato: Reopen failed after offset 6 bytes read: failed to reopen: too many retries === RUN TestReOpen/Normal/ReadAt 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 0/10: test error 2026/02/15 01:22:39 DEBUG : 
potato: Reopening on read failure after offset 3 bytes: retry 1/10: test error 2026/02/15 01:22:39 DEBUG : potato: Seek from 5 to 1 === RUN TestReOpen/Normal/Seek 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 0/10: test error 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 1/10: test error 2026/02/15 01:22:39 DEBUG : potato: Seek from 5 to 2 === RUN TestReOpen/Normal/AccountRead === RUN TestReOpen/Normal/AccountReadDelay 2026/02/15 01:22:39 DEBUG : potato: Seek from 10 to 0 2026/02/15 01:22:39 DEBUG : potato: Seek from 10 to 0 2026/02/15 01:22:39 DEBUG : potato: Seek from 10 to 0 === RUN TestReOpen/Normal/AccountReadError === RUN TestReOpen/WithRangeOption === RUN TestReOpen/WithRangeOption/Basics 2026/02/15 01:22:39 DEBUG : potato: Seek from 7 to 0 === RUN TestReOpen/WithRangeOption/ErrorAtStart === RUN TestReOpen/WithRangeOption/WithErrors 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 1/10: test error 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 2/10: test error 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 6 bytes: retry 3/10: test error === RUN TestReOpen/WithRangeOption/TooManyErrors 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 1/3: test error 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 2/3: test error 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 6 bytes: retry 3/3: test error 2026/02/15 01:22:39 DEBUG : potato: Reopen failed after offset 6 bytes read: failed to reopen: too many retries === RUN TestReOpen/WithRangeOption/ReadAt 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 0/10: test error 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 1/10: test error 2026/02/15 01:22:39 DEBUG : potato: Seek from 5 to 1 === RUN TestReOpen/WithRangeOption/Seek 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 0/10: test error 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 1/10: test error 2026/02/15 01:22:39 DEBUG : potato: Seek from 5 to 2 2026/02/15 01:22:39 DEBUG : potato: Seek from 7 to 4 === RUN TestReOpen/WithRangeOption/AccountRead === RUN TestReOpen/WithRangeOption/AccountReadDelay 2026/02/15 01:22:39 DEBUG : potato: Seek from 7 to 0 2026/02/15 01:22:39 DEBUG : potato: Seek from 7 to 0 2026/02/15 01:22:39 DEBUG : potato: Seek from 7 to 0 === RUN TestReOpen/WithRangeOption/AccountReadError === RUN TestReOpen/WithSeekOption === RUN TestReOpen/WithSeekOption/Basics 2026/02/15 01:22:39 DEBUG : potato: Seek from 8 to 0 === RUN TestReOpen/WithSeekOption/ErrorAtStart === RUN TestReOpen/WithSeekOption/WithErrors 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 1/10: test error 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 2/10: test error 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 6 bytes: retry 3/10: test error === RUN TestReOpen/WithSeekOption/TooManyErrors 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 1/3: test error 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 2/3: test error 2026/02/15 01:22:39 DEBUG : 
potato: Reopening on read failure after offset 6 bytes: retry 3/3: test error 2026/02/15 01:22:39 DEBUG : potato: Reopen failed after offset 6 bytes read: failed to reopen: too many retries === RUN TestReOpen/WithSeekOption/ReadAt 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 0/10: test error 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 1/10: test error 2026/02/15 01:22:39 DEBUG : potato: Seek from 5 to 1 === RUN TestReOpen/WithSeekOption/Seek 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 0/10: test error 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 1/10: test error 2026/02/15 01:22:39 DEBUG : potato: Seek from 5 to 2 2026/02/15 01:22:39 DEBUG : potato: Seek from 7 to 5 === RUN TestReOpen/WithSeekOption/AccountRead === RUN TestReOpen/WithSeekOption/AccountReadDelay 2026/02/15 01:22:39 DEBUG : potato: Seek from 8 to 0 2026/02/15 01:22:39 DEBUG : potato: Seek from 8 to 0 2026/02/15 01:22:39 DEBUG : potato: Seek from 8 to 0 === RUN TestReOpen/WithSeekOption/AccountReadError === RUN TestReOpen/UnknownSize === RUN TestReOpen/UnknownSize/Basics 2026/02/15 01:22:39 DEBUG : potato: Seek from 9 to 0 === RUN TestReOpen/UnknownSize/ErrorAtStart === RUN TestReOpen/UnknownSize/WithErrors 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 1/10: test error 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 2/10: test error 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 6 bytes: retry 3/10: test error === RUN TestReOpen/UnknownSize/TooManyErrors 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 1/3: test error 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 2/3: test error 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 6 bytes: retry 3/3: test error 2026/02/15 01:22:39 DEBUG : potato: Reopen failed after offset 6 bytes read: failed to reopen: too many retries === RUN TestReOpen/UnknownSize/ReadAt 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 0/10: test error 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 1/10: test error 2026/02/15 01:22:39 DEBUG : potato: Seek from 5 to 1 === RUN TestReOpen/UnknownSize/Seek 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 0/10: test error 2026/02/15 01:22:39 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 1/10: test error 2026/02/15 01:22:39 DEBUG : potato: Seek from 5 to 2 2026/02/15 01:22:39 DEBUG : potato: Seek from 7 to 6 === RUN TestReOpen/UnknownSize/AccountRead === RUN TestReOpen/UnknownSize/AccountReadDelay 2026/02/15 01:22:39 DEBUG : potato: Seek from 9 to 0 2026/02/15 01:22:39 DEBUG : potato: Seek from 9 to 0 2026/02/15 01:22:39 DEBUG : potato: Seek from 9 to 0 === RUN TestReOpen/UnknownSize/AccountReadError --- PASS: TestReOpen (0.00s) --- PASS: TestReOpen/Normal (0.00s) --- PASS: TestReOpen/Normal/Basics (0.00s) --- PASS: TestReOpen/Normal/ErrorAtStart (0.00s) --- PASS: TestReOpen/Normal/WithErrors (0.00s) --- PASS: TestReOpen/Normal/TooManyErrors (0.00s) --- PASS: TestReOpen/Normal/ReadAt (0.00s) --- PASS: TestReOpen/Normal/Seek (0.00s) --- PASS: TestReOpen/Normal/AccountRead (0.00s) --- PASS: 
TestReOpen/Normal/AccountReadDelay (0.00s) --- PASS: TestReOpen/Normal/AccountReadError (0.00s) --- PASS: TestReOpen/WithRangeOption (0.00s) --- PASS: TestReOpen/WithRangeOption/Basics (0.00s) --- PASS: TestReOpen/WithRangeOption/ErrorAtStart (0.00s) --- PASS: TestReOpen/WithRangeOption/WithErrors (0.00s) --- PASS: TestReOpen/WithRangeOption/TooManyErrors (0.00s) --- PASS: TestReOpen/WithRangeOption/ReadAt (0.00s) --- PASS: TestReOpen/WithRangeOption/Seek (0.00s) --- PASS: TestReOpen/WithRangeOption/AccountRead (0.00s) --- PASS: TestReOpen/WithRangeOption/AccountReadDelay (0.00s) --- PASS: TestReOpen/WithRangeOption/AccountReadError (0.00s) --- PASS: TestReOpen/WithSeekOption (0.00s) --- PASS: TestReOpen/WithSeekOption/Basics (0.00s) --- PASS: TestReOpen/WithSeekOption/ErrorAtStart (0.00s) --- PASS: TestReOpen/WithSeekOption/WithErrors (0.00s) --- PASS: TestReOpen/WithSeekOption/TooManyErrors (0.00s) --- PASS: TestReOpen/WithSeekOption/ReadAt (0.00s) --- PASS: TestReOpen/WithSeekOption/Seek (0.00s) --- PASS: TestReOpen/WithSeekOption/AccountRead (0.00s) --- PASS: TestReOpen/WithSeekOption/AccountReadDelay (0.00s) --- PASS: TestReOpen/WithSeekOption/AccountReadError (0.00s) --- PASS: TestReOpen/UnknownSize (0.00s) --- PASS: TestReOpen/UnknownSize/Basics (0.00s) --- PASS: TestReOpen/UnknownSize/ErrorAtStart (0.00s) --- PASS: TestReOpen/UnknownSize/WithErrors (0.00s) --- PASS: TestReOpen/UnknownSize/TooManyErrors (0.00s) --- PASS: TestReOpen/UnknownSize/ReadAt (0.00s) --- PASS: TestReOpen/UnknownSize/Seek (0.00s) --- PASS: TestReOpen/UnknownSize/AccountRead (0.00s) --- PASS: TestReOpen/UnknownSize/AccountReadDelay (0.00s) --- PASS: TestReOpen/UnknownSize/AccountReadError (0.00s) === RUN TestCheck run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" === RUN TestCheck/1 === RUN TestCheck/2 === RUN TestCheck/3 === RUN TestCheck/4 === RUN TestCheck/5 === RUN TestCheck/6 === RUN TestCheck/7 --- PASS: TestCheck (6.67s) --- PASS: TestCheck/1 (0.13s) --- PASS: TestCheck/2 (0.13s) --- PASS: TestCheck/3 (0.13s) --- PASS: TestCheck/4 (0.14s) --- PASS: TestCheck/5 (0.13s) --- PASS: TestCheck/6 (0.13s) --- PASS: TestCheck/7 (0.13s) === RUN TestCheckFsError 2026/02/15 01:22:46 DEBUG : Creating backend with remote "nonexistent" 2026/02/15 01:22:46 DEBUG : Config file has changed externally - reloading 2026/02/15 01:22:46 DEBUG : Creating backend with remote "nonexistent" 2026/02/15 01:22:46 DEBUG : Local file system at /home/rclone/go/src/github.com/rclone/rclone/fs/operations/nonexistent: Waiting for checks to finish 2026/02/15 01:22:46 ERROR : Local file system at /home/rclone/go/src/github.com/rclone/rclone/fs/operations/nonexistent: error reading source root directory: directory not found 2026/02/15 01:22:46 NOTICE: Local file system at /home/rclone/go/src/github.com/rclone/rclone/fs/operations/nonexistent: 0 differences found 2026/02/15 01:22:46 NOTICE: Local file system at /home/rclone/go/src/github.com/rclone/rclone/fs/operations/nonexistent: 2 errors while checking --- PASS: TestCheckFsError (0.00s) === RUN TestCheckDownload run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" === RUN TestCheckDownload/1 === RUN TestCheckDownload/2 === RUN TestCheckDownload/3 === RUN TestCheckDownload/4 === RUN TestCheckDownload/5 === RUN TestCheckDownload/6 === RUN TestCheckDownload/7 --- PASS: TestCheckDownload (8.08s) --- PASS: TestCheckDownload/1 (0.28s) --- 
PASS: TestCheckDownload/2 (0.29s) --- PASS: TestCheckDownload/3 (0.28s) --- PASS: TestCheckDownload/4 (0.30s) --- PASS: TestCheckDownload/5 (0.27s) --- PASS: TestCheckDownload/6 (0.29s) --- PASS: TestCheckDownload/7 (0.27s) === RUN TestCheckSizeOnly run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" === RUN TestCheckSizeOnly/1 === RUN TestCheckSizeOnly/2 === RUN TestCheckSizeOnly/3 === RUN TestCheckSizeOnly/4 === RUN TestCheckSizeOnly/5 === RUN TestCheckSizeOnly/6 === RUN TestCheckSizeOnly/7 --- PASS: TestCheckSizeOnly (6.97s) --- PASS: TestCheckSizeOnly/1 (0.13s) --- PASS: TestCheckSizeOnly/2 (0.14s) --- PASS: TestCheckSizeOnly/3 (0.14s) --- PASS: TestCheckSizeOnly/4 (0.13s) --- PASS: TestCheckSizeOnly/5 (0.15s) --- PASS: TestCheckSizeOnly/6 (0.14s) --- PASS: TestCheckSizeOnly/7 (0.14s) === RUN TestCheckEqualReaders --- PASS: TestCheckEqualReaders (0.00s) === RUN TestParseSumFile run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:23:02 NOTICE: test.sum: improperly formatted checksum line 4 2026/02/15 01:23:02 NOTICE: test.sum: improperly formatted checksum line 5 2026/02/15 01:23:02 NOTICE: test.sum: improperly formatted checksum line 6 2026/02/15 01:23:02 NOTICE: test.sum: 2 warning(s) suppressed... 2026/02/15 01:23:03 NOTICE: test.sum: improperly formatted checksum line 4 2026/02/15 01:23:03 NOTICE: test.sum: improperly formatted checksum line 5 2026/02/15 01:23:03 NOTICE: test.sum: improperly formatted checksum line 6 2026/02/15 01:23:03 NOTICE: test.sum: 2 warning(s) suppressed... --- PASS: TestParseSumFile (2.61s) === RUN TestCheckSum run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:23:03 DEBUG : Creating backend with remote "TestS3GCS:rclone-test-daxariv2nomu/data" === RUN TestCheckSum/subtest1 === RUN TestCheckSum/subtest2 === RUN TestCheckSum/subtest3 === RUN TestCheckSum/subtest4 === RUN TestCheckSum/subtest5 === RUN TestCheckSum/subtest6 === RUN TestCheckSum/subtest7 --- PASS: TestCheckSum (16.04s) --- PASS: TestCheckSum/subtest1 (0.41s) --- PASS: TestCheckSum/subtest2 (0.40s) --- PASS: TestCheckSum/subtest3 (0.41s) --- PASS: TestCheckSum/subtest4 (0.42s) --- PASS: TestCheckSum/subtest5 (0.40s) --- PASS: TestCheckSum/subtest6 (0.40s) --- PASS: TestCheckSum/subtest7 (0.40s) === RUN TestCheckSumDownload run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:23:19 DEBUG : Creating backend with remote "TestS3GCS:rclone-test-daxariv2nomu/data" === RUN TestCheckSumDownload/subtest1 === RUN TestCheckSumDownload/subtest2 === RUN TestCheckSumDownload/subtest3 === RUN TestCheckSumDownload/subtest4 === RUN TestCheckSumDownload/subtest5 === RUN TestCheckSumDownload/subtest6 === RUN TestCheckSumDownload/subtest7 --- PASS: TestCheckSumDownload (17.71s) --- PASS: TestCheckSumDownload/subtest1 (0.55s) --- PASS: TestCheckSumDownload/subtest2 (0.53s) --- PASS: TestCheckSumDownload/subtest3 (0.66s) --- PASS: TestCheckSumDownload/subtest4 (0.66s) --- PASS: TestCheckSumDownload/subtest5 (1.06s) --- PASS: TestCheckSumDownload/subtest6 (0.64s) --- PASS: TestCheckSumDownload/subtest7 (0.66s) === RUN TestApplyTransforms 2026/02/15 01:23:37 DEBUG : Creating backend with remote "TestS3GCS:rclone-test-fadoyed1feme" 2026/02/15 01:23:37 DEBUG : Using config file 
from "/home/rclone/.rclone.conf" 2026/02/15 01:23:37 DEBUG : Creating backend with remote "/tmp/rclone174616051" run.go:185: Remote "S3 bucket rclone-test-fadoyed1feme", Local "Local file system at /tmp/rclone174616051", Modify Window "1ns" 2026/02/15 01:23:38 INFO : S3 bucket rclone-test-fadoyed1feme: Bucket "rclone-test-fadoyed1feme" created with ACL "" upper checkfile vs. lower remote (without normalization) 2026/02/15 01:23:40 ERROR : hello, world!: sum not found 2026/02/15 01:23:40 ERROR : HELLO, WORLD!: file not in S3 bucket rclone-test-fadoyed1feme 2026/02/15 01:23:40 NOTICE: S3 bucket rclone-test-fadoyed1feme: 1 files missing 2026/02/15 01:23:40 NOTICE: 1 hashes missing 2026/02/15 01:23:40 NOTICE: S3 bucket rclone-test-fadoyed1feme: 1 differences found 2026/02/15 01:23:40 NOTICE: S3 bucket rclone-test-fadoyed1feme: 2 errors while checking upper checkfile vs. lower remote (with normalization) 2026/02/15 01:23:40 DEBUG : hello, world!: md5 = 65a8e27d8879283831b664bd8b7f0ad4 OK 2026/02/15 01:23:40 NOTICE: S3 bucket rclone-test-fadoyed1feme: 0 differences found 2026/02/15 01:23:40 NOTICE: S3 bucket rclone-test-fadoyed1feme: 1 matching files 2026/02/15 01:23:40 DEBUG : Creating backend with remote "TestS3GCS:rclone-test-yifubex6yene" 2026/02/15 01:23:40 DEBUG : Using config file from "/home/rclone/.rclone.conf" 2026/02/15 01:23:40 DEBUG : Creating backend with remote "/tmp/rclone2818640401" run.go:185: Remote "S3 bucket rclone-test-yifubex6yene", Local "Local file system at /tmp/rclone2818640401", Modify Window "1ns" 2026/02/15 01:23:41 INFO : S3 bucket rclone-test-yifubex6yene: Bucket "rclone-test-yifubex6yene" created with ACL "" lower checkfile vs. upper remote (without normalization) 2026/02/15 01:23:43 ERROR : HELLO, WORLD!: sum not found 2026/02/15 01:23:43 ERROR : hello, world!: file not in S3 bucket rclone-test-yifubex6yene 2026/02/15 01:23:43 NOTICE: S3 bucket rclone-test-yifubex6yene: 1 files missing 2026/02/15 01:23:43 NOTICE: 1 hashes missing 2026/02/15 01:23:43 NOTICE: S3 bucket rclone-test-yifubex6yene: 1 differences found 2026/02/15 01:23:43 NOTICE: S3 bucket rclone-test-yifubex6yene: 2 errors while checking lower checkfile vs. upper remote (with normalization) 2026/02/15 01:23:44 DEBUG : HELLO, WORLD!: md5 = 65a8e27d8879283831b664bd8b7f0ad4 OK 2026/02/15 01:23:44 NOTICE: S3 bucket rclone-test-yifubex6yene: 0 differences found 2026/02/15 01:23:44 NOTICE: S3 bucket rclone-test-yifubex6yene: 1 matching files 2026/02/15 01:23:44 DEBUG : Creating backend with remote "TestS3GCS:rclone-test-fapufel0hifa" 2026/02/15 01:23:44 DEBUG : Using config file from "/home/rclone/.rclone.conf" 2026/02/15 01:23:44 DEBUG : Creating backend with remote "/tmp/rclone3297625733" run.go:185: Remote "S3 bucket rclone-test-fapufel0hifa", Local "Local file system at /tmp/rclone3297625733", Modify Window "1ns" 2026/02/15 01:23:45 INFO : S3 bucket rclone-test-fapufel0hifa: Bucket "rclone-test-fapufel0hifa" created with ACL "" lower checkfile vs. upperlowermixed remote (without normalization) 2026/02/15 01:23:47 ERROR : HeLlO, wOrLd!: sum not found 2026/02/15 01:23:47 ERROR : hello, world!: file not in S3 bucket rclone-test-fapufel0hifa 2026/02/15 01:23:47 NOTICE: S3 bucket rclone-test-fapufel0hifa: 1 files missing 2026/02/15 01:23:47 NOTICE: 1 hashes missing 2026/02/15 01:23:47 NOTICE: S3 bucket rclone-test-fapufel0hifa: 1 differences found 2026/02/15 01:23:47 NOTICE: S3 bucket rclone-test-fapufel0hifa: 2 errors while checking lower checkfile vs. 
upperlowermixed remote (with normalization) 2026/02/15 01:23:47 DEBUG : HeLlO, wOrLd!: md5 = 65a8e27d8879283831b664bd8b7f0ad4 OK 2026/02/15 01:23:47 NOTICE: S3 bucket rclone-test-fapufel0hifa: 0 differences found 2026/02/15 01:23:47 NOTICE: S3 bucket rclone-test-fapufel0hifa: 1 matching files 2026/02/15 01:23:47 DEBUG : Creating backend with remote "TestS3GCS:rclone-test-darajuj4voro" 2026/02/15 01:23:47 DEBUG : Using config file from "/home/rclone/.rclone.conf" 2026/02/15 01:23:47 DEBUG : Creating backend with remote "/tmp/rclone2858763755" run.go:185: Remote "S3 bucket rclone-test-darajuj4voro", Local "Local file system at /tmp/rclone2858763755", Modify Window "1ns" 2026/02/15 01:23:49 INFO : S3 bucket rclone-test-darajuj4voro: Bucket "rclone-test-darajuj4voro" created with ACL "" upperlowermixed checkfile vs. upper remote (without normalization) 2026/02/15 01:23:50 ERROR : HELLO, WORLD!: sum not found 2026/02/15 01:23:50 ERROR : HeLlO, wOrLd!: file not in S3 bucket rclone-test-darajuj4voro 2026/02/15 01:23:50 NOTICE: S3 bucket rclone-test-darajuj4voro: 1 files missing 2026/02/15 01:23:50 NOTICE: 1 hashes missing 2026/02/15 01:23:50 NOTICE: S3 bucket rclone-test-darajuj4voro: 1 differences found 2026/02/15 01:23:50 NOTICE: S3 bucket rclone-test-darajuj4voro: 2 errors while checking upperlowermixed checkfile vs. upper remote (with normalization) 2026/02/15 01:23:51 DEBUG : HELLO, WORLD!: md5 = 65a8e27d8879283831b664bd8b7f0ad4 OK 2026/02/15 01:23:51 NOTICE: S3 bucket rclone-test-darajuj4voro: 0 differences found 2026/02/15 01:23:51 NOTICE: S3 bucket rclone-test-darajuj4voro: 1 matching files 2026/02/15 01:23:51 DEBUG : Creating backend with remote "TestS3GCS:rclone-test-felalot7hiho" 2026/02/15 01:23:51 DEBUG : Using config file from "/home/rclone/.rclone.conf" 2026/02/15 01:23:51 DEBUG : Creating backend with remote "/tmp/rclone2675280672" run.go:185: Remote "S3 bucket rclone-test-felalot7hiho", Local "Local file system at /tmp/rclone2675280672", Modify Window "1ns" 2026/02/15 01:23:52 INFO : S3 bucket rclone-test-felalot7hiho: Bucket "rclone-test-felalot7hiho" created with ACL "" NFD checkfile vs. NFC remote (without normalization) 2026/02/15 01:23:54 ERROR : 測試_Русский___ě_áñ: sum not found 2026/02/15 01:23:54 ERROR : 測試_Русский___ě_áñ: file not in S3 bucket rclone-test-felalot7hiho 2026/02/15 01:23:54 NOTICE: S3 bucket rclone-test-felalot7hiho: 1 files missing 2026/02/15 01:23:54 NOTICE: 1 hashes missing 2026/02/15 01:23:54 NOTICE: S3 bucket rclone-test-felalot7hiho: 1 differences found 2026/02/15 01:23:54 NOTICE: S3 bucket rclone-test-felalot7hiho: 2 errors while checking NFD checkfile vs. NFC remote (with normalization) 2026/02/15 01:23:54 DEBUG : 測試_Русский___ě_áñ: md5 = 65a8e27d8879283831b664bd8b7f0ad4 OK 2026/02/15 01:23:54 NOTICE: S3 bucket rclone-test-felalot7hiho: 0 differences found 2026/02/15 01:23:54 NOTICE: S3 bucket rclone-test-felalot7hiho: 1 matching files 2026/02/15 01:23:54 DEBUG : Creating backend with remote "TestS3GCS:rclone-test-poxenes7zifu" 2026/02/15 01:23:54 DEBUG : Using config file from "/home/rclone/.rclone.conf" 2026/02/15 01:23:54 DEBUG : Creating backend with remote "/tmp/rclone1379615960" run.go:185: Remote "S3 bucket rclone-test-poxenes7zifu", Local "Local file system at /tmp/rclone1379615960", Modify Window "1ns" 2026/02/15 01:23:56 INFO : S3 bucket rclone-test-poxenes7zifu: Bucket "rclone-test-poxenes7zifu" created with ACL "" NFC checkfile vs. 
NFD remote (without normalization) 2026/02/15 01:23:57 ERROR : 測試_Русский___ě_áñ: sum not found 2026/02/15 01:23:57 ERROR : 測試_Русский___ě_áñ: file not in S3 bucket rclone-test-poxenes7zifu 2026/02/15 01:23:57 NOTICE: S3 bucket rclone-test-poxenes7zifu: 1 files missing 2026/02/15 01:23:57 NOTICE: 1 hashes missing 2026/02/15 01:23:57 NOTICE: S3 bucket rclone-test-poxenes7zifu: 1 differences found 2026/02/15 01:23:57 NOTICE: S3 bucket rclone-test-poxenes7zifu: 2 errors while checking NFC checkfile vs. NFD remote (with normalization) 2026/02/15 01:23:57 DEBUG : 測試_Русский___ě_áñ: md5 = 65a8e27d8879283831b664bd8b7f0ad4 OK 2026/02/15 01:23:57 NOTICE: S3 bucket rclone-test-poxenes7zifu: 0 differences found 2026/02/15 01:23:57 NOTICE: S3 bucket rclone-test-poxenes7zifu: 1 matching files 2026/02/15 01:23:57 DEBUG : Creating backend with remote "TestS3GCS:rclone-test-toloqiv5yuqa" 2026/02/15 01:23:57 DEBUG : Using config file from "/home/rclone/.rclone.conf" 2026/02/15 01:23:57 DEBUG : Creating backend with remote "/tmp/rclone2065089842" run.go:185: Remote "S3 bucket rclone-test-toloqiv5yuqa", Local "Local file system at /tmp/rclone2065089842", Modify Window "1ns" 2026/02/15 01:23:58 INFO : S3 bucket rclone-test-toloqiv5yuqa: Bucket "rclone-test-toloqiv5yuqa" created with ACL "" NFDx2 checkfile vs. both remote (without normalization) 2026/02/15 01:24:00 ERROR : 測試_Русский___ě_áñ測試_Русский___ě_áñ: sum not found 2026/02/15 01:24:00 ERROR : 測試_Русский___ě_áñ測試_Русский___ě_áñ: file not in S3 bucket rclone-test-toloqiv5yuqa 2026/02/15 01:24:00 NOTICE: S3 bucket rclone-test-toloqiv5yuqa: 1 files missing 2026/02/15 01:24:00 NOTICE: 1 hashes missing 2026/02/15 01:24:00 NOTICE: S3 bucket rclone-test-toloqiv5yuqa: 1 differences found 2026/02/15 01:24:00 NOTICE: S3 bucket rclone-test-toloqiv5yuqa: 2 errors while checking NFDx2 checkfile vs. both remote (with normalization) 2026/02/15 01:24:01 DEBUG : 測試_Русский___ě_áñ測試_Русский___ě_áñ: md5 = 65a8e27d8879283831b664bd8b7f0ad4 OK 2026/02/15 01:24:01 NOTICE: S3 bucket rclone-test-toloqiv5yuqa: 0 differences found 2026/02/15 01:24:01 NOTICE: S3 bucket rclone-test-toloqiv5yuqa: 1 matching files 2026/02/15 01:24:01 DEBUG : Creating backend with remote "TestS3GCS:rclone-test-xoqujuc8nejo" 2026/02/15 01:24:01 DEBUG : Using config file from "/home/rclone/.rclone.conf" 2026/02/15 01:24:01 DEBUG : Creating backend with remote "/tmp/rclone2483985568" run.go:185: Remote "S3 bucket rclone-test-xoqujuc8nejo", Local "Local file system at /tmp/rclone2483985568", Modify Window "1ns" 2026/02/15 01:24:02 INFO : S3 bucket rclone-test-xoqujuc8nejo: Bucket "rclone-test-xoqujuc8nejo" created with ACL "" NFCx2 checkfile vs. both remote (without normalization) 2026/02/15 01:24:03 ERROR : 測試_Русский___ě_áñ測試_Русский___ě_áñ: sum not found 2026/02/15 01:24:03 ERROR : 測試_Русский___ě_áñ測試_Русский___ě_áñ: file not in S3 bucket rclone-test-xoqujuc8nejo 2026/02/15 01:24:03 NOTICE: S3 bucket rclone-test-xoqujuc8nejo: 1 files missing 2026/02/15 01:24:03 NOTICE: 1 hashes missing 2026/02/15 01:24:03 NOTICE: S3 bucket rclone-test-xoqujuc8nejo: 1 differences found 2026/02/15 01:24:03 NOTICE: S3 bucket rclone-test-xoqujuc8nejo: 2 errors while checking NFCx2 checkfile vs. 
both remote (with normalization) 2026/02/15 01:24:04 DEBUG : 測試_Русский___ě_áñ測試_Русский___ě_áñ: md5 = 65a8e27d8879283831b664bd8b7f0ad4 OK 2026/02/15 01:24:04 NOTICE: S3 bucket rclone-test-xoqujuc8nejo: 0 differences found 2026/02/15 01:24:04 NOTICE: S3 bucket rclone-test-xoqujuc8nejo: 1 matching files 2026/02/15 01:24:04 DEBUG : Creating backend with remote "TestS3GCS:rclone-test-fehetij4dota" 2026/02/15 01:24:04 DEBUG : Using config file from "/home/rclone/.rclone.conf" 2026/02/15 01:24:04 DEBUG : Creating backend with remote "/tmp/rclone644183989" run.go:185: Remote "S3 bucket rclone-test-fehetij4dota", Local "Local file system at /tmp/rclone644183989", Modify Window "1ns" 2026/02/15 01:24:05 INFO : S3 bucket rclone-test-fehetij4dota: Bucket "rclone-test-fehetij4dota" created with ACL "" both checkfile vs. NFDx2 remote (without normalization) 2026/02/15 01:24:07 ERROR : 測試_Русский___ě_áñ測試_Русский___ě_áñ: sum not found 2026/02/15 01:24:07 ERROR : 測試_Русский___ě_áñ測試_Русский___ě_áñ: file not in S3 bucket rclone-test-fehetij4dota 2026/02/15 01:24:07 NOTICE: S3 bucket rclone-test-fehetij4dota: 1 files missing 2026/02/15 01:24:07 NOTICE: 1 hashes missing 2026/02/15 01:24:07 NOTICE: S3 bucket rclone-test-fehetij4dota: 1 differences found 2026/02/15 01:24:07 NOTICE: S3 bucket rclone-test-fehetij4dota: 2 errors while checking both checkfile vs. NFDx2 remote (with normalization) 2026/02/15 01:24:07 DEBUG : 測試_Русский___ě_áñ測試_Русский___ě_áñ: md5 = 65a8e27d8879283831b664bd8b7f0ad4 OK 2026/02/15 01:24:07 NOTICE: S3 bucket rclone-test-fehetij4dota: 0 differences found 2026/02/15 01:24:07 NOTICE: S3 bucket rclone-test-fehetij4dota: 1 matching files 2026/02/15 01:24:07 DEBUG : Creating backend with remote "TestS3GCS:rclone-test-teqorul0hiwa" 2026/02/15 01:24:07 DEBUG : Using config file from "/home/rclone/.rclone.conf" 2026/02/15 01:24:07 DEBUG : Creating backend with remote "/tmp/rclone3404220541" run.go:185: Remote "S3 bucket rclone-test-teqorul0hiwa", Local "Local file system at /tmp/rclone3404220541", Modify Window "1ns" 2026/02/15 01:24:08 INFO : S3 bucket rclone-test-teqorul0hiwa: Bucket "rclone-test-teqorul0hiwa" created with ACL "" both checkfile vs. NFCx2 remote (without normalization) 2026/02/15 01:24:10 ERROR : 測試_Русский___ě_áñ測試_Русский___ě_áñ: sum not found 2026/02/15 01:24:10 ERROR : 測試_Русский___ě_áñ測試_Русский___ě_áñ: file not in S3 bucket rclone-test-teqorul0hiwa 2026/02/15 01:24:10 NOTICE: S3 bucket rclone-test-teqorul0hiwa: 1 files missing 2026/02/15 01:24:10 NOTICE: 1 hashes missing 2026/02/15 01:24:10 NOTICE: S3 bucket rclone-test-teqorul0hiwa: 1 differences found 2026/02/15 01:24:10 NOTICE: S3 bucket rclone-test-teqorul0hiwa: 2 errors while checking both checkfile vs. 
NFCx2 remote (with normalization) 2026/02/15 01:24:11 DEBUG : 測試_Русский___ě_áñ測試_Русский___ě_áñ: md5 = 65a8e27d8879283831b664bd8b7f0ad4 OK 2026/02/15 01:24:11 NOTICE: S3 bucket rclone-test-teqorul0hiwa: 0 differences found 2026/02/15 01:24:11 NOTICE: S3 bucket rclone-test-teqorul0hiwa: 1 matching files 2026/02/15 01:24:11 DEBUG : S3 bucket rclone-test-teqorul0hiwa: Purge remote 2026/02/15 01:24:11 DEBUG : S3 bucket rclone-test-teqorul0hiwa: bucket is versioned: false 2026/02/15 01:24:11 DEBUG : Waiting for deletions to finish 2026/02/15 01:24:11 DEBUG : "測試_Русский___ě_áñ測試_Русский___ě_áñ" version false 2026/02/15 01:24:11 DEBUG : 測試_Русский___ě_áñ測試_Русский___ě_áñ: Deleting (id "") 2026/02/15 01:24:12 INFO : 測試_Русский___ě_áñ測試_Русский___ě_áñ: Deleted 2026/02/15 01:24:13 INFO : S3 bucket rclone-test-teqorul0hiwa: Bucket "rclone-test-teqorul0hiwa" deleted 2026/02/15 01:24:13 DEBUG : S3 bucket rclone-test-fehetij4dota: Purge remote 2026/02/15 01:24:13 DEBUG : S3 bucket rclone-test-fehetij4dota: bucket is versioned: false 2026/02/15 01:24:13 DEBUG : Waiting for deletions to finish 2026/02/15 01:24:13 DEBUG : "測試_Русский___ě_áñ測試_Русский___ě_áñ" version false 2026/02/15 01:24:13 DEBUG : 測試_Русский___ě_áñ測試_Русский___ě_áñ: Deleting (id "") 2026/02/15 01:24:13 INFO : 測試_Русский___ě_áñ測試_Русский___ě_áñ: Deleted 2026/02/15 01:24:14 INFO : S3 bucket rclone-test-fehetij4dota: Bucket "rclone-test-fehetij4dota" deleted 2026/02/15 01:24:14 DEBUG : S3 bucket rclone-test-xoqujuc8nejo: Purge remote 2026/02/15 01:24:14 DEBUG : S3 bucket rclone-test-xoqujuc8nejo: bucket is versioned: false 2026/02/15 01:24:14 DEBUG : Waiting for deletions to finish 2026/02/15 01:24:15 DEBUG : "測試_Русский___ě_áñ測試_Русский___ě_áñ" version false 2026/02/15 01:24:15 DEBUG : 測試_Русский___ě_áñ測試_Русский___ě_áñ: Deleting (id "") 2026/02/15 01:24:15 INFO : 測試_Русский___ě_áñ測試_Русский___ě_áñ: Deleted 2026/02/15 01:24:16 INFO : S3 bucket rclone-test-xoqujuc8nejo: Bucket "rclone-test-xoqujuc8nejo" deleted 2026/02/15 01:24:16 DEBUG : S3 bucket rclone-test-toloqiv5yuqa: Purge remote 2026/02/15 01:24:16 DEBUG : S3 bucket rclone-test-toloqiv5yuqa: bucket is versioned: false 2026/02/15 01:24:16 DEBUG : Waiting for deletions to finish 2026/02/15 01:24:17 DEBUG : "測試_Русский___ě_áñ測試_Русский___ě_áñ" version false 2026/02/15 01:24:17 DEBUG : 測試_Русский___ě_áñ測試_Русский___ě_áñ: Deleting (id "") 2026/02/15 01:24:17 INFO : 測試_Русский___ě_áñ測試_Русский___ě_áñ: Deleted 2026/02/15 01:24:18 INFO : S3 bucket rclone-test-toloqiv5yuqa: Bucket "rclone-test-toloqiv5yuqa" deleted 2026/02/15 01:24:18 DEBUG : S3 bucket rclone-test-poxenes7zifu: Purge remote 2026/02/15 01:24:18 DEBUG : S3 bucket rclone-test-poxenes7zifu: bucket is versioned: false 2026/02/15 01:24:18 DEBUG : Waiting for deletions to finish 2026/02/15 01:24:18 DEBUG : "測試_Русский___ě_áñ" version false 2026/02/15 01:24:18 DEBUG : 測試_Русский___ě_áñ: Deleting (id "") 2026/02/15 01:24:19 INFO : 測試_Русский___ě_áñ: Deleted 2026/02/15 01:24:19 INFO : S3 bucket rclone-test-poxenes7zifu: Bucket "rclone-test-poxenes7zifu" deleted 2026/02/15 01:24:19 DEBUG : S3 bucket rclone-test-felalot7hiho: Purge remote 2026/02/15 01:24:19 DEBUG : S3 bucket rclone-test-felalot7hiho: bucket is versioned: false 2026/02/15 01:24:19 DEBUG : Waiting for deletions to finish 2026/02/15 01:24:20 DEBUG : "測試_Русский___ě_áñ" version false 2026/02/15 01:24:20 DEBUG : 測試_Русский___ě_áñ: Deleting (id "") 2026/02/15 01:24:20 INFO : 測試_Русский___ě_áñ: Deleted 2026/02/15 01:24:21 INFO : S3 bucket rclone-test-felalot7hiho: Bucket 
"rclone-test-felalot7hiho" deleted 2026/02/15 01:24:21 DEBUG : S3 bucket rclone-test-darajuj4voro: Purge remote 2026/02/15 01:24:21 DEBUG : S3 bucket rclone-test-darajuj4voro: bucket is versioned: false 2026/02/15 01:24:21 DEBUG : Waiting for deletions to finish 2026/02/15 01:24:22 DEBUG : "HELLO, WORLD!" version false 2026/02/15 01:24:22 DEBUG : HELLO, WORLD!: Deleting (id "") 2026/02/15 01:24:22 INFO : HELLO, WORLD!: Deleted 2026/02/15 01:24:23 INFO : S3 bucket rclone-test-darajuj4voro: Bucket "rclone-test-darajuj4voro" deleted 2026/02/15 01:24:23 DEBUG : S3 bucket rclone-test-fapufel0hifa: Purge remote 2026/02/15 01:24:23 DEBUG : S3 bucket rclone-test-fapufel0hifa: bucket is versioned: false 2026/02/15 01:24:23 DEBUG : Waiting for deletions to finish 2026/02/15 01:24:23 DEBUG : "HeLlO, wOrLd!" version false 2026/02/15 01:24:23 DEBUG : HeLlO, wOrLd!: Deleting (id "") 2026/02/15 01:24:24 INFO : HeLlO, wOrLd!: Deleted 2026/02/15 01:24:24 INFO : S3 bucket rclone-test-fapufel0hifa: Bucket "rclone-test-fapufel0hifa" deleted 2026/02/15 01:24:24 DEBUG : S3 bucket rclone-test-yifubex6yene: Purge remote 2026/02/15 01:24:25 DEBUG : S3 bucket rclone-test-yifubex6yene: bucket is versioned: false 2026/02/15 01:24:25 DEBUG : Waiting for deletions to finish 2026/02/15 01:24:25 DEBUG : "HELLO, WORLD!" version false 2026/02/15 01:24:25 DEBUG : HELLO, WORLD!: Deleting (id "") 2026/02/15 01:24:26 INFO : HELLO, WORLD!: Deleted 2026/02/15 01:24:26 INFO : S3 bucket rclone-test-yifubex6yene: Bucket "rclone-test-yifubex6yene" deleted 2026/02/15 01:24:26 DEBUG : S3 bucket rclone-test-fadoyed1feme: Purge remote 2026/02/15 01:24:27 DEBUG : S3 bucket rclone-test-fadoyed1feme: bucket is versioned: false 2026/02/15 01:24:27 DEBUG : Waiting for deletions to finish 2026/02/15 01:24:27 DEBUG : "hello, world!" 
version false 2026/02/15 01:24:27 DEBUG : hello, world!: Deleting (id "") 2026/02/15 01:24:27 INFO : hello, world!: Deleted 2026/02/15 01:24:28 INFO : S3 bucket rclone-test-fadoyed1feme: Bucket "rclone-test-fadoyed1feme" deleted --- PASS: TestApplyTransforms (50.55s) === RUN TestTruncateString --- PASS: TestTruncateString (0.00s) === RUN TestCopyFile run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:24:28 DEBUG : file1: Need to transfer - File not found at Destination 2026/02/15 01:24:29 DEBUG : sub/file2: size = 14 OK 2026/02/15 01:24:29 DEBUG : file1: md5 = 0ef726ce9b1a7692357ff70dd321d595 OK 2026/02/15 01:24:29 INFO : file1: Copied (new) to: sub/file2 2026/02/15 01:24:29 DEBUG : sub/file2: size = 14 OK 2026/02/15 01:24:29 DEBUG : file1: Size and modification time the same (differ by 0s, within tolerance 1ns) 2026/02/15 01:24:29 DEBUG : file1: Unchanged skipping 2026/02/15 01:24:29 DEBUG : S3 bucket rclone-test-daxariv2nomu: don't need to copy/move sub/file2, it is already at target location --- PASS: TestCopyFile (2.39s) === RUN TestCopyLongFile run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" copy_test.go:154: Test only runs on local --- SKIP: TestCopyLongFile (0.29s) === RUN TestCopyFileBackupDir run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:24:31 DEBUG : Creating backend with remote "TestS3GCS:rclone-test-daxariv2nomu/backup" 2026/02/15 01:24:32 DEBUG : dst/file1: size = 14 (Local file system at /tmp/rclone2360491256) 2026/02/15 01:24:32 DEBUG : dst/file1: size = 18 (S3 bucket rclone-test-daxariv2nomu) 2026/02/15 01:24:32 DEBUG : dst/file1: Sizes differ 2026/02/15 01:24:34 DEBUG : dst/file1: size = 18 OK 2026/02/15 01:24:34 DEBUG : dst/file1: md5 = 05164b153084ba910184c26e561a7c18 OK 2026/02/15 01:24:34 INFO : dst/file1: Copied (server-side copy) 2026/02/15 01:24:34 INFO : dst/file1: Deleted 2026/02/15 01:24:34 DEBUG : dst/file1: size = 14 OK 2026/02/15 01:24:34 DEBUG : dst/file1: md5 = 0ef726ce9b1a7692357ff70dd321d595 OK 2026/02/15 01:24:34 INFO : dst/file1: Copied (new) --- PASS: TestCopyFileBackupDir (5.19s) === RUN TestCopyFileCompareDest run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:24:36 DEBUG : Creating backend with remote "TestS3GCS:rclone-test-daxariv2nomu/dst" 2026/02/15 01:24:36 DEBUG : Creating backend with remote "TestS3GCS:rclone-test-daxariv2nomu/CompareDest" 2026/02/15 01:24:37 DEBUG : one: Need to transfer - File not found at Destination 2026/02/15 01:24:38 DEBUG : one: size = 3 OK 2026/02/15 01:24:38 DEBUG : one: md5 = f97c5d29941bfb1b2fdab0874906ab82 OK 2026/02/15 01:24:38 INFO : one: Copied (new) 2026/02/15 01:24:39 DEBUG : one: size = 5 (Local file system at /tmp/rclone2360491256) 2026/02/15 01:24:39 DEBUG : one: size = 3 (S3 bucket rclone-test-daxariv2nomu path dst) 2026/02/15 01:24:39 DEBUG : one: Sizes differ 2026/02/15 01:24:40 DEBUG : one: size = 5 OK 2026/02/15 01:24:40 DEBUG : one: md5 = 07912d142f5d63ee918b34796b5a2432 OK 2026/02/15 01:24:40 INFO : one: Copied (replaced existing) 2026/02/15 01:24:42 DEBUG : one: size = 5 (Local file system at /tmp/rclone2360491256) 2026/02/15 01:24:42 DEBUG : one: size = 3 (S3 bucket rclone-test-daxariv2nomu path dst) 2026/02/15 01:24:42 DEBUG : one: Sizes 
differ 2026/02/15 01:24:42 DEBUG : one: size = 5 OK 2026/02/15 01:24:42 DEBUG : one: Size and modification time the same (differ by 0s, within tolerance 1ns) 2026/02/15 01:24:42 DEBUG : one: Destination found in --compare-dest, skipping 2026/02/15 01:24:44 DEBUG : two: Need to transfer - File not found at Destination 2026/02/15 01:24:44 DEBUG : two: size = 3 OK 2026/02/15 01:24:44 DEBUG : two: Size and modification time the same (differ by 0s, within tolerance 1ns) 2026/02/15 01:24:44 DEBUG : two: Destination found in --compare-dest, skipping 2026/02/15 01:24:45 DEBUG : two: Need to transfer - File not found at Destination 2026/02/15 01:24:45 DEBUG : two: size = 3 OK 2026/02/15 01:24:45 DEBUG : two: Size and modification time the same (differ by 0s, within tolerance 1ns) 2026/02/15 01:24:45 DEBUG : two: Destination found in --compare-dest, skipping 2026/02/15 01:24:46 DEBUG : two: Need to transfer - File not found at Destination 2026/02/15 01:24:46 DEBUG : two: size = 5 (Local file system at /tmp/rclone2360491256) 2026/02/15 01:24:46 DEBUG : two: size = 3 (S3 bucket rclone-test-daxariv2nomu path CompareDest) 2026/02/15 01:24:46 DEBUG : two: Sizes differ 2026/02/15 01:24:47 DEBUG : two: size = 5 OK 2026/02/15 01:24:47 DEBUG : two: md5 = 2379e4ce8c3380e996ab0509f17069ad OK 2026/02/15 01:24:47 INFO : two: Copied (new) --- PASS: TestCopyFileCompareDest (13.00s) === RUN TestCopyFileCopyDest run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:24:49 DEBUG : Creating backend with remote "TestS3GCS:rclone-test-daxariv2nomu/dst" 2026/02/15 01:24:49 DEBUG : Creating backend with remote "TestS3GCS:rclone-test-daxariv2nomu/CopyDest" 2026/02/15 01:24:49 DEBUG : one: Need to transfer - File not found at Destination 2026/02/15 01:24:51 DEBUG : one: size = 3 OK 2026/02/15 01:24:51 DEBUG : one: md5 = f97c5d29941bfb1b2fdab0874906ab82 OK 2026/02/15 01:24:51 INFO : one: Copied (new) 2026/02/15 01:24:52 DEBUG : one: size = 5 (Local file system at /tmp/rclone2360491256) 2026/02/15 01:24:52 DEBUG : one: size = 3 (S3 bucket rclone-test-daxariv2nomu path dst) 2026/02/15 01:24:52 DEBUG : one: Sizes differ 2026/02/15 01:24:52 DEBUG : one: size = 5 OK 2026/02/15 01:24:52 DEBUG : one: md5 = 07912d142f5d63ee918b34796b5a2432 OK 2026/02/15 01:24:52 INFO : one: Copied (replaced existing) 2026/02/15 01:24:54 DEBUG : Creating backend with remote "TestS3GCS:rclone-test-daxariv2nomu/BackupDir" 2026/02/15 01:24:54 DEBUG : Config file has changed externally - reloading 2026/02/15 01:24:54 DEBUG : one: size = 5 (Local file system at /tmp/rclone2360491256) 2026/02/15 01:24:54 DEBUG : one: size = 3 (S3 bucket rclone-test-daxariv2nomu path dst) 2026/02/15 01:24:54 DEBUG : one: Sizes differ 2026/02/15 01:24:54 DEBUG : one: size = 5 OK 2026/02/15 01:24:54 DEBUG : one: Size and modification time the same (differ by 0s, within tolerance 1ns) 2026/02/15 01:24:54 DEBUG : one: size = 5 (Local file system at /tmp/rclone2360491256) 2026/02/15 01:24:54 DEBUG : one: size = 3 (S3 bucket rclone-test-daxariv2nomu path dst) 2026/02/15 01:24:54 DEBUG : one: Sizes differ 2026/02/15 01:24:56 DEBUG : one: size = 3 OK 2026/02/15 01:24:56 DEBUG : one: md5 = f97c5d29941bfb1b2fdab0874906ab82 OK 2026/02/15 01:24:56 INFO : one: Copied (server-side copy) 2026/02/15 01:24:56 INFO : one: Deleted 2026/02/15 01:24:57 DEBUG : one: size = 5 OK 2026/02/15 01:24:57 DEBUG : one: md5 = 07912d142f5d63ee918b34796b5a2432 OK 2026/02/15 01:24:57 INFO : one: Copied (server-side copy) 
2026/02/15 01:24:57 DEBUG : one: Destination found in --copy-dest, using server-side copy 2026/02/15 01:24:58 DEBUG : two: Need to transfer - File not found at Destination 2026/02/15 01:24:58 DEBUG : two: size = 3 OK 2026/02/15 01:24:58 DEBUG : two: Size and modification time the same (differ by 0s, within tolerance 1ns) 2026/02/15 01:24:59 DEBUG : two: size = 3 OK 2026/02/15 01:24:59 DEBUG : two: md5 = b8a9f715dbb64fd5c56e7783c6820a61 OK 2026/02/15 01:24:59 INFO : two: Copied (server-side copy) 2026/02/15 01:24:59 DEBUG : two: Destination found in --copy-dest, using server-side copy 2026/02/15 01:25:00 DEBUG : two: size = 3 OK 2026/02/15 01:25:00 DEBUG : two: Size and modification time the same (differ by 0s, within tolerance 1ns) 2026/02/15 01:25:00 DEBUG : two: Unchanged skipping 2026/02/15 01:25:02 DEBUG : three: Need to transfer - File not found at Destination 2026/02/15 01:25:02 DEBUG : three: size = 7 (Local file system at /tmp/rclone2360491256) 2026/02/15 01:25:02 DEBUG : three: size = 5 (S3 bucket rclone-test-daxariv2nomu path CopyDest) 2026/02/15 01:25:02 DEBUG : three: Sizes differ 2026/02/15 01:25:02 DEBUG : three: Destination not found in --copy-dest 2026/02/15 01:25:03 DEBUG : three: size = 7 OK 2026/02/15 01:25:03 DEBUG : three: md5 = 1bccb9dccb3e9f6a3f9d2a8bdb54b7f5 OK 2026/02/15 01:25:03 INFO : three: Copied (new) --- PASS: TestCopyFileCopyDest (17.08s) === RUN TestCopyInplace run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" copy_test.go:371: Partial uploads not supported --- SKIP: TestCopyInplace (0.26s) === RUN TestCopyLongFileName run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" copy_test.go:404: Partial uploads not supported --- SKIP: TestCopyLongFileName (0.26s) === RUN TestCopyLongFileNameCollision run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" copy_test.go:437: Partial uploads not supported --- SKIP: TestCopyLongFileNameCollision (0.27s) === RUN TestCopyFileMaxTransfer run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:25:07 DEBUG : TestCopyFileMaxTransfer/file1: Need to transfer - File not found at Destination 2026/02/15 01:25:07 DEBUG : TestCopyFileMaxTransfer/file1: size = 14 OK 2026/02/15 01:25:07 DEBUG : TestCopyFileMaxTransfer/file1: md5 = 0ef726ce9b1a7692357ff70dd321d595 OK 2026/02/15 01:25:07 INFO : TestCopyFileMaxTransfer/file1: Copied (new) 2026/02/15 01:25:08 DEBUG : TestCopyFileMaxTransfer/file2: Need to transfer - File not found at Destination 2026/02/15 01:25:08 ERROR : TestCopyFileMaxTransfer/file2: Failed to copy: operation error S3: PutObject, exceeded maximum number of attempts, 1, https response error StatusCode: 0, RequestID: , HostID: , request send failed, Put "https://rclone-test-daxariv2nomu.storage.googleapis.com/TestCopyFileMaxTransfer/file2": max transfer limit reached as set by --max-transfer 2026/02/15 01:25:08 DEBUG : TestCopyFileMaxTransfer/file3: Need to transfer - File not found at Destination 2026/02/15 01:25:09 DEBUG : TestCopyFileMaxTransfer/file4: Need to transfer - File not found at Destination 2026/02/15 01:25:09 DEBUG : TestCopyFileMaxTransfer/file4: size = 2062 OK 2026/02/15 01:25:09 DEBUG : TestCopyFileMaxTransfer/file4: md5 = adafb4b5aed2e61c0ee1e61a6f800d42 OK 2026/02/15 01:25:09 INFO : 
TestCopyFileMaxTransfer/file4: Copied (new) --- PASS: TestCopyFileMaxTransfer (3.72s) === RUN TestDeduplicateInteractive run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" dedupe_test.go:24: Can't test deduplicate - no duplicate files possible --- SKIP: TestDeduplicateInteractive (0.26s) === RUN TestDeduplicateSkip run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" dedupe_test.go:24: Can't test deduplicate - no duplicate files possible --- SKIP: TestDeduplicateSkip (0.27s) === RUN TestDeduplicateSizeOnly run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" dedupe_test.go:24: Can't test deduplicate - no duplicate files possible --- SKIP: TestDeduplicateSizeOnly (0.29s) === RUN TestDeduplicateFirst run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" dedupe_test.go:24: Can't test deduplicate - no duplicate files possible --- SKIP: TestDeduplicateFirst (0.28s) === RUN TestDeduplicateNewest run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" dedupe_test.go:24: Can't test deduplicate - no duplicate files possible --- SKIP: TestDeduplicateNewest (0.27s) === RUN TestDeduplicateNewestByHash run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:25:14 INFO : S3 bucket rclone-test-daxariv2nomu: Looking for duplicate md5 hashes using newest mode. 2026/02/15 01:25:14 NOTICE: cfcd6f692f5369684692fb5044a3823d: Found 3 files with duplicate md5 hashes 2026/02/15 01:25:15 INFO : one: Deleted 2026/02/15 01:25:15 INFO : also/one: Deleted 2026/02/15 01:25:15 NOTICE: cfcd6f692f5369684692fb5044a3823d: Deleted 2 extra copies --- PASS: TestDeduplicateNewestByHash (4.45s) === RUN TestDeduplicateOldest run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" dedupe_test.go:24: Can't test deduplicate - no duplicate files possible --- SKIP: TestDeduplicateOldest (0.27s) === RUN TestDeduplicateLargest run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" dedupe_test.go:24: Can't test deduplicate - no duplicate files possible --- SKIP: TestDeduplicateLargest (0.26s) === RUN TestDeduplicateSmallest run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" dedupe_test.go:24: Can't test deduplicate - no duplicate files possible --- SKIP: TestDeduplicateSmallest (0.26s) === RUN TestDeduplicateRename run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" dedupe_test.go:24: Can't test deduplicate - no duplicate files possible --- SKIP: TestDeduplicateRename (0.26s) === RUN TestMergeDirs run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" dedupe_test.go:256: Can't merge directories --- SKIP: TestMergeDirs (0.26s) === RUN TestListDirSorted run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:25:21 DEBUG : a.txt: 
Excluded (Size Filter) 2026/02/15 01:25:21 DEBUG : a.txt: Excluded 2026/02/15 01:25:22 DEBUG : sub dir/hello world: Excluded (Size Filter) 2026/02/15 01:25:22 DEBUG : sub dir/hello world: Excluded 2026/02/15 01:25:22 DEBUG : sub dir/hello world2: Excluded (Size Filter) 2026/02/15 01:25:22 DEBUG : sub dir/hello world2: Excluded 2026/02/15 01:25:22 DEBUG : sub dir/ignore dir: Excluded 2026/02/15 01:25:22 DEBUG : sub dir/hello world: Excluded (Size Filter) 2026/02/15 01:25:22 DEBUG : sub dir/hello world: Excluded 2026/02/15 01:25:22 DEBUG : sub dir/hello world2: Excluded (Size Filter) 2026/02/15 01:25:22 DEBUG : sub dir/hello world2: Excluded 2026/02/15 01:25:22 DEBUG : sub dir/ignore dir: Excluded --- PASS: TestListDirSorted (7.27s) === RUN TestListDirSortedFn run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:25:29 DEBUG : a.txt: Excluded (Size Filter) 2026/02/15 01:25:29 DEBUG : a.txt: Excluded 2026/02/15 01:25:29 DEBUG : sub dir/hello world: Excluded (Size Filter) 2026/02/15 01:25:29 DEBUG : sub dir/hello world: Excluded 2026/02/15 01:25:29 DEBUG : sub dir/hello world2: Excluded (Size Filter) 2026/02/15 01:25:29 DEBUG : sub dir/hello world2: Excluded 2026/02/15 01:25:29 DEBUG : sub dir/ignore dir: Excluded 2026/02/15 01:25:29 DEBUG : sub dir/hello world: Excluded (Size Filter) 2026/02/15 01:25:29 DEBUG : sub dir/hello world: Excluded 2026/02/15 01:25:29 DEBUG : sub dir/hello world2: Excluded (Size Filter) 2026/02/15 01:25:29 DEBUG : sub dir/hello world2: Excluded 2026/02/15 01:25:29 DEBUG : sub dir/ignore dir: Excluded --- PASS: TestListDirSortedFn (7.26s) === RUN TestListJSON run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" === RUN TestListJSON/Default === RUN TestListJSON/FilesOnly === RUN TestListJSON/DirsOnly === RUN TestListJSON/Recurse === RUN TestListJSON/SubDir === RUN TestListJSON/NoModTime === RUN TestListJSON/NoMimeType === RUN TestListJSON/ShowHash === RUN TestListJSON/HashTypes === RUN TestListJSON/Metadata --- PASS: TestListJSON (4.55s) --- PASS: TestListJSON/Default (0.25s) --- PASS: TestListJSON/FilesOnly (0.25s) --- PASS: TestListJSON/DirsOnly (0.13s) --- PASS: TestListJSON/Recurse (0.39s) --- PASS: TestListJSON/SubDir (0.26s) --- PASS: TestListJSON/NoModTime (0.26s) --- PASS: TestListJSON/NoMimeType (0.25s) --- PASS: TestListJSON/ShowHash (0.26s) --- PASS: TestListJSON/HashTypes (0.25s) --- PASS: TestListJSON/Metadata (0.26s) === RUN TestStatJSON run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" === RUN TestStatJSON/Root === RUN TestStatJSON/RootFilesOnly === RUN TestStatJSON/RootDirsOnly === RUN TestStatJSON/Dir === RUN TestStatJSON/DirWithTrailingSlash === RUN TestStatJSON/File === RUN TestStatJSON/NotFound === RUN TestStatJSON/DirFilesOnly === RUN TestStatJSON/FileFilesOnly === RUN TestStatJSON/NotFoundFilesOnly === RUN TestStatJSON/DirDirsOnly === RUN TestStatJSON/FileDirsOnly === RUN TestStatJSON/NotFoundDirsOnly === RUN TestStatJSON/RootNotFound 2026/02/15 01:25:40 DEBUG : Creating backend with remote "TestS3GCS:rclone-test-daxariv2nomu/notfound" --- PASS: TestStatJSON (4.54s) --- PASS: TestStatJSON/Root (0.13s) --- PASS: TestStatJSON/RootFilesOnly (0.00s) --- PASS: TestStatJSON/RootDirsOnly (0.13s) --- PASS: TestStatJSON/Dir (0.30s) --- PASS: TestStatJSON/DirWithTrailingSlash (0.13s) --- PASS: TestStatJSON/File 
(0.12s) --- PASS: TestStatJSON/NotFound (0.36s) --- PASS: TestStatJSON/DirFilesOnly (0.25s) --- PASS: TestStatJSON/FileFilesOnly (0.12s) --- PASS: TestStatJSON/NotFoundFilesOnly (0.28s) --- PASS: TestStatJSON/DirDirsOnly (0.13s) --- PASS: TestStatJSON/FileDirsOnly (0.14s) --- PASS: TestStatJSON/NotFoundDirsOnly (0.13s) --- PASS: TestStatJSON/RootNotFound (0.34s) === RUN TestMkdir run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:25:41 INFO : S3 bucket rclone-test-daxariv2nomu: Making directory 2026/02/15 01:25:41 INFO : S3 bucket rclone-test-daxariv2nomu: Making directory --- PASS: TestMkdir (0.39s) === RUN TestLsd run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" --- PASS: TestLsd (1.28s) === RUN TestLs run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" --- PASS: TestLs (2.08s) === RUN TestLsWithFilesFrom run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:25:46 DEBUG : empty space: Excluded (FilesFrom Filter) --- PASS: TestLsWithFilesFrom (2.31s) === RUN TestLsLong run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" --- PASS: TestLsLong (2.32s) === RUN TestHashSums run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" === RUN TestHashSums/Md5 === RUN TestHashSums/Md5Download --- PASS: TestHashSums (2.36s) --- PASS: TestHashSums/Md5 (0.13s) --- PASS: TestHashSums/Md5Download (0.27s) === RUN TestHashSumsWithErrors 2026/02/15 01:25:52 DEBUG : Creating backend with remote ":memory:" 2026/02/15 01:25:52 ERROR : file1: hash unsupported: hash type not supported --- PASS: TestHashSumsWithErrors (0.00s) === RUN TestHashStream 2026/02/15 01:25:52 DEBUG : Creating md5 hash of 0 bytes read from input stream 2026/02/15 01:25:52 DEBUG : Creating md5 hash of 0 bytes read from input stream 2026/02/15 01:25:52 DEBUG : Creating sha1 hash of 0 bytes read from input stream 2026/02/15 01:25:52 DEBUG : Creating sha1 hash of 0 bytes read from input stream 2026/02/15 01:25:52 DEBUG : Creating md5 hash of 12 bytes read from input stream 2026/02/15 01:25:52 DEBUG : Creating md5 hash of 12 bytes read from input stream 2026/02/15 01:25:52 DEBUG : Creating sha1 hash of 12 bytes read from input stream 2026/02/15 01:25:52 DEBUG : Creating sha1 hash of 12 bytes read from input stream --- PASS: TestHashStream (0.00s) === RUN TestSuffixName --- PASS: TestSuffixName (0.00s) === RUN TestCount run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" --- PASS: TestCount (2.88s) === RUN TestDelete run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:25:56 DEBUG : Waiting for deletions to finish 2026/02/15 01:25:56 DEBUG : large: Excluded (Size Filter) 2026/02/15 01:25:57 INFO : medium: Deleted 2026/02/15 01:25:57 INFO : small: Deleted --- PASS: TestDelete (2.82s) === RUN TestMaxDelete run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:25:59 DEBUG : Waiting for deletions to finish 2026/02/15 
01:25:59 ERROR : large: Got fatal error on delete: --max-delete threshold reached 2026/02/15 01:25:59 INFO : small: Deleted 2026/02/15 01:25:59 INFO : medium: Deleted --- PASS: TestMaxDelete (3.01s) === RUN TestMaxDeleteSizeLargeFile run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:26:02 DEBUG : Waiting for deletions to finish 2026/02/15 01:26:02 ERROR : large: Got fatal error on delete: --max-delete-size threshold reached 2026/02/15 01:26:02 INFO : medium: Deleted 2026/02/15 01:26:02 INFO : small: Deleted --- PASS: TestMaxDeleteSizeLargeFile (3.13s) === RUN TestMaxDeleteSize run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:26:05 DEBUG : Waiting for deletions to finish 2026/02/15 01:26:05 ERROR : large: Got fatal error on delete: --max-delete-size threshold reached 2026/02/15 01:26:06 INFO : medium: Deleted 2026/02/15 01:26:06 INFO : small: Deleted --- PASS: TestMaxDeleteSize (3.08s) === RUN TestReadFile run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" --- PASS: TestReadFile (1.69s) === RUN TestRetry 2026/02/15 01:26:08 DEBUG : Received error: Wrapped EOF is retriable: EOF - low level retry 1/5 2026/02/15 01:26:08 DEBUG : Received error: Wrapped EOF is retriable: EOF - low level retry 2/5 2026/02/15 01:26:08 DEBUG : Sleeping for 10ms (as indicated by the server) to obey Retry-After error: BANG: trying again in 10ms 2026/02/15 01:26:08 DEBUG : Sleeping for 10ms (as indicated by the server) to obey Retry-After error: BANG: trying again in 10ms 2026/02/15 01:26:08 DEBUG : Sleeping for 10ms (as indicated by the server) to obey Retry-After error: BANG: trying again in 10ms 2026/02/15 01:26:08 DEBUG : Sleeping for 10ms (as indicated by the server) to obey Retry-After error: BANG: trying again in 10ms 2026/02/15 01:26:08 DEBUG : Sleeping for 10ms (as indicated by the server) to obey Retry-After error: BANG: trying again in 10ms --- PASS: TestRetry (0.05s) === RUN TestCat run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" --- PASS: TestCat (4.06s) === RUN TestPurge 2026/02/15 01:26:12 DEBUG : Creating backend with remote "TestS3GCS:rclone-test-facilup6yode" 2026/02/15 01:26:12 DEBUG : Using config file from "/home/rclone/.rclone.conf" 2026/02/15 01:26:12 DEBUG : Creating backend with remote "/tmp/rclone997877587" run.go:185: Remote "S3 bucket rclone-test-facilup6yode", Local "Local file system at /tmp/rclone997877587", Modify Window "1ns" 2026/02/15 01:26:14 INFO : S3 bucket rclone-test-facilup6yode: Bucket "rclone-test-facilup6yode" created with ACL "" 2026/02/15 01:26:14 INFO : A2: Making directory 2026/02/15 01:26:14 INFO : A1/B2: Making directory 2026/02/15 01:26:14 INFO : A1/B2/C2: Making directory 2026/02/15 01:26:14 INFO : A1/B1/C3: Making directory 2026/02/15 01:26:14 INFO : A3: Making directory 2026/02/15 01:26:14 INFO : A3/B3: Making directory 2026/02/15 01:26:14 INFO : A3/B3/C4: Making directory fstest.go:250: Filtering empty directory "A2" fstest.go:250: Filtering empty directory "A1/B2" fstest.go:250: Filtering empty directory "A1/B2/C2" fstest.go:250: Filtering empty directory "A1/B1/C3" fstest.go:250: Filtering empty directory "A3" fstest.go:250: Filtering empty directory "A3/B3" fstest.go:250: Filtering empty directory "A3/B3/C4" 2026/02/15 
01:26:17 DEBUG : S3 bucket rclone-test-facilup6yode: bucket is versioned: false 2026/02/15 01:26:17 DEBUG : Waiting for deletions to finish 2026/02/15 01:26:17 DEBUG : "A1/B1/C1/one" version false 2026/02/15 01:26:17 DEBUG : A1/B1/C1/one: Deleting (id "") 2026/02/15 01:26:18 INFO : A1/B1/C1/one: Deleted fstest.go:250: Filtering empty directory "A2" fstest.go:250: Filtering empty directory "A1/B2" fstest.go:250: Filtering empty directory "A1/B2/C2" fstest.go:250: Filtering empty directory "A3" fstest.go:250: Filtering empty directory "A3/B3" fstest.go:250: Filtering empty directory "A3/B3/C4" 2026/02/15 01:26:18 DEBUG : Waiting for deletions to finish 2026/02/15 01:26:19 DEBUG : "A1/two" version false 2026/02/15 01:26:19 DEBUG : A1/two: Deleting (id "") 2026/02/15 01:26:19 INFO : A1/two: Deleted 2026/02/15 01:26:20 INFO : S3 bucket rclone-test-facilup6yode: Bucket "rclone-test-facilup6yode" deleted 2026/02/15 01:26:20 DEBUG : S3 bucket rclone-test-facilup6yode: Purge remote 2026/02/15 01:26:20 DEBUG : Waiting for deletions to finish 2026/02/15 01:26:20 NOTICE: purge failed: directory not found --- PASS: TestPurge (7.96s) === RUN TestRmdirsNoLeaveRoot run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:26:21 INFO : A2: Making directory 2026/02/15 01:26:21 INFO : A1/B2: Making directory 2026/02/15 01:26:21 INFO : A1/B2/C2: Making directory 2026/02/15 01:26:21 INFO : A1/B1/C3: Making directory 2026/02/15 01:26:21 INFO : A3: Making directory 2026/02/15 01:26:21 INFO : A3/B3: Making directory 2026/02/15 01:26:21 INFO : A3/B3/C4: Making directory fstest.go:250: Filtering empty directory "A2" fstest.go:250: Filtering empty directory "A1/B2" fstest.go:250: Filtering empty directory "A1/B2/C2" fstest.go:250: Filtering empty directory "A1/B1/C3" fstest.go:250: Filtering empty directory "A3" fstest.go:250: Filtering empty directory "A3/B3" fstest.go:250: Filtering empty directory "A3/B3/C4" 2026/02/15 01:26:22 DEBUG : removing 1 level 3 directories 2026/02/15 01:26:22 INFO : A3/B3/C4: Removing directory fstest.go:250: Filtering empty directory "A2" fstest.go:250: Filtering empty directory "A1/B2" fstest.go:250: Filtering empty directory "A1/B2/C2" fstest.go:250: Filtering empty directory "A1/B1/C3" fstest.go:250: Filtering empty directory "A3" fstest.go:250: Filtering empty directory "A3/B3" 2026/02/15 01:26:24 DEBUG : removing 1 level 0 directories 2026/02/15 01:26:24 INFO : S3 bucket rclone-test-daxariv2nomu: Removing directory 2026/02/15 01:26:24 INFO : S3 bucket rclone-test-daxariv2nomu: Bucket "rclone-test-daxariv2nomu" deleted --- PASS: TestRmdirsNoLeaveRoot (3.97s) === RUN TestRmdirsLeaveRoot run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:26:25 INFO : S3 bucket rclone-test-daxariv2nomu: Bucket "rclone-test-daxariv2nomu" created with ACL "" 2026/02/15 01:26:25 INFO : A1: Making directory 2026/02/15 01:26:25 INFO : A1/B1: Making directory 2026/02/15 01:26:25 INFO : A1/B1/C1: Making directory fstest.go:250: Filtering empty directory "A1" fstest.go:250: Filtering empty directory "A1/B1" fstest.go:250: Filtering empty directory "A1/B1/C1" fstest.go:250: Filtering empty directory "A1" --- PASS: TestRmdirsLeaveRoot (2.36s) === RUN TestRmdirsWithFilter run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:26:27 INFO : A1: Making 
directory
2026/02/15 01:26:27 INFO : A1/B1: Making directory
2026/02/15 01:26:27 INFO : A1/B1/C1: Making directory
fstest.go:250: Filtering empty directory "A1"
fstest.go:250: Filtering empty directory "A1/B1"
fstest.go:250: Filtering empty directory "A1/B1/C1"
fstest.go:250: Filtering empty directory "A1"
--- PASS: TestRmdirsWithFilter (1.20s)
=== RUN TestCopyURL
run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns"
operations_test.go:850: Error Trace: /home/rclone/go/src/github.com/rclone/rclone/fs/operations/operations_test.go:850
    Error: An error is expected but got nil.
    Test: TestCopyURL
--- FAIL: TestCopyURL (2.69s)
=== RUN TestCopyURLToWriter
--- PASS: TestCopyURLToWriter (0.00s)
=== RUN TestMoveFile
run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns"
2026/02/15 01:26:31 DEBUG : file1: Need to transfer - File not found at Destination
2026/02/15 01:26:31 DEBUG : sub/file2: size = 14 OK
2026/02/15 01:26:31 DEBUG : file1: md5 = 0ef726ce9b1a7692357ff70dd321d595 OK
2026/02/15 01:26:31 INFO : file1: Copied (new) to: sub/file2
2026/02/15 01:26:31 INFO : file1: Deleted
2026/02/15 01:26:32 DEBUG : sub/file2: size = 14 OK
2026/02/15 01:26:32 DEBUG : file1: Size and modification time the same (differ by 0s, within tolerance 1ns)
2026/02/15 01:26:32 DEBUG : file1: Unchanged skipping
2026/02/15 01:26:32 INFO : file1: Deleted
2026/02/15 01:26:32 DEBUG : S3 bucket rclone-test-daxariv2nomu: don't need to copy/move sub/file2, it is already at target location
--- PASS: TestMoveFile (3.00s)
=== RUN TestMoveFileWithIgnoreExisting
run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns"
2026/02/15 01:26:34 DEBUG : file1: Need to transfer - File not found at Destination
2026/02/15 01:26:34 DEBUG : file1: size = 14 OK
2026/02/15 01:26:34 DEBUG : file1: md5 = 0ef726ce9b1a7692357ff70dd321d595 OK
2026/02/15 01:26:34 INFO : file1: Copied (new)
2026/02/15 01:26:34 INFO : file1: Deleted
2026/02/15 01:26:35 DEBUG : file1: Destination exists, skipping
2026/02/15 01:26:35 DEBUG : file1: Not removing source file as destination file exists and --ignore-existing is set
--- PASS: TestMoveFileWithIgnoreExisting (2.69s)
=== RUN TestCaseInsensitiveMoveFile
run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns"
--- PASS: TestCaseInsensitiveMoveFile (0.61s)
=== RUN TestCaseInsensitiveMoveFileDryRun
run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns"
--- PASS: TestCaseInsensitiveMoveFileDryRun (0.49s)
=== RUN TestMoveFileBackupDir
run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns"
2026/02/15 01:26:38 DEBUG : Creating backend with remote "TestS3GCS:rclone-test-daxariv2nomu/backup"
2026/02/15 01:26:39 DEBUG : dst/file1: size = 14 (Local file system at /tmp/rclone2360491256)
2026/02/15 01:26:39 DEBUG : dst/file1: size = 18 (S3 bucket rclone-test-daxariv2nomu)
2026/02/15 01:26:39 DEBUG : dst/file1: Sizes differ
2026/02/15 01:26:41 DEBUG : dst/file1: size = 18 OK
2026/02/15 01:26:41 DEBUG : dst/file1: md5 = 05164b153084ba910184c26e561a7c18 OK
2026/02/15 01:26:41 INFO : dst/file1: Copied (server-side copy)
2026/02/15 01:26:41 INFO : dst/file1: Deleted
2026/02/15 01:26:42 DEBUG :
dst/file1: size = 14 OK 2026/02/15 01:26:42 DEBUG : dst/file1: md5 = 0ef726ce9b1a7692357ff70dd321d595 OK 2026/02/15 01:26:42 INFO : dst/file1: Copied (new) 2026/02/15 01:26:42 INFO : dst/file1: Deleted --- PASS: TestMoveFileBackupDir (6.01s) === RUN TestSameConfig --- PASS: TestSameConfig (0.00s) === RUN TestSame --- PASS: TestSame (0.00s) === RUN TestOverlappingFilterCheckWithoutFilter --- PASS: TestOverlappingFilterCheckWithoutFilter (0.00s) === RUN TestOverlappingFilterCheckWithFilter --- PASS: TestOverlappingFilterCheckWithFilter (0.00s) === RUN TestListFormat --- PASS: TestListFormat (0.00s) === RUN TestDirMove run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:26:46 INFO : A1/B2: Making directory 2026/02/15 01:26:46 INFO : A1/B1/C3: Making directory fstest.go:250: Filtering empty directory "A1/B2" fstest.go:250: Filtering empty directory "A1/B1/C3" 2026/02/15 01:26:49 DEBUG : A2/two: size = 3 OK 2026/02/15 01:26:49 DEBUG : A1/two: md5 = b8a9f715dbb64fd5c56e7783c6820a61 OK 2026/02/15 01:26:49 INFO : A1/two: Copied (server-side copy) to: A2/two 2026/02/15 01:26:49 DEBUG : A2/B1/three: size = 5 OK 2026/02/15 01:26:49 DEBUG : A1/B1/three: md5 = 35d6d33467aae9a2e3dccb4b6b027878 OK 2026/02/15 01:26:49 INFO : A1/B1/three: Copied (server-side copy) to: A2/B1/three 2026/02/15 01:26:49 DEBUG : A2/B1/C1/four: size = 4 OK 2026/02/15 01:26:49 DEBUG : A2/B1/C2/five: size = 4 OK 2026/02/15 01:26:49 DEBUG : A1/B1/C1/four: md5 = 8cbad96aced40b3838dd9f07f6ef5772 OK 2026/02/15 01:26:49 DEBUG : A1/B1/C2/five: md5 = 30056e1cab7a61d256fc8edd970d14f5 OK 2026/02/15 01:26:49 INFO : A1/B1/C1/four: Copied (server-side copy) to: A2/B1/C1/four 2026/02/15 01:26:49 INFO : A1/B1/C2/five: Copied (server-side copy) to: A2/B1/C2/five 2026/02/15 01:26:49 DEBUG : A2/one: size = 3 OK 2026/02/15 01:26:49 DEBUG : A1/one: md5 = f97c5d29941bfb1b2fdab0874906ab82 OK 2026/02/15 01:26:49 INFO : A1/one: Copied (server-side copy) to: A2/one 2026/02/15 01:26:49 INFO : A1/one: Deleted 2026/02/15 01:26:49 INFO : A1/B1/C2/five: Deleted 2026/02/15 01:26:49 INFO : A1/two: Deleted 2026/02/15 01:26:49 INFO : A1/B1/C1/four: Deleted 2026/02/15 01:26:49 INFO : A1/B1/three: Deleted fstest.go:250: Filtering empty directory "A2/B2" fstest.go:250: Filtering empty directory "A2/B1/C3" 2026/02/15 01:26:51 DEBUG : A3/B1/C2/five: size = 4 OK 2026/02/15 01:26:51 DEBUG : A2/B1/C2/five: md5 = 30056e1cab7a61d256fc8edd970d14f5 OK 2026/02/15 01:26:51 INFO : A2/B1/C2/five: Copied (server-side copy) to: A3/B1/C2/five 2026/02/15 01:26:51 DEBUG : A3/one: size = 3 OK 2026/02/15 01:26:51 DEBUG : A2/one: md5 = f97c5d29941bfb1b2fdab0874906ab82 OK 2026/02/15 01:26:51 INFO : A2/one: Copied (server-side copy) to: A3/one 2026/02/15 01:26:51 DEBUG : A3/two: size = 3 OK 2026/02/15 01:26:51 DEBUG : A2/two: md5 = b8a9f715dbb64fd5c56e7783c6820a61 OK 2026/02/15 01:26:51 INFO : A2/two: Copied (server-side copy) to: A3/two 2026/02/15 01:26:51 DEBUG : A3/B1/three: size = 5 OK 2026/02/15 01:26:51 DEBUG : A2/B1/three: md5 = 35d6d33467aae9a2e3dccb4b6b027878 OK 2026/02/15 01:26:51 INFO : A2/B1/three: Copied (server-side copy) to: A3/B1/three 2026/02/15 01:26:51 DEBUG : A3/B1/C1/four: size = 4 OK 2026/02/15 01:26:51 DEBUG : A2/B1/C1/four: md5 = 8cbad96aced40b3838dd9f07f6ef5772 OK 2026/02/15 01:26:51 INFO : A2/B1/C1/four: Copied (server-side copy) to: A3/B1/C1/four 2026/02/15 01:26:52 INFO : A2/two: Deleted 2026/02/15 01:26:52 INFO : A2/B1/three: Deleted 2026/02/15 01:26:52 INFO : A2/one: 
Deleted 2026/02/15 01:26:52 INFO : A2/B1/C2/five: Deleted 2026/02/15 01:26:52 INFO : A2/B1/C1/four: Deleted fstest.go:250: Filtering empty directory "A3/B2" fstest.go:250: Filtering empty directory "A3/B1/C3" 2026/02/15 01:26:53 INFO : S3 bucket rclone-test-daxariv2nomu: Can't DirMove - falling back to file moves: can't move directory - incompatible remotes 2026/02/15 01:26:54 DEBUG : A4/two: size = 3 OK 2026/02/15 01:26:54 DEBUG : A4/B1/C2/five: size = 4 OK 2026/02/15 01:26:54 DEBUG : A3/B1/C2/five: md5 = 30056e1cab7a61d256fc8edd970d14f5 OK 2026/02/15 01:26:54 INFO : A3/B1/C2/five: Copied (server-side copy) to: A4/B1/C2/five 2026/02/15 01:26:54 DEBUG : A3/two: md5 = b8a9f715dbb64fd5c56e7783c6820a61 OK 2026/02/15 01:26:54 INFO : A3/two: Copied (server-side copy) to: A4/two 2026/02/15 01:26:54 DEBUG : A4/one: size = 3 OK 2026/02/15 01:26:54 DEBUG : A3/one: md5 = f97c5d29941bfb1b2fdab0874906ab82 OK 2026/02/15 01:26:54 INFO : A3/one: Copied (server-side copy) to: A4/one 2026/02/15 01:26:54 DEBUG : A4/B1/three: size = 5 OK 2026/02/15 01:26:54 DEBUG : A3/B1/three: md5 = 35d6d33467aae9a2e3dccb4b6b027878 OK 2026/02/15 01:26:54 INFO : A3/B1/three: Copied (server-side copy) to: A4/B1/three 2026/02/15 01:26:54 DEBUG : A4/B1/C1/four: size = 4 OK 2026/02/15 01:26:54 DEBUG : A3/B1/C1/four: md5 = 8cbad96aced40b3838dd9f07f6ef5772 OK 2026/02/15 01:26:54 INFO : A3/B1/C1/four: Copied (server-side copy) to: A4/B1/C1/four 2026/02/15 01:26:54 INFO : A3/B1/three: Deleted 2026/02/15 01:26:54 INFO : A3/two: Deleted 2026/02/15 01:26:54 INFO : A3/B1/C2/five: Deleted 2026/02/15 01:26:54 INFO : A3/B1/C1/four: Deleted 2026/02/15 01:26:54 INFO : A3/one: Deleted fstest.go:250: Filtering empty directory "A4/B2" fstest.go:250: Filtering empty directory "A4/B1/C3" --- PASS: TestDirMove (14.20s) === RUN TestGetFsInfo run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" --- PASS: TestGetFsInfo (0.50s) === RUN TestRcat === RUN TestRcat/withChecksum=false,ignoreChecksum=false run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:26:58 DEBUG : S3 bucket rclone-test-daxariv2nomu: File to upload is small (34 bytes), uploading instead of streaming 2026/02/15 01:26:59 DEBUG : no_checksum_small_file_from_pipe: size = 34 OK 2026/02/15 01:26:59 DEBUG : no_checksum_small_file_from_pipe: md5 = 4c762bc6ec18f21b23fcae426c7863b5 OK 2026/02/15 01:26:59 DEBUG : no_checksum_small_file_from_pipe: Size and md5 of src and dst objects identical 2026/02/15 01:26:59 NOTICE: S3 bucket rclone-test-daxariv2nomu: Streaming uploads using chunk size 5Mi will have maximum file size of 48.828Gi 2026/02/15 01:26:59 DEBUG : no_checksum_big_file_from_pipe: open chunk writer: started multipart upload: ABPnzm5IoEGU44NdOegvHk3sLjoMrPlgwJWnR0ZJWSayasRsHsdv7GIHh1IaKkr_BP4DpdM 2026/02/15 01:26:59 DEBUG : no_checksum_big_file_from_pipe: multipart upload: starting chunk 0 size 100.001Ki offset 0/off 2026/02/15 01:27:00 DEBUG : no_checksum_big_file_from_pipe: multipart upload wrote chunk 1 with 102401 bytes and etag "fffc7956ba9a7b58a63c01b6ce1ddc45" 2026/02/15 01:27:00 DEBUG : no_checksum_big_file_from_pipe: multipart upload "ABPnzm5IoEGU44NdOegvHk3sLjoMrPlgwJWnR0ZJWSayasRsHsdv7GIHh1IaKkr_BP4DpdM" finished 2026/02/15 01:27:01 DEBUG : no_checksum_big_file_from_pipe: Multipart upload Etag: bbeee02c608c6e85736b35e536fba719-1 OK 2026/02/15 01:27:01 DEBUG : no_checksum_big_file_from_pipe: size = 102401 OK 
2026/02/15 01:27:01 DEBUG : no_checksum_big_file_from_pipe: Dst hash empty - aborting Src hash check 2026/02/15 01:27:01 DEBUG : no_checksum_big_file_from_pipe: Size of src and dst objects identical === RUN TestRcat/withChecksum=true,ignoreChecksum=false run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:27:02 DEBUG : S3 bucket rclone-test-daxariv2nomu: File to upload is small (34 bytes), uploading instead of streaming 2026/02/15 01:27:03 DEBUG : with_checksum_small_file_from_pipe: size = 34 OK 2026/02/15 01:27:03 DEBUG : with_checksum_small_file_from_pipe: md5 = 4c762bc6ec18f21b23fcae426c7863b5 OK 2026/02/15 01:27:03 DEBUG : with_checksum_small_file_from_pipe: Size and md5 of src and dst objects identical 2026/02/15 01:27:03 DEBUG : with_checksum_big_file_from_pipe: open chunk writer: started multipart upload: ABPnzm6EbeN3oK56T7pKLLLiiOGCn5yWKWq8HR-_MJSa4x-t939z66cXNr2-BX423Ba7_Ho 2026/02/15 01:27:03 DEBUG : with_checksum_big_file_from_pipe: multipart upload: starting chunk 0 size 100.001Ki offset 0/off 2026/02/15 01:27:04 DEBUG : with_checksum_big_file_from_pipe: multipart upload wrote chunk 1 with 102401 bytes and etag "fffc7956ba9a7b58a63c01b6ce1ddc45" 2026/02/15 01:27:05 DEBUG : with_checksum_big_file_from_pipe: multipart upload "ABPnzm6EbeN3oK56T7pKLLLiiOGCn5yWKWq8HR-_MJSa4x-t939z66cXNr2-BX423Ba7_Ho" finished 2026/02/15 01:27:05 DEBUG : with_checksum_big_file_from_pipe: Multipart upload Etag: bbeee02c608c6e85736b35e536fba719-1 OK 2026/02/15 01:27:05 DEBUG : with_checksum_big_file_from_pipe: size = 102401 OK 2026/02/15 01:27:05 DEBUG : with_checksum_big_file_from_pipe: Dst hash empty - aborting Src hash check 2026/02/15 01:27:05 DEBUG : with_checksum_big_file_from_pipe: Size of src and dst objects identical === RUN TestRcat/withChecksum=false,ignoreChecksum=true run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:27:07 DEBUG : S3 bucket rclone-test-daxariv2nomu: File to upload is small (34 bytes), uploading instead of streaming 2026/02/15 01:27:07 DEBUG : ignore_checksum_small_file_from_pipe: size = 34 OK 2026/02/15 01:27:07 DEBUG : ignore_checksum_small_file_from_pipe: Size and modification time the same (differ by 0s, within tolerance 1ns) 2026/02/15 01:27:07 DEBUG : ignore_checksum_big_file_from_pipe: open chunk writer: started multipart upload: ABPnzm6QYDiUs_JqyDMpuR8smgptSt9ulPXLQppwWMncaodlWn8pG9kpZTqw7T_CoJHaBq0 2026/02/15 01:27:07 DEBUG : ignore_checksum_big_file_from_pipe: multipart upload: starting chunk 0 size 100.001Ki offset 0/off 2026/02/15 01:27:08 DEBUG : ignore_checksum_big_file_from_pipe: multipart upload wrote chunk 1 with 102401 bytes and etag "fffc7956ba9a7b58a63c01b6ce1ddc45" 2026/02/15 01:27:09 DEBUG : ignore_checksum_big_file_from_pipe: multipart upload "ABPnzm6QYDiUs_JqyDMpuR8smgptSt9ulPXLQppwWMncaodlWn8pG9kpZTqw7T_CoJHaBq0" finished 2026/02/15 01:27:09 DEBUG : ignore_checksum_big_file_from_pipe: Multipart upload Etag: bbeee02c608c6e85736b35e536fba719-1 OK 2026/02/15 01:27:09 DEBUG : ignore_checksum_big_file_from_pipe: size = 102401 OK 2026/02/15 01:27:09 DEBUG : ignore_checksum_big_file_from_pipe: Size and modification time the same (differ by 0s, within tolerance 1ns) === RUN TestRcat/withChecksum=true,ignoreChecksum=true run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:27:11 DEBUG : S3 
bucket rclone-test-daxariv2nomu: File to upload is small (34 bytes), uploading instead of streaming 2026/02/15 01:27:11 DEBUG : ignore_checksum_small_file_from_pipe: size = 34 OK 2026/02/15 01:27:11 DEBUG : ignore_checksum_small_file_from_pipe: Src hash empty - aborting Dst hash check 2026/02/15 01:27:11 DEBUG : ignore_checksum_small_file_from_pipe: Size of src and dst objects identical 2026/02/15 01:27:12 DEBUG : ignore_checksum_big_file_from_pipe: open chunk writer: started multipart upload: ABPnzm5Cj6vFnETUcM7MIaZ-17f4fKeeGVe8sIdZiHU8090RByMmhNBy7D9VD7kJk-ckOCw 2026/02/15 01:27:12 DEBUG : ignore_checksum_big_file_from_pipe: multipart upload: starting chunk 0 size 100.001Ki offset 0/off 2026/02/15 01:27:12 DEBUG : ignore_checksum_big_file_from_pipe: multipart upload wrote chunk 1 with 102401 bytes and etag "fffc7956ba9a7b58a63c01b6ce1ddc45" 2026/02/15 01:27:13 DEBUG : ignore_checksum_big_file_from_pipe: multipart upload "ABPnzm5Cj6vFnETUcM7MIaZ-17f4fKeeGVe8sIdZiHU8090RByMmhNBy7D9VD7kJk-ckOCw" finished 2026/02/15 01:27:13 DEBUG : ignore_checksum_big_file_from_pipe: Multipart upload Etag: bbeee02c608c6e85736b35e536fba719-1 OK 2026/02/15 01:27:13 DEBUG : ignore_checksum_big_file_from_pipe: size = 102401 OK 2026/02/15 01:27:13 DEBUG : ignore_checksum_big_file_from_pipe: Dst hash empty - aborting Src hash check 2026/02/15 01:27:13 DEBUG : ignore_checksum_big_file_from_pipe: Src hash empty - aborting Dst hash check 2026/02/15 01:27:13 DEBUG : ignore_checksum_big_file_from_pipe: Size of src and dst objects identical --- PASS: TestRcat (16.70s) --- PASS: TestRcat/withChecksum=false,ignoreChecksum=false (4.06s) --- PASS: TestRcat/withChecksum=true,ignoreChecksum=false (4.25s) --- PASS: TestRcat/withChecksum=false,ignoreChecksum=true (4.20s) --- PASS: TestRcat/withChecksum=true,ignoreChecksum=true (4.20s) === RUN TestRcatMetadata run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" === RUN TestRcatMetadata/Normal 2026/02/15 01:27:15 DEBUG : S3 bucket rclone-test-daxariv2nomu: File to upload is small (48 bytes), uploading instead of streaming 2026/02/15 01:27:16 DEBUG : rcat_metadata: size = 48 OK 2026/02/15 01:27:16 DEBUG : rcat_metadata: md5 = dee10c33b9fe5e2ac4eb8ad8467b7063 OK 2026/02/15 01:27:16 DEBUG : rcat_metadata: Size and md5 of src and dst objects identical === RUN TestRcatMetadata/ViaDisk 2026/02/15 01:27:17 DEBUG : rcat_metadata_uploadcutoff0: open chunk writer: started multipart upload: ABPnzm6OaFPE7e5M9Td3680r_UYzhwqxngGlxtsMoRNLvy9uCZCX2A7dqUURjUXNhp8jKOY 2026/02/15 01:27:17 DEBUG : rcat_metadata_uploadcutoff0: multipart upload: starting chunk 0 size 63 offset 0/off 2026/02/15 01:27:17 DEBUG : rcat_metadata_uploadcutoff0: multipart upload wrote chunk 1 with 63 bytes and etag "51ca8560f35b9b87370a65d72356b86c" 2026/02/15 01:27:18 DEBUG : rcat_metadata_uploadcutoff0: multipart upload "ABPnzm6OaFPE7e5M9Td3680r_UYzhwqxngGlxtsMoRNLvy9uCZCX2A7dqUURjUXNhp8jKOY" finished 2026/02/15 01:27:18 DEBUG : rcat_metadata_uploadcutoff0: Multipart upload Etag: ed80e8f4a79a14666b4280dc1353888f-1 OK 2026/02/15 01:27:18 DEBUG : rcat_metadata_uploadcutoff0: size = 63 OK 2026/02/15 01:27:18 DEBUG : rcat_metadata_uploadcutoff0: Dst hash empty - aborting Src hash check 2026/02/15 01:27:18 DEBUG : rcat_metadata_uploadcutoff0: Size of src and dst objects identical --- PASS: TestRcatMetadata (4.70s) --- PASS: TestRcatMetadata/Normal (1.63s) --- PASS: TestRcatMetadata/ViaDisk (2.50s) === RUN TestRcatSize run.go:185: Remote "S3 bucket 
rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:27:20 DEBUG : S3 bucket rclone-test-daxariv2nomu: File to upload is small (60 bytes), uploading instead of streaming 2026/02/15 01:27:20 DEBUG : potato2: size = 60 OK 2026/02/15 01:27:20 DEBUG : potato2: md5 = d6548b156ea68a4e003e786df99eee76 OK 2026/02/15 01:27:20 DEBUG : potato2: Size and md5 of src and dst objects identical --- PASS: TestRcatSize (2.00s) === RUN TestRcatSizeMetadata run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:27:22 DEBUG : S3 bucket rclone-test-daxariv2nomu: File to upload is small (60 bytes), uploading instead of streaming 2026/02/15 01:27:22 DEBUG : potato2: size = 60 OK 2026/02/15 01:27:22 DEBUG : potato2: md5 = d6548b156ea68a4e003e786df99eee76 OK 2026/02/15 01:27:22 DEBUG : potato2: Size and md5 of src and dst objects identical --- PASS: TestRcatSizeMetadata (2.27s) === RUN TestTouchDir run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" 2026/02/15 01:27:26 DEBUG : S3 bucket rclone-test-daxariv2nomu: Touching "sub dir/potato3" 2026/02/15 01:27:26 DEBUG : S3 bucket rclone-test-daxariv2nomu: Touching "empty space" 2026/02/15 01:27:26 DEBUG : S3 bucket rclone-test-daxariv2nomu: Touching "potato2" --- PASS: TestTouchDir (3.87s) === RUN TestMkdirMetadata run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" operations_test.go:1711: Skipping test as remote does not support MkdirMetadata --- SKIP: TestMkdirMetadata (0.26s) === RUN TestMkdirModTime run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" operations_test.go:1730: Skipping test as remote does not support DirSetModTime or MkdirMetadata --- SKIP: TestMkdirModTime (0.27s) === RUN TestCopyDirMetadata run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" operations_test.go:1751: Skipping test as remote does not support WriteDirMetadata or MkdirMetadata --- SKIP: TestCopyDirMetadata (0.27s) === RUN TestSetDirModTime run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" operations_test.go:1787: Skipping test as remote does not support DirSetModTime or WriteDirSetModTime --- SKIP: TestSetDirModTime (0.27s) === RUN TestDirsEqual run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" operations_test.go:1831: Skipping test as remote does not support WriteDirMetadata or MkdirMetadata --- SKIP: TestDirsEqual (0.27s) === RUN TestRemoveExisting run.go:185: Remote "S3 bucket rclone-test-daxariv2nomu", Local "Local file system at /tmp/rclone2360491256", Modify Window "1ns" operations_test.go:1900: Skipping as remote can't Move --- SKIP: TestRemoveExisting (0.27s) === RUN TestRcAbout rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcAbout (0.00s) === RUN TestRcCleanup rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcCleanup (0.00s) === RUN TestRcCopyfile rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcCopyfile (0.00s) === RUN TestRcCopyurl rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcCopyurl (0.00s) === RUN 
TestRcDelete
rc_test.go:30: Skipping test on non local remote
--- SKIP: TestRcDelete (0.00s)
=== RUN TestRcDeletefile
rc_test.go:30: Skipping test on non local remote
--- SKIP: TestRcDeletefile (0.00s)
=== RUN TestRcList
rc_test.go:30: Skipping test on non local remote
--- SKIP: TestRcList (0.00s)
=== RUN TestRcStat
rc_test.go:30: Skipping test on non local remote
--- SKIP: TestRcStat (0.00s)
=== RUN TestRcSetTier
rc_test.go:30: Skipping test on non local remote
--- SKIP: TestRcSetTier (0.00s)
=== RUN TestRcSetTierFile
rc_test.go:30: Skipping test on non local remote
--- SKIP: TestRcSetTierFile (0.00s)
=== RUN TestRcMkdir
rc_test.go:30: Skipping test on non local remote
--- SKIP: TestRcMkdir (0.00s)
=== RUN TestRcMovefile
rc_test.go:30: Skipping test on non local remote
--- SKIP: TestRcMovefile (0.00s)
=== RUN TestRcPurge
rc_test.go:30: Skipping test on non local remote
--- SKIP: TestRcPurge (0.00s)
=== RUN TestRcRmdir
rc_test.go:30: Skipping test on non local remote
--- SKIP: TestRcRmdir (0.00s)
=== RUN TestRcRmdirs
rc_test.go:30: Skipping test on non local remote
--- SKIP: TestRcRmdirs (0.00s)
=== RUN TestRcSize
rc_test.go:30: Skipping test on non local remote
--- SKIP: TestRcSize (0.00s)
=== RUN TestRcPublicLink
rc_test.go:30: Skipping test on non local remote
--- SKIP: TestRcPublicLink (0.00s)
=== RUN TestRcFsInfo
rc_test.go:30: Skipping test on non local remote
--- SKIP: TestRcFsInfo (0.00s)
=== RUN TestUploadFile
rc_test.go:30: Skipping test on non local remote
--- SKIP: TestUploadFile (0.00s)
=== RUN TestRcCommand
rc_test.go:30: Skipping test on non local remote
--- SKIP: TestRcCommand (0.00s)
=== RUN TestRcDu
rc_test.go:30: Skipping test on non local remote
--- SKIP: TestRcDu (0.00s)
=== RUN TestRcCheck
rc_test.go:30: Skipping test on non local remote
--- SKIP: TestRcCheck (0.00s)
=== RUN TestRcHashsum
rc_test.go:30: Skipping test on non local remote
--- SKIP: TestRcHashsum (0.00s)
=== RUN TestRcHashsumSingleFile
rc_test.go:30: Skipping test on non local remote
--- SKIP: TestRcHashsumSingleFile (0.00s)
=== RUN TestRcHashsumFile
rc_test.go:30: Skipping test on non local remote
--- SKIP: TestRcHashsumFile (0.00s)
FAIL
2026/02/15 01:27:29 DEBUG : S3 bucket rclone-test-daxariv2nomu: Purge remote
2026/02/15 01:27:29 DEBUG : S3 bucket rclone-test-daxariv2nomu: bucket is versioned: false
2026/02/15 01:27:29 DEBUG : Waiting for deletions to finish
2026/02/15 01:27:30 INFO : S3 bucket rclone-test-daxariv2nomu: Bucket "rclone-test-daxariv2nomu" deleted
"./operations.test -test.v -test.timeout 1h0m0s -remote TestS3GCS: -verbose -fast-list" - Finished ERROR in 5m46.107053187s (try 1/5): exit status 1: Failed [TestCopyURL]
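
In TestCopyFileCompareDest above, "two" is reported as "Destination found in --compare-dest, skipping", while TestCopyFileCopyDest logs "Destination found in --copy-dest, using server-side copy" and then copies the matching object into place. The two flags differ only in what happens when the auxiliary directory already holds an identical object. A minimal sketch of that decision, where a size check stands in for rclone's full size/modtime/hash comparison and all types and helpers are hypothetical, not rclone's API:

    package main

    import "fmt"

    type object struct {
        name string
        size int64
    }

    // decide mirrors the behaviour seen in the log: an identical object in the
    // compare directory means "skip"; in the copy directory it means
    // "server-side copy"; otherwise the source must be transferred.
    func decide(src object, inDst, inAux *object, copyDest bool) string {
        if inDst != nil && inDst.size == src.size {
            return "unchanged, skipping"
        }
        if inAux != nil && inAux.size == src.size {
            if copyDest {
                return "destination found in --copy-dest, using server-side copy"
            }
            return "destination found in --compare-dest, skipping"
        }
        return "need to transfer"
    }

    func main() {
        two := object{name: "two", size: 3}
        aux := object{name: "two", size: 3}
        fmt.Println(decide(two, nil, &aux, false)) // --compare-dest case
        fmt.Println(decide(two, nil, &aux, true))  // --copy-dest case
    }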
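In TestCopyFileMaxTransfer, file1 uploads normally and file2 then fails with "max transfer limit reached as set by --max-transfer": once the accumulated transferred bytes would exceed the configured budget, further transfers are refused. A sketch of that style of accounting, assuming a simple shared byte counter (the names and the 20-byte limit are illustrative, not rclone's internals):

    package main

    import (
        "errors"
        "fmt"
        "sync/atomic"
    )

    var errMaxTransfer = errors.New("max transfer limit reached as set by --max-transfer")

    // transferBudget tracks bytes moved so far against a --max-transfer style cap.
    type transferBudget struct {
        limit int64
        used  atomic.Int64
    }

    // reserve accounts for a transfer of size bytes, failing once the cap is exceeded.
    func (b *transferBudget) reserve(size int64) error {
        if b.used.Add(size) > b.limit {
            return errMaxTransfer
        }
        return nil
    }

    func main() {
        b := &transferBudget{limit: 20} // pretend --max-transfer of 20 bytes
        for i, size := range []int64{14, 14} {
            if err := b.reserve(size); err != nil {
                fmt.Printf("file%d: Failed to copy: %v\n", i+1, err)
                continue
            }
            fmt.Printf("file%d: Copied (new)\n", i+1)
        }
    }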
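TestMaxDelete, TestMaxDeleteSizeLargeFile and TestMaxDeleteSize show the matching pattern for deletions: "large" hits "Got fatal error on delete: --max-delete threshold reached" (or --max-delete-size) while the smaller files are still deleted. A sketch of such a guard, counting allowed deletions against a cap; the guard type is a hypothetical stand-in for rclone's accounting:

    package main

    import (
        "errors"
        "fmt"
    )

    var errMaxDelete = errors.New("--max-delete threshold reached")

    // deleteGuard refuses further deletions once maxDeletes have been allowed.
    type deleteGuard struct {
        maxDeletes int
        done       int
    }

    func (g *deleteGuard) allow() error {
        if g.done >= g.maxDeletes {
            return errMaxDelete
        }
        g.done++
        return nil
    }

    func main() {
        g := &deleteGuard{maxDeletes: 2} // pretend --max-delete 2
        for _, name := range []string{"small", "medium", "large"} {
            if err := g.allow(); err != nil {
                fmt.Printf("%s: Got fatal error on delete: %v\n", name, err)
                continue
            }
            fmt.Printf("%s: Deleted\n", name)
        }
    }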
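TestRetry exercises two retry paths visible in the log: wrapped EOF errors counted as "low level retry n/5", and errors carrying a server-supplied delay, logged as "Sleeping for 10ms (as indicated by the server) to obey Retry-After". A minimal sketch of a retry loop that honours such a delay between attempts; the way the delay is obtained here is a stand-in, since real code would parse the Retry-After header or the wrapped error:

    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    // retry runs f up to attempts times, sleeping for any server-indicated
    // delay before trying again.
    func retry(attempts int, f func() (retryAfter time.Duration, err error)) error {
        var err error
        for i := 1; i <= attempts; i++ {
            var wait time.Duration
            wait, err = f()
            if err == nil {
                return nil
            }
            if wait > 0 {
                fmt.Printf("Sleeping for %v (as indicated by the server) to obey Retry-After\n", wait)
                time.Sleep(wait)
            }
            fmt.Printf("low level retry %d/%d: %v\n", i, attempts, err)
        }
        return err
    }

    func main() {
        calls := 0
        _ = retry(5, func() (time.Duration, error) {
            calls++
            if calls < 3 {
                return 10 * time.Millisecond, errors.New("BANG")
            }
            return 0, nil
        })
    }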
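The NOTICE in the TestRcat section, "Streaming uploads using chunk size 5Mi will have maximum file size of 48.828Gi", follows from multiplying the chunk size by the maximum number of parts a single multipart upload may have. The part limit itself is not stated in this log; assuming the usual 10,000-part S3-style cap, the arithmetic works out exactly:

    package main

    import "fmt"

    func main() {
        const (
            chunkSize = 5 * 1024 * 1024 // 5Mi, as reported in the NOTICE above
            maxParts  = 10000           // assumed S3-style multipart part limit
        )
        maxSize := int64(chunkSize) * maxParts
        // 50,000 MiB = 48.828125 GiB, matching the "48.828Gi" figure in the log.
        fmt.Printf("max streamed size: %d bytes (%.3f GiB)\n",
            maxSize, float64(maxSize)/(1<<30))
    }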
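Several uploads above finish with a per-part ETag (e.g. "fffc7956ba9a7b58a63c01b6ce1ddc45") followed by a composite "Multipart upload Etag" of the form "<hex>-1", which is then checked "OK" even though no whole-object MD5 exists. For S3-style multipart uploads the composite value is conventionally the MD5 of the concatenated binary part MD5s suffixed with the part count; whether this GCS S3-compat endpoint composes it identically is an assumption here, so the sketch below only illustrates the calculation and does not assert the logged value:

    package main

    import (
        "crypto/md5"
        "encoding/hex"
        "fmt"
    )

    // compositeETag builds the S3-style multipart ETag: the MD5 of the
    // concatenated binary part MD5s, followed by "-<number of parts>".
    func compositeETag(partETags []string) (string, error) {
        sum := md5.New()
        for _, e := range partETags {
            b, err := hex.DecodeString(e)
            if err != nil {
                return "", err
            }
            sum.Write(b)
        }
        return fmt.Sprintf("%x-%d", sum.Sum(nil), len(partETags)), nil
    }

    func main() {
        // Single-part upload, using the part ETag logged for
        // no_checksum_big_file_from_pipe above.
        etag, err := compositeETag([]string{"fffc7956ba9a7b58a63c01b6ce1ddc45"})
        if err != nil {
            panic(err)
        }
        fmt.Println(etag) // expected form: 32 hex characters followed by "-1"
    }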