"./operations.test -test.v -test.timeout 1h0m0s -remote TestS3R2: -verbose -fast-list" - Starting (try 1/5) 2024/12/10 01:19:41 DEBUG : Creating backend with remote "TestS3R2:rclone-test-nutecod8gavi" 2024/12/10 01:19:41 DEBUG : Using config file from "/home/rclone/.rclone.conf" 2024/12/10 01:19:41 DEBUG : Creating backend with remote "/tmp/rclone2474542083" === RUN TestDoMultiThreadCopy --- PASS: TestDoMultiThreadCopy (0.00s) === RUN TestMultithreadCalculateNumChunks === RUN TestMultithreadCalculateNumChunks/{size:1_chunkSize:65536_wantNumChunks:1} === RUN TestMultithreadCalculateNumChunks/{size:1048576_chunkSize:1_wantNumChunks:1048576} === RUN TestMultithreadCalculateNumChunks/{size:1048576_chunkSize:2_wantNumChunks:524288} === RUN TestMultithreadCalculateNumChunks/{size:1048577_chunkSize:2_wantNumChunks:524289} === RUN TestMultithreadCalculateNumChunks/{size:1048575_chunkSize:2_wantNumChunks:524288} --- PASS: TestMultithreadCalculateNumChunks (0.00s) --- PASS: TestMultithreadCalculateNumChunks/{size:1_chunkSize:65536_wantNumChunks:1} (0.00s) --- PASS: TestMultithreadCalculateNumChunks/{size:1048576_chunkSize:1_wantNumChunks:1048576} (0.00s) --- PASS: TestMultithreadCalculateNumChunks/{size:1048576_chunkSize:2_wantNumChunks:524288} (0.00s) --- PASS: TestMultithreadCalculateNumChunks/{size:1048577_chunkSize:2_wantNumChunks:524289} (0.00s) --- PASS: TestMultithreadCalculateNumChunks/{size:1048575_chunkSize:2_wantNumChunks:524288} (0.00s) === RUN TestMultithreadCopy run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:19:41 INFO : S3 bucket rclone-test-nutecod8gavi: Bucket "rclone-test-nutecod8gavi" created with ACL "" 2024/12/10 01:19:42 DEBUG : chunksize-probe: open chunk writer: started multipart upload: AEF-xTmojg1PFD1AbFItEfr-sV2R9-Ewusgml8uZY1A9d6HB-xFxQt1TaMHbTre2j2IIJY_ewehlqUjMFqmQLEy7lSh633fwPFz74eFzl8FWCHbugeW7kO7KkWH_CGBArs1L09HioNmhI6R30lDDrUN6MBZPQykrPjSz0W0yboAHbYq3nD0TPi4JmXJRSdKOiH49bjUexRb8ixY463awa1XG3sSUE04s-aFDNlNc6eWsk9UMPN7r3HkbmCwXiAhDA5OKsvpL5VizN3_-qktCy2EjhE62aWkKZ9vUgJpn1wk4urcC6g7xxVHUt3jIhmzrjgcBlmTVl9tz5bCPU2iMhTY 2024/12/10 01:19:42 DEBUG : chunksize-probe: multipart upload "AEF-xTmojg1PFD1AbFItEfr-sV2R9-Ewusgml8uZY1A9d6HB-xFxQt1TaMHbTre2j2IIJY_ewehlqUjMFqmQLEy7lSh633fwPFz74eFzl8FWCHbugeW7kO7KkWH_CGBArs1L09HioNmhI6R30lDDrUN6MBZPQykrPjSz0W0yboAHbYq3nD0TPi4JmXJRSdKOiH49bjUexRb8ixY463awa1XG3sSUE04s-aFDNlNc6eWsk9UMPN7r3HkbmCwXiAhDA5OKsvpL5VizN3_-qktCy2EjhE62aWkKZ9vUgJpn1wk4urcC6g7xxVHUt3jIhmzrjgcBlmTVl9tz5bCPU2iMhTY" aborted === RUN TestMultithreadCopy/upload=false,size=10485759,streams=2 2024/12/10 01:19:43 DEBUG : test-multithread-copy-false-10485759-2: multi-thread copy: disabling buffering because destination uses OpenWriterAt 2024/12/10 01:19:43 DEBUG : test-multithread-copy-false-10485759-2: multi-thread copy: write buffer set to 131072 2024/12/10 01:19:43 DEBUG : test-multithread-copy-false-10485759-2: multi-thread copy: chunk size 64Mi was bigger than source file size 10.000Mi 2024/12/10 01:19:43 DEBUG : test-multithread-copy-false-10485759-2: multi-thread copy: using backend concurrency of 4 instead of --multi-thread-streams 2 2024/12/10 01:19:43 DEBUG : test-multithread-copy-false-10485759-2: multi-thread copy: number of streams 4 was bigger than number of chunks 1 2024/12/10 01:19:43 DEBUG : test-multithread-copy-false-10485759-2: Starting multi-thread copy with 1 chunks of size 10.000Mi with 1 parallel streams 2024/12/10 01:19:43 DEBUG : 
=== RUN TestMultithreadCopy
run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns"
2024/12/10 01:19:41 INFO : S3 bucket rclone-test-nutecod8gavi: Bucket "rclone-test-nutecod8gavi" created with ACL ""
2024/12/10 01:19:42 DEBUG : chunksize-probe: open chunk writer: started multipart upload: AEF-xTmojg1PFD1AbFItEfr-sV2R9-Ewusgml8uZY1A9d6HB-xFxQt1TaMHbTre2j2IIJY_ewehlqUjMFqmQLEy7lSh633fwPFz74eFzl8FWCHbugeW7kO7KkWH_CGBArs1L09HioNmhI6R30lDDrUN6MBZPQykrPjSz0W0yboAHbYq3nD0TPi4JmXJRSdKOiH49bjUexRb8ixY463awa1XG3sSUE04s-aFDNlNc6eWsk9UMPN7r3HkbmCwXiAhDA5OKsvpL5VizN3_-qktCy2EjhE62aWkKZ9vUgJpn1wk4urcC6g7xxVHUt3jIhmzrjgcBlmTVl9tz5bCPU2iMhTY
2024/12/10 01:19:42 DEBUG : chunksize-probe: multipart upload "AEF-xTmojg1PFD1AbFItEfr-sV2R9-Ewusgml8uZY1A9d6HB-xFxQt1TaMHbTre2j2IIJY_ewehlqUjMFqmQLEy7lSh633fwPFz74eFzl8FWCHbugeW7kO7KkWH_CGBArs1L09HioNmhI6R30lDDrUN6MBZPQykrPjSz0W0yboAHbYq3nD0TPi4JmXJRSdKOiH49bjUexRb8ixY463awa1XG3sSUE04s-aFDNlNc6eWsk9UMPN7r3HkbmCwXiAhDA5OKsvpL5VizN3_-qktCy2EjhE62aWkKZ9vUgJpn1wk4urcC6g7xxVHUt3jIhmzrjgcBlmTVl9tz5bCPU2iMhTY" aborted
=== RUN TestMultithreadCopy/upload=false,size=10485759,streams=2
2024/12/10 01:19:43 DEBUG : test-multithread-copy-false-10485759-2: multi-thread copy: disabling buffering because destination uses OpenWriterAt
2024/12/10 01:19:43 DEBUG : test-multithread-copy-false-10485759-2: multi-thread copy: write buffer set to 131072
2024/12/10 01:19:43 DEBUG : test-multithread-copy-false-10485759-2: multi-thread copy: chunk size 64Mi was bigger than source file size 10.000Mi
2024/12/10 01:19:43 DEBUG : test-multithread-copy-false-10485759-2: multi-thread copy: using backend concurrency of 4 instead of --multi-thread-streams 2
2024/12/10 01:19:43 DEBUG : test-multithread-copy-false-10485759-2: multi-thread copy: number of streams 4 was bigger than number of chunks 1
2024/12/10 01:19:43 DEBUG : test-multithread-copy-false-10485759-2: Starting multi-thread copy with 1 chunks of size 10.000Mi with 1 parallel streams
2024/12/10 01:19:43 DEBUG : test-multithread-copy-false-10485759-2: multi-thread copy: chunk 1/1 (0-10485759) size 10.000Mi starting
2024/12/10 01:19:44 DEBUG : test-multithread-copy-false-10485759-2: writing chunk 0
2024/12/10 01:19:44 DEBUG : test-multithread-copy-false-10485759-2: multi-thread copy: chunk 1/1 (0-10485759) size 10.000Mi finished
2024/12/10 01:19:44 DEBUG : test-multithread-copy-false-10485759-2: Finished multi-thread copy with 1 parts of size 10.000Mi
=== RUN TestMultithreadCopy/upload=false,size=10485760,streams=2
2024/12/10 01:19:46 DEBUG : test-multithread-copy-false-10485760-2: multi-thread copy: disabling buffering because destination uses OpenWriterAt
2024/12/10 01:19:46 DEBUG : test-multithread-copy-false-10485760-2: multi-thread copy: write buffer set to 131072
2024/12/10 01:19:46 DEBUG : test-multithread-copy-false-10485760-2: multi-thread copy: chunk size 64Mi was bigger than source file size 10Mi
2024/12/10 01:19:46 DEBUG : test-multithread-copy-false-10485760-2: multi-thread copy: using backend concurrency of 4 instead of --multi-thread-streams 2
2024/12/10 01:19:46 DEBUG : test-multithread-copy-false-10485760-2: multi-thread copy: number of streams 4 was bigger than number of chunks 1
2024/12/10 01:19:46 DEBUG : test-multithread-copy-false-10485760-2: Starting multi-thread copy with 1 chunks of size 10Mi with 1 parallel streams
2024/12/10 01:19:46 DEBUG : test-multithread-copy-false-10485760-2: multi-thread copy: chunk 1/1 (0-10485760) size 10Mi starting
2024/12/10 01:19:46 DEBUG : test-multithread-copy-false-10485760-2: writing chunk 0
2024/12/10 01:19:46 DEBUG : test-multithread-copy-false-10485760-2: multi-thread copy: chunk 1/1 (0-10485760) size 10Mi finished
2024/12/10 01:19:46 DEBUG : test-multithread-copy-false-10485760-2: Finished multi-thread copy with 1 parts of size 10Mi
=== RUN TestMultithreadCopy/upload=false,size=10485761,streams=2
2024/12/10 01:19:48 DEBUG : test-multithread-copy-false-10485761-2: multi-thread copy: disabling buffering because destination uses OpenWriterAt
2024/12/10 01:19:48 DEBUG : test-multithread-copy-false-10485761-2: multi-thread copy: write buffer set to 131072
2024/12/10 01:19:48 DEBUG : test-multithread-copy-false-10485761-2: multi-thread copy: chunk size 64Mi was bigger than source file size 10.000Mi
2024/12/10 01:19:48 DEBUG : test-multithread-copy-false-10485761-2: multi-thread copy: using backend concurrency of 4 instead of --multi-thread-streams 2
2024/12/10 01:19:48 DEBUG : test-multithread-copy-false-10485761-2: multi-thread copy: number of streams 4 was bigger than number of chunks 1
2024/12/10 01:19:48 DEBUG : test-multithread-copy-false-10485761-2: Starting multi-thread copy with 1 chunks of size 10.000Mi with 1 parallel streams
2024/12/10 01:19:48 DEBUG : test-multithread-copy-false-10485761-2: multi-thread copy: chunk 1/1 (0-10485761) size 10.000Mi starting
2024/12/10 01:19:48 DEBUG : test-multithread-copy-false-10485761-2: writing chunk 0
2024/12/10 01:19:49 DEBUG : test-multithread-copy-false-10485761-2: multi-thread copy: chunk 1/1 (0-10485761) size 10.000Mi finished
2024/12/10 01:19:49 DEBUG : test-multithread-copy-false-10485761-2: Finished multi-thread copy with 1 parts of size 10.000Mi
=== RUN TestMultithreadCopy/upload=true,size=10485759,streams=2
2024/12/10 01:19:50 DEBUG : test-multithread-copy-true-10485759-2: multi-thread copy: disabling buffering because source is local disk
2024/12/10 01:19:50 DEBUG : test-multithread-copy-true-10485759-2: open chunk writer: started multipart upload: AJcU6W9xiY6f37EYHrEiNUaSu3LeajBwI4fi0duJhH-UPvML9cF4P0Yj0Qi4X64-xvTIfcRLRT8WGtLIR3y0RaSqaK3oRBwz7M3yFpfBTb_8tgMnN5Px_sxbb3Xrcm1jxtK3D6i0NLYdBKHnT_gEJtwtSGMjYgM72sS2S_7YU8AuuBHwvCFrYg5m4GGG4w7Si8weX1EO03-cRMu3ghy4Q1eQ6BpwnTROi7wY-RaOhiw-J2HbvA8zrSoAJzmkr9Ebi-roDt3xjN5D75kaTTnkbo8ejLymMos7yj_z7z6YJZ_WWfk-HiVpMFVpZkUy-OTe4J80iOGd_yC7jivGzlBu1ZQ
2024/12/10 01:19:50 DEBUG : test-multithread-copy-true-10485759-2: multi-thread copy: using backend concurrency of 4 instead of --multi-thread-streams 2
2024/12/10 01:19:50 DEBUG : test-multithread-copy-true-10485759-2: multi-thread copy: number of streams 4 was bigger than number of chunks 2
2024/12/10 01:19:50 DEBUG : test-multithread-copy-true-10485759-2: Starting multi-thread copy with 2 chunks of size 5Mi with 2 parallel streams
2024/12/10 01:19:50 DEBUG : test-multithread-copy-true-10485759-2: multi-thread copy: chunk 2/2 (5242880-10485759) size 5.000Mi starting
2024/12/10 01:19:50 DEBUG : test-multithread-copy-true-10485759-2: multi-thread copy: chunk 1/2 (0-5242880) size 5Mi starting
2024/12/10 01:19:50 DEBUG : test-multithread-copy-true-10485759-2: Seek from 5242879 to 0
2024/12/10 01:19:50 DEBUG : test-multithread-copy-true-10485759-2: Seek from 5242880 to 0
2024/12/10 01:19:51 DEBUG : test-multithread-copy-true-10485759-2: multipart upload wrote chunk 1 with 5242880 bytes and etag "d7f885ab117f313a00b4f17280848038"
2024/12/10 01:19:51 DEBUG : test-multithread-copy-true-10485759-2: multi-thread copy: chunk 1/2 (0-5242880) size 5Mi finished
2024/12/10 01:19:51 DEBUG : test-multithread-copy-true-10485759-2: multipart upload wrote chunk 2 with 5242879 bytes and etag "6d6adc83f1dd0f1b41a6e15369f7739b"
2024/12/10 01:19:51 DEBUG : test-multithread-copy-true-10485759-2: multi-thread copy: chunk 2/2 (5242880-10485759) size 5.000Mi finished
2024/12/10 01:19:51 DEBUG : test-multithread-copy-true-10485759-2: multipart upload "AJcU6W9xiY6f37EYHrEiNUaSu3LeajBwI4fi0duJhH-UPvML9cF4P0Yj0Qi4X64-xvTIfcRLRT8WGtLIR3y0RaSqaK3oRBwz7M3yFpfBTb_8tgMnN5Px_sxbb3Xrcm1jxtK3D6i0NLYdBKHnT_gEJtwtSGMjYgM72sS2S_7YU8AuuBHwvCFrYg5m4GGG4w7Si8weX1EO03-cRMu3ghy4Q1eQ6BpwnTROi7wY-RaOhiw-J2HbvA8zrSoAJzmkr9Ebi-roDt3xjN5D75kaTTnkbo8ejLymMos7yj_z7z6YJZ_WWfk-HiVpMFVpZkUy-OTe4J80iOGd_yC7jivGzlBu1ZQ" finished
2024/12/10 01:19:51 DEBUG : test-multithread-copy-true-10485759-2: Finished multi-thread copy with 2 parts of size 5Mi
=== RUN TestMultithreadCopy/upload=true,size=10485760,streams=2
2024/12/10 01:19:52 DEBUG : test-multithread-copy-true-10485760-2: multi-thread copy: disabling buffering because source is local disk
2024/12/10 01:19:53 DEBUG : test-multithread-copy-true-10485760-2: open chunk writer: started multipart upload: AOe8SG2mgRXHkmBN-Zz3oMTtTvZNeoQzSVpdSR6sstGnC6p0yZqZKDK3N8gLud5NNRvJMAVpZyQaxSqv-59fG1-wxNsZQF8gBsbrTsTiarlnMc7LXvr4gPKcivA-rTOsVU088d1RCNVGFaDVjwkVETWM3bAexGDOUJyvRwIqsFic8bktdG96vskJSZC90yYxT7LB0X3ZSatcviFsioEqVfmYFwV2yLlAWkkk_axwkBgCXbPxPYkbIWjJnU9ykI2eLQWLUogR4NVPG9Rjz9HU3HD9VY3LZZXSSA2W1SUFPYFpdOrk4kr_gNERvjxcr92oHH6weqkhXwwyUvx0z397uek
2024/12/10 01:19:53 DEBUG : test-multithread-copy-true-10485760-2: multi-thread copy: using backend concurrency of 4 instead of --multi-thread-streams 2
2024/12/10 01:19:53 DEBUG : test-multithread-copy-true-10485760-2: multi-thread copy: number of streams 4 was bigger than number of chunks 2
2024/12/10 01:19:53 DEBUG : test-multithread-copy-true-10485760-2: Starting multi-thread copy with 2 chunks of size 5Mi with 2 parallel streams
2024/12/10 01:19:53 DEBUG : test-multithread-copy-true-10485760-2: multi-thread copy: chunk 2/2 (5242880-10485760) size 5Mi starting
2024/12/10 01:19:53 DEBUG : test-multithread-copy-true-10485760-2: multi-thread copy: chunk 1/2 (0-5242880) size 5Mi starting
2024/12/10 01:19:53 DEBUG : test-multithread-copy-true-10485760-2: Seek from 5242880 to 0
2024/12/10 01:19:53 DEBUG : test-multithread-copy-true-10485760-2: Seek from 5242880 to 0
2024/12/10 01:19:53 DEBUG : test-multithread-copy-true-10485760-2: multipart upload wrote chunk 1 with 5242880 bytes and etag "a7b9b849800c71ef63856545583b8eb0"
2024/12/10 01:19:53 DEBUG : test-multithread-copy-true-10485760-2: multi-thread copy: chunk 1/2 (0-5242880) size 5Mi finished
2024/12/10 01:19:53 DEBUG : test-multithread-copy-true-10485760-2: multipart upload wrote chunk 2 with 5242880 bytes and etag "ca75362d8c28034b133dc405a65ac887"
2024/12/10 01:19:53 DEBUG : test-multithread-copy-true-10485760-2: multi-thread copy: chunk 2/2 (5242880-10485760) size 5Mi finished
2024/12/10 01:19:53 DEBUG : test-multithread-copy-true-10485760-2: multipart upload "AOe8SG2mgRXHkmBN-Zz3oMTtTvZNeoQzSVpdSR6sstGnC6p0yZqZKDK3N8gLud5NNRvJMAVpZyQaxSqv-59fG1-wxNsZQF8gBsbrTsTiarlnMc7LXvr4gPKcivA-rTOsVU088d1RCNVGFaDVjwkVETWM3bAexGDOUJyvRwIqsFic8bktdG96vskJSZC90yYxT7LB0X3ZSatcviFsioEqVfmYFwV2yLlAWkkk_axwkBgCXbPxPYkbIWjJnU9ykI2eLQWLUogR4NVPG9Rjz9HU3HD9VY3LZZXSSA2W1SUFPYFpdOrk4kr_gNERvjxcr92oHH6weqkhXwwyUvx0z397uek" finished
2024/12/10 01:19:53 DEBUG : test-multithread-copy-true-10485760-2: Finished multi-thread copy with 2 parts of size 5Mi
=== RUN TestMultithreadCopy/upload=true,size=10485761,streams=2
2024/12/10 01:19:55 DEBUG : test-multithread-copy-true-10485761-2: multi-thread copy: disabling buffering because source is local disk
2024/12/10 01:19:55 DEBUG : test-multithread-copy-true-10485761-2: open chunk writer: started multipart upload: AJicf08rur_hfUg4NOTgFGO0TjYMEZbFcuGg8ot4cY3lf3S-meXMwg3DJm8Mw4iEayff9jqD6jbUjNrtP_ouV31EgqtEzlw9ZY68o-uEH3q4I7p6_uKvI2Gu2x1eHlV_Y2CGS20iTpWT2HZXJyhnTr7Cc8t9Ezhu6y_Xmn0KtOOL_-yJJeyFxF7RRyrEWdrW66OIdY5spDvnaF0IERHV-yjpUasquZaYr9YulP4toACDYOkt5sdsL7VqHBBG5qG10pbBRjZOSdLvFdLaMsMCcWOOwGmhI8GrIZgNf_CmxRurBZwZ59QP0oo_MAx_WyoimQlT3mTa6xsL-CDmc2HWpGM
2024/12/10 01:19:55 DEBUG : test-multithread-copy-true-10485761-2: multi-thread copy: using backend concurrency of 4 instead of --multi-thread-streams 2
2024/12/10 01:19:55 DEBUG : test-multithread-copy-true-10485761-2: multi-thread copy: number of streams 4 was bigger than number of chunks 3
2024/12/10 01:19:55 DEBUG : test-multithread-copy-true-10485761-2: Starting multi-thread copy with 3 chunks of size 5Mi with 3 parallel streams
2024/12/10 01:19:55 DEBUG : test-multithread-copy-true-10485761-2: multi-thread copy: chunk 3/3 (10485760-10485761) size 1 starting
2024/12/10 01:19:55 DEBUG : test-multithread-copy-true-10485761-2: multi-thread copy: chunk 2/3 (5242880-10485760) size 5Mi starting
2024/12/10 01:19:55 DEBUG : test-multithread-copy-true-10485761-2: multi-thread copy: chunk 1/3 (0-5242880) size 5Mi starting
2024/12/10 01:19:55 DEBUG : test-multithread-copy-true-10485761-2: Seek from 1 to 0
2024/12/10 01:19:55 DEBUG : test-multithread-copy-true-10485761-2: Seek from 5242880 to 0
2024/12/10 01:19:55 DEBUG : test-multithread-copy-true-10485761-2: Seek from 5242880 to 0
2024/12/10 01:19:55 DEBUG : test-multithread-copy-true-10485761-2: multipart upload wrote chunk 3 with 1 bytes and etag "83878c91171338902e0fe0fb97a8c47a"
2024/12/10 01:19:55 DEBUG : test-multithread-copy-true-10485761-2: multi-thread copy: chunk 3/3 (10485760-10485761) size 1 finished
2024/12/10 01:19:55 DEBUG : test-multithread-copy-true-10485761-2: multipart upload wrote chunk 1 with 5242880 bytes and etag "8dcee45765d4f7d385692e9aab8a5f4e"
2024/12/10 01:19:55 DEBUG : test-multithread-copy-true-10485761-2: multi-thread copy: chunk 1/3 (0-5242880) size 5Mi finished
2024/12/10 01:19:55 DEBUG : test-multithread-copy-true-10485761-2: multipart upload wrote chunk 2 with 5242880 bytes and etag "3ec3bf22bf3729f262713a9200204441"
2024/12/10 01:19:55 DEBUG : test-multithread-copy-true-10485761-2: multi-thread copy: chunk 2/3 (5242880-10485760) size 5Mi finished
2024/12/10 01:19:56 DEBUG : test-multithread-copy-true-10485761-2: multipart upload "AJicf08rur_hfUg4NOTgFGO0TjYMEZbFcuGg8ot4cY3lf3S-meXMwg3DJm8Mw4iEayff9jqD6jbUjNrtP_ouV31EgqtEzlw9ZY68o-uEH3q4I7p6_uKvI2Gu2x1eHlV_Y2CGS20iTpWT2HZXJyhnTr7Cc8t9Ezhu6y_Xmn0KtOOL_-yJJeyFxF7RRyrEWdrW66OIdY5spDvnaF0IERHV-yjpUasquZaYr9YulP4toACDYOkt5sdsL7VqHBBG5qG10pbBRjZOSdLvFdLaMsMCcWOOwGmhI8GrIZgNf_CmxRurBZwZ59QP0oo_MAx_WyoimQlT3mTa6xsL-CDmc2HWpGM" finished
2024/12/10 01:19:56 DEBUG : test-multithread-copy-true-10485761-2: Finished multi-thread copy with 3 parts of size 5Mi
--- PASS: TestMultithreadCopy (15.78s)
--- PASS: TestMultithreadCopy/upload=false,size=10485759,streams=2 (3.00s)
--- PASS: TestMultithreadCopy/upload=false,size=10485760,streams=2 (2.43s)
--- PASS: TestMultithreadCopy/upload=false,size=10485761,streams=2 (2.12s)
--- PASS: TestMultithreadCopy/upload=true,size=10485759,streams=2 (2.39s)
--- PASS: TestMultithreadCopy/upload=true,size=10485760,streams=2 (2.32s)
--- PASS: TestMultithreadCopy/upload=true,size=10485761,streams=2 (2.40s)
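The chunk boundaries printed in the TestMultithreadCopy subtests above are half-open byte ranges of at most one chunk size each, for example 0-5242880, 5242880-10485760 and 10485760-10485761 for the 10485761-byte file. A minimal sketch of that range calculation (illustrative only, not rclone's implementation):

```go
package main

import "fmt"

// chunkRanges splits a file of the given size into half-open [start, end)
// ranges of at most chunkSize bytes, matching the ranges printed in the log.
func chunkRanges(size, chunkSize int64) [][2]int64 {
	var ranges [][2]int64
	for start := int64(0); start < size; start += chunkSize {
		end := start + chunkSize
		if end > size {
			end = size
		}
		ranges = append(ranges, [2]int64{start, end})
	}
	return ranges
}

func main() {
	// size=10485761 with 5 MiB chunks gives the three ranges seen in the
	// upload=true,size=10485761 subtest above.
	ranges := chunkRanges(10485761, 5*1024*1024)
	for i, r := range ranges {
		fmt.Printf("chunk %d/%d (%d-%d)\n", i+1, len(ranges), r[0], r[1])
	}
}
```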
=== RUN TestMultithreadCopyAbort
run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns"
2024/12/10 01:19:57 DEBUG : chunksize-probe: open chunk writer: started multipart upload: ALS6Z6EQpoZHdM2TCov--bd8_ElvMUysVzNfq39dV_YW-yujUX1Mx7uDQniKvi3P3Rs_W1fefPhoucYy-lteWuXrYUs4isqc6aCbVrsx5tMNyhgt5matooRBBn-HVmHcVvayPYqNAbQSan-MX3mEAQizFAfozMtmROUp7j2c50v64iOtQytrbysPM62bLNdk2O4uLRvFgfFWqfXJmnYlm9FvZw7_19K2MlIxNKUF0o2oJ_AVUK-ZpqCnPV1ckSKEFB9AX3RyIrXNWj815nA9BdZkrd6vCtoP2-wMg9EY_3Yfr8pjb2pYg_Q2TuFl4CO4bBSJv6PWUMxchQ4nll0YZZk
2024/12/10 01:19:57 DEBUG : chunksize-probe: multipart upload "ALS6Z6EQpoZHdM2TCov--bd8_ElvMUysVzNfq39dV_YW-yujUX1Mx7uDQniKvi3P3Rs_W1fefPhoucYy-lteWuXrYUs4isqc6aCbVrsx5tMNyhgt5matooRBBn-HVmHcVvayPYqNAbQSan-MX3mEAQizFAfozMtmROUp7j2c50v64iOtQytrbysPM62bLNdk2O4uLRvFgfFWqfXJmnYlm9FvZw7_19K2MlIxNKUF0o2oJ_AVUK-ZpqCnPV1ckSKEFB9AX3RyIrXNWj815nA9BdZkrd6vCtoP2-wMg9EY_3Yfr8pjb2pYg_Q2TuFl4CO4bBSJv6PWUMxchQ4nll0YZZk" aborted
2024/12/10 01:19:58 DEBUG : test-multithread-abort: multi-thread copy: disabling buffering because source is local disk
2024/12/10 01:19:58 DEBUG : test-multithread-abort: open chunk writer: started multipart upload: AE2J-6oeBWKgYBknJZpKN_jRK0q9-yUj3AVQt80bhrJLr-I6hRgqz_7C7EObe0bycaK0N4ZR-tF0rZlj4ePi8JvAsMUF69ZsCT0wHj1Z5yXtRpghw3zCmazifuwgNfRLJqTbDNjQd4nRhHgk3dI9rEsoS0XbWJV0d7MgCwBsLYngqlExMh2jgBRM1mjGLkfv9wRBYxD7zo9luNMOFdPsfAUeYTki8msr2XYjB87RJ1VqZuyZCryTWrOiOXLr5PRYvqq30GqjjqnFNsludYtys6s7cF-XbV4prrHYbKkWVL0xFW6s0eQyOmuJCjiMHGRNqN_oy8EkRU0iuiHYs4U9Hmo
2024/12/10 01:19:58 DEBUG : test-multithread-abort: multi-thread copy: using backend concurrency of 4 instead of --multi-thread-streams 1
2024/12/10 01:19:58 DEBUG : test-multithread-abort: multi-thread copy: number of streams 4 was bigger than number of chunks 3
2024/12/10 01:19:58 DEBUG : test-multithread-abort: Starting multi-thread copy with 3 chunks of size 5Mi with 3 parallel streams
2024/12/10 01:19:58 DEBUG : test-multithread-abort: multi-thread copy: chunk 3/3 (10485760-10485761) size 1 starting
2024/12/10 01:19:58 DEBUG : Open with options = [RangeOption(10485760,10485760)]
2024/12/10 01:19:58 DEBUG : test-multithread-abort: multi-thread copy: chunk 1/3 (0-5242880) size 5Mi starting
2024/12/10 01:19:58 DEBUG : test-multithread-abort: multi-thread copy: chunk 2/3 (5242880-10485760) size 5Mi starting
2024/12/10 01:19:58 DEBUG : Open with options = [RangeOption(0,5242879)]
2024/12/10 01:19:58 DEBUG : Open with options = [RangeOption(5242880,10485759)]
2024/12/10 01:19:58 DEBUG : test-multithread-abort: Seek from 5242880 to 0
2024/12/10 01:19:58 DEBUG : Open with options = [RangeOption(5242880,10485759)]
2024/12/10 01:19:58 DEBUG : test-multithread-abort: Seek from 5242880 to 0
2024/12/10 01:19:58 DEBUG : Open with options = [RangeOption(0,5242879)]
2024/12/10 01:19:58 DEBUG : test-multithread-abort: multipart upload wrote chunk 2 with 5242880 bytes and etag "914c9b170056ebd03f02220c3ec598c5"
2024/12/10 01:19:58 DEBUG : test-multithread-abort: multi-thread copy: chunk 2/3 (5242880-10485760) size 5Mi finished
2024/12/10 01:19:58 DEBUG : test-multithread-abort: multipart upload wrote chunk 1 with 5242880 bytes and etag "b041541eb16e531604a2af6c28f857cc"
2024/12/10 01:19:58 DEBUG : test-multithread-abort: multi-thread copy: chunk 1/3 (0-5242880) size 5Mi finished
2024/12/10 01:19:59 DEBUG : Returning error reader
2024/12/10 01:19:59 DEBUG : BOOM: simulated read failure
2024/12/10 01:19:59 DEBUG : test-multithread-abort: Reopening on read failure after offset 0 bytes: retry 1/10: BOOM: simulated read failure
2024/12/10 01:19:59 DEBUG : Open with options = [RangeOption(10485760,10485760)]
2024/12/10 01:20:00 DEBUG : Returning error reader
2024/12/10 01:20:00 DEBUG : BOOM: simulated read failure
2024/12/10 01:20:00 DEBUG : test-multithread-abort: Reopening on read failure after offset 0 bytes: retry 2/10: BOOM: simulated read failure
2024/12/10 01:20:00 DEBUG : Open with options = [RangeOption(10485760,10485760)]
2024/12/10 01:20:01 DEBUG : Returning error reader
2024/12/10 01:20:01 DEBUG : BOOM: simulated read failure
2024/12/10 01:20:01 DEBUG : test-multithread-abort: Reopening on read failure after offset 0 bytes: retry 3/10: BOOM: simulated read failure
2024/12/10 01:20:01 DEBUG : Open with options = [RangeOption(10485760,10485760)]
2024/12/10 01:20:02 DEBUG : Returning error reader
2024/12/10 01:20:02 DEBUG : BOOM: simulated read failure
2024/12/10 01:20:02 DEBUG : test-multithread-abort: Reopening on read failure after offset 0 bytes: retry 4/10: BOOM: simulated read failure
2024/12/10 01:20:02 DEBUG : Open with options = [RangeOption(10485760,10485760)]
2024/12/10 01:20:03 DEBUG : Returning error reader
2024/12/10 01:20:03 DEBUG : BOOM: simulated read failure
2024/12/10 01:20:03 DEBUG : test-multithread-abort: Reopening on read failure after offset 0 bytes: retry 5/10: BOOM: simulated read failure
2024/12/10 01:20:03 DEBUG : Open with options = [RangeOption(10485760,10485760)]
2024/12/10 01:20:04 DEBUG : Returning error reader
2024/12/10 01:20:04 DEBUG : BOOM: simulated read failure
2024/12/10 01:20:04 DEBUG : test-multithread-abort: Reopening on read failure after offset 0 bytes: retry 6/10: BOOM: simulated read failure
2024/12/10 01:20:04 DEBUG : Open with options = [RangeOption(10485760,10485760)]
2024/12/10 01:20:05 DEBUG : Returning error reader
2024/12/10 01:20:05 DEBUG : BOOM: simulated read failure
2024/12/10 01:20:05 DEBUG : test-multithread-abort: Reopening on read failure after offset 0 bytes: retry 7/10: BOOM: simulated read failure
2024/12/10 01:20:05 DEBUG : Open with options = [RangeOption(10485760,10485760)]
2024/12/10 01:20:06 DEBUG : Returning error reader
2024/12/10 01:20:06 DEBUG : BOOM: simulated read failure
2024/12/10 01:20:06 DEBUG : test-multithread-abort: Reopening on read failure after offset 0 bytes: retry 8/10: BOOM: simulated read failure
2024/12/10 01:20:06 DEBUG : Open with options = [RangeOption(10485760,10485760)]
2024/12/10 01:20:07 DEBUG : Returning error reader
2024/12/10 01:20:07 DEBUG : BOOM: simulated read failure
2024/12/10 01:20:07 DEBUG : test-multithread-abort: Reopening on read failure after offset 0 bytes: retry 9/10: BOOM: simulated read failure
2024/12/10 01:20:07 DEBUG : Open with options = [RangeOption(10485760,10485760)]
2024/12/10 01:20:08 DEBUG : Returning error reader
2024/12/10 01:20:08 DEBUG : BOOM: simulated read failure
2024/12/10 01:20:08 DEBUG : test-multithread-abort: Reopening on read failure after offset 0 bytes: retry 10/10: BOOM: simulated read failure
2024/12/10 01:20:08 DEBUG : test-multithread-abort: Reopen failed after offset 0 bytes read: failed to reopen: too many retries
2024/12/10 01:20:08 DEBUG : test-multithread-abort: multi-thread copy: chunk 3/3 failed: multi-thread copy: failed to write chunk: BOOM: simulated read failure
2024/12/10 01:20:08 DEBUG : test-multithread-abort: multi-thread copy: cancelling transfer on exit
2024/12/10 01:20:08 DEBUG : test-multithread-abort: multipart upload "AE2J-6oeBWKgYBknJZpKN_jRK0q9-yUj3AVQt80bhrJLr-I6hRgqz_7C7EObe0bycaK0N4ZR-tF0rZlj4ePi8JvAsMUF69ZsCT0wHj1Z5yXtRpghw3zCmazifuwgNfRLJqTbDNjQd4nRhHgk3dI9rEsoS0XbWJV0d7MgCwBsLYngqlExMh2jgBRM1mjGLkfv9wRBYxD7zo9luNMOFdPsfAUeYTki8msr2XYjB87RJ1VqZuyZCryTWrOiOXLr5PRYvqq30GqjjqnFNsludYtys6s7cF-XbV4prrHYbKkWVL0xFW6s0eQyOmuJCjiMHGRNqN_oy8EkRU0iuiHYs4U9Hmo" aborted
--- PASS: TestMultithreadCopyAbort (12.05s)
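TestMultithreadCopyAbort above shows the failure path: once one chunk exhausts its read retries, the whole multi-thread copy is cancelled and the multipart upload is aborted rather than completed. A minimal sketch of that "fail one part, cancel the rest" pattern using errgroup (an assumption about the general technique, not rclone's actual code):

```go
package main

import (
	"context"
	"errors"
	"fmt"

	"golang.org/x/sync/errgroup"
)

// uploadChunk stands in for writing one chunk of a multipart upload; here
// chunk 3 always fails, like the simulated read failure in the log above.
func uploadChunk(ctx context.Context, n int) error {
	select {
	case <-ctx.Done():
		return ctx.Err() // another chunk already failed, give up quickly
	default:
	}
	if n == 3 {
		return errors.New("BOOM: simulated read failure")
	}
	return nil
}

func main() {
	g, ctx := errgroup.WithContext(context.Background())
	for n := 1; n <= 3; n++ {
		n := n
		g.Go(func() error { return uploadChunk(ctx, n) })
	}
	if err := g.Wait(); err != nil {
		// On any chunk error the shared context is cancelled; by analogy with
		// the log, the multipart upload would then be aborted, not finished.
		fmt.Println("multi-thread copy: cancelling transfer on exit:", err)
	}
}
```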
=== RUN TestSizeDiffers
--- PASS: TestSizeDiffers (0.00s)
=== RUN TestReOpen
=== RUN TestReOpen/Normal
=== RUN TestReOpen/Normal/Basics
2024/12/10 01:20:09 DEBUG : potato: Seek from 10 to 0
=== RUN TestReOpen/Normal/ErrorAtStart
=== RUN TestReOpen/Normal/WithErrors
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 1/10: test error
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 2/10: test error
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 6 bytes: retry 3/10: test error
=== RUN TestReOpen/Normal/TooManyErrors
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 1/3: test error
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 2/3: test error
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 6 bytes: retry 3/3: test error
2024/12/10 01:20:09 DEBUG : potato: Reopen failed after offset 6 bytes read: failed to reopen: too many retries
=== RUN TestReOpen/Normal/Seek
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 0/10: test error
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 1/10: test error
2024/12/10 01:20:09 DEBUG : potato: Seek from 5 to 2
=== RUN TestReOpen/Normal/AccountRead
=== RUN TestReOpen/Normal/AccountReadDelay
2024/12/10 01:20:09 DEBUG : potato: Seek from 10 to 0
2024/12/10 01:20:09 DEBUG : potato: Seek from 10 to 0
2024/12/10 01:20:09 DEBUG : potato: Seek from 10 to 0
=== RUN TestReOpen/Normal/AccountReadError
=== RUN TestReOpen/WithRangeOption
=== RUN TestReOpen/WithRangeOption/Basics
2024/12/10 01:20:09 DEBUG : potato: Seek from 7 to 0
=== RUN TestReOpen/WithRangeOption/ErrorAtStart
=== RUN TestReOpen/WithRangeOption/WithErrors
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 1/10: test error
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 2/10: test error
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 6 bytes: retry 3/10: test error
=== RUN TestReOpen/WithRangeOption/TooManyErrors
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 1/3: test error
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 2/3: test error
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 6 bytes: retry 3/3: test error
2024/12/10 01:20:09 DEBUG : potato: Reopen failed after offset 6 bytes read: failed to reopen: too many retries
=== RUN TestReOpen/WithRangeOption/Seek
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 0/10: test error
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 1/10: test error
2024/12/10 01:20:09 DEBUG : potato: Seek from 5 to 2
2024/12/10 01:20:09 DEBUG : potato: Seek from 7 to 4
=== RUN TestReOpen/WithRangeOption/AccountRead
=== RUN TestReOpen/WithRangeOption/AccountReadDelay
2024/12/10 01:20:09 DEBUG : potato: Seek from 7 to 0
2024/12/10 01:20:09 DEBUG : potato: Seek from 7 to 0
2024/12/10 01:20:09 DEBUG : potato: Seek from 7 to 0
=== RUN TestReOpen/WithRangeOption/AccountReadError
=== RUN TestReOpen/WithSeekOption
=== RUN TestReOpen/WithSeekOption/Basics
2024/12/10 01:20:09 DEBUG : potato: Seek from 8 to 0
=== RUN TestReOpen/WithSeekOption/ErrorAtStart
=== RUN TestReOpen/WithSeekOption/WithErrors
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 1/10: test error
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 2/10: test error
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 6 bytes: retry 3/10: test error
=== RUN TestReOpen/WithSeekOption/TooManyErrors
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 1/3: test error
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 2/3: test error
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 6 bytes: retry 3/3: test error
2024/12/10 01:20:09 DEBUG : potato: Reopen failed after offset 6 bytes read: failed to reopen: too many retries
=== RUN TestReOpen/WithSeekOption/Seek
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 0/10: test error
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 1/10: test error
2024/12/10 01:20:09 DEBUG : potato: Seek from 5 to 2
2024/12/10 01:20:09 DEBUG : potato: Seek from 7 to 5
=== RUN TestReOpen/WithSeekOption/AccountRead
=== RUN TestReOpen/WithSeekOption/AccountReadDelay
2024/12/10 01:20:09 DEBUG : potato: Seek from 8 to 0
2024/12/10 01:20:09 DEBUG : potato: Seek from 8 to 0
2024/12/10 01:20:09 DEBUG : potato: Seek from 8 to 0
=== RUN TestReOpen/WithSeekOption/AccountReadError
=== RUN TestReOpen/UnknownSize
=== RUN TestReOpen/UnknownSize/Basics
2024/12/10 01:20:09 DEBUG : potato: Seek from 9 to 0
=== RUN TestReOpen/UnknownSize/ErrorAtStart
=== RUN TestReOpen/UnknownSize/WithErrors
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 1/10: test error
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 2/10: test error
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 6 bytes: retry 3/10: test error
=== RUN TestReOpen/UnknownSize/TooManyErrors
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 1/3: test error
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 2/3: test error
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 6 bytes: retry 3/3: test error
2024/12/10 01:20:09 DEBUG : potato: Reopen failed after offset 6 bytes read: failed to reopen: too many retries
=== RUN TestReOpen/UnknownSize/Seek
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 2 bytes: retry 0/10: test error
2024/12/10 01:20:09 DEBUG : potato: Reopening on read failure after offset 3 bytes: retry 1/10: test error
2024/12/10 01:20:09 DEBUG : potato: Seek from 5 to 2
2024/12/10 01:20:09 DEBUG : potato: Seek from 7 to 6
=== RUN TestReOpen/UnknownSize/AccountRead
=== RUN TestReOpen/UnknownSize/AccountReadDelay
2024/12/10 01:20:09 DEBUG : potato: Seek from 9 to 0
2024/12/10 01:20:09 DEBUG : potato: Seek from 9 to 0
2024/12/10 01:20:09 DEBUG : potato: Seek from 9 to 0
=== RUN TestReOpen/UnknownSize/AccountReadError
--- PASS: TestReOpen (0.00s)
--- PASS: TestReOpen/Normal (0.00s)
--- PASS: TestReOpen/Normal/Basics (0.00s)
--- PASS: TestReOpen/Normal/ErrorAtStart (0.00s)
--- PASS: TestReOpen/Normal/WithErrors (0.00s)
--- PASS: TestReOpen/Normal/TooManyErrors (0.00s)
--- PASS: TestReOpen/Normal/Seek (0.00s)
--- PASS: TestReOpen/Normal/AccountRead (0.00s)
--- PASS: TestReOpen/Normal/AccountReadDelay (0.00s)
--- PASS: TestReOpen/Normal/AccountReadError (0.00s)
--- PASS: TestReOpen/WithRangeOption (0.00s)
--- PASS: TestReOpen/WithRangeOption/Basics (0.00s)
--- PASS: TestReOpen/WithRangeOption/ErrorAtStart (0.00s)
--- PASS: TestReOpen/WithRangeOption/WithErrors (0.00s)
--- PASS: TestReOpen/WithRangeOption/TooManyErrors (0.00s)
--- PASS: TestReOpen/WithRangeOption/Seek (0.00s)
--- PASS: TestReOpen/WithRangeOption/AccountRead (0.00s)
--- PASS: TestReOpen/WithRangeOption/AccountReadDelay (0.00s)
--- PASS: TestReOpen/WithRangeOption/AccountReadError (0.00s)
--- PASS: TestReOpen/WithSeekOption (0.00s)
--- PASS: TestReOpen/WithSeekOption/Basics (0.00s)
--- PASS: TestReOpen/WithSeekOption/ErrorAtStart (0.00s)
--- PASS: TestReOpen/WithSeekOption/WithErrors (0.00s)
--- PASS: TestReOpen/WithSeekOption/TooManyErrors (0.00s)
--- PASS: TestReOpen/WithSeekOption/Seek (0.00s)
--- PASS: TestReOpen/WithSeekOption/AccountRead (0.00s)
--- PASS: TestReOpen/WithSeekOption/AccountReadDelay (0.00s)
--- PASS: TestReOpen/WithSeekOption/AccountReadError (0.00s)
--- PASS: TestReOpen/UnknownSize (0.00s)
--- PASS: TestReOpen/UnknownSize/Basics (0.00s)
--- PASS: TestReOpen/UnknownSize/ErrorAtStart (0.00s)
--- PASS: TestReOpen/UnknownSize/WithErrors (0.00s)
--- PASS: TestReOpen/UnknownSize/TooManyErrors (0.00s)
--- PASS: TestReOpen/UnknownSize/Seek (0.00s)
--- PASS: TestReOpen/UnknownSize/AccountRead (0.00s)
--- PASS: TestReOpen/UnknownSize/AccountReadDelay (0.00s)
--- PASS: TestReOpen/UnknownSize/AccountReadError (0.00s)
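The TestReOpen output above exercises a reader that, when a read fails mid-stream, re-opens the source at the current offset and carries on, giving up with "failed to reopen: too many retries" once the retry budget is spent. A compact sketch of that behaviour (an illustration of the pattern logged above, not rclone's actual reopen type; flakyReader and readWithReopen are made-up names):

```go
package main

import (
	"errors"
	"fmt"
	"io"
	"strings"
)

// flakyReader fails once after reading failAt bytes, like the injected "test error" above.
type flakyReader struct {
	r      io.Reader
	failAt int
	read   int
	failed bool
}

func (f *flakyReader) Read(p []byte) (int, error) {
	if !f.failed && f.read >= f.failAt {
		f.failed = true
		return 0, errors.New("test error")
	}
	n, err := f.r.Read(p[:1]) // one byte at a time keeps the offsets obvious
	f.read += n
	return n, err
}

// readWithReopen retries a failed read by re-opening at the current offset, up to maxTries times.
func readWithReopen(open func(offset int) (io.Reader, error), maxTries int) (string, error) {
	var out strings.Builder
	offset, tries := 0, 0
	r, err := open(0)
	if err != nil {
		return "", err
	}
	buf := make([]byte, 1)
	for {
		n, err := r.Read(buf)
		out.Write(buf[:n])
		offset += n
		switch {
		case err == io.EOF:
			return out.String(), nil
		case err != nil:
			tries++
			if tries > maxTries {
				return out.String(), errors.New("failed to reopen: too many retries")
			}
			fmt.Printf("Reopening on read failure after offset %d bytes: retry %d/%d: %v\n", offset, tries, maxTries, err)
			if r, err = open(offset); err != nil {
				return out.String(), err
			}
		}
	}
}

func main() {
	data := "potato"
	open := func(offset int) (io.Reader, error) {
		return &flakyReader{r: strings.NewReader(data[offset:]), failAt: 2}, nil
	}
	s, err := readWithReopen(open, 10)
	fmt.Println(s, err)
}
```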
/tmp/rclone2474542083", Modify Window "1ns" === RUN TestCheck/1 === RUN TestCheck/2 === RUN TestCheck/3 === RUN TestCheck/4 === RUN TestCheck/5 === RUN TestCheck/6 === RUN TestCheck/7 --- PASS: TestCheck (3.79s) --- PASS: TestCheck/1 (0.07s) --- PASS: TestCheck/2 (0.08s) --- PASS: TestCheck/3 (0.08s) --- PASS: TestCheck/4 (0.08s) --- PASS: TestCheck/5 (0.08s) --- PASS: TestCheck/6 (0.07s) --- PASS: TestCheck/7 (0.21s) === RUN TestCheckFsError 2024/12/10 01:20:12 DEBUG : Creating backend with remote "nonexistent" 2024/12/10 01:20:12 DEBUG : Creating backend with remote "nonexistent" 2024/12/10 01:20:12 DEBUG : Local file system at /home/rclone/go/src/github.com/rclone/rclone/fs/operations/nonexistent: Waiting for checks to finish 2024/12/10 01:20:12 ERROR : Local file system at /home/rclone/go/src/github.com/rclone/rclone/fs/operations/nonexistent: error reading source root directory: directory not found 2024/12/10 01:20:12 NOTICE: Local file system at /home/rclone/go/src/github.com/rclone/rclone/fs/operations/nonexistent: 2 differences found 2024/12/10 01:20:12 NOTICE: Local file system at /home/rclone/go/src/github.com/rclone/rclone/fs/operations/nonexistent: 2 errors while checking --- PASS: TestCheckFsError (0.00s) === RUN TestCheckDownload run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" === RUN TestCheckDownload/1 === RUN TestCheckDownload/2 === RUN TestCheckDownload/3 === RUN TestCheckDownload/4 === RUN TestCheckDownload/5 === RUN TestCheckDownload/6 === RUN TestCheckDownload/7 --- PASS: TestCheckDownload (3.99s) --- PASS: TestCheckDownload/1 (0.24s) --- PASS: TestCheckDownload/2 (0.14s) --- PASS: TestCheckDownload/3 (0.17s) --- PASS: TestCheckDownload/4 (0.21s) --- PASS: TestCheckDownload/5 (0.20s) --- PASS: TestCheckDownload/6 (0.24s) --- PASS: TestCheckDownload/7 (0.14s) === RUN TestCheckSizeOnly run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" === RUN TestCheckSizeOnly/1 === RUN TestCheckSizeOnly/2 === RUN TestCheckSizeOnly/3 === RUN TestCheckSizeOnly/4 === RUN TestCheckSizeOnly/5 === RUN TestCheckSizeOnly/6 === RUN TestCheckSizeOnly/7 --- PASS: TestCheckSizeOnly (3.09s) --- PASS: TestCheckSizeOnly/1 (0.07s) --- PASS: TestCheckSizeOnly/2 (0.06s) --- PASS: TestCheckSizeOnly/3 (0.07s) --- PASS: TestCheckSizeOnly/4 (0.07s) --- PASS: TestCheckSizeOnly/5 (0.06s) --- PASS: TestCheckSizeOnly/6 (0.10s) --- PASS: TestCheckSizeOnly/7 (0.09s) === RUN TestCheckEqualReaders --- PASS: TestCheckEqualReaders (0.00s) === RUN TestParseSumFile run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:20:20 NOTICE: test.sum: improperly formatted checksum line 4 2024/12/10 01:20:20 NOTICE: test.sum: improperly formatted checksum line 5 2024/12/10 01:20:20 NOTICE: test.sum: improperly formatted checksum line 6 2024/12/10 01:20:20 NOTICE: test.sum: 2 warning(s) suppressed... 2024/12/10 01:20:20 NOTICE: test.sum: improperly formatted checksum line 4 2024/12/10 01:20:20 NOTICE: test.sum: improperly formatted checksum line 5 2024/12/10 01:20:20 NOTICE: test.sum: improperly formatted checksum line 6 2024/12/10 01:20:20 NOTICE: test.sum: 2 warning(s) suppressed... 
--- PASS: TestParseSumFile (1.18s)
=== RUN TestCheckSum
run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns"
2024/12/10 01:20:21 DEBUG : Creating backend with remote "TestS3R2:rclone-test-nutecod8gavi/data"
=== RUN TestCheckSum/subtest1
=== RUN TestCheckSum/subtest2
=== RUN TestCheckSum/subtest3
=== RUN TestCheckSum/subtest4
=== RUN TestCheckSum/subtest5
=== RUN TestCheckSum/subtest6
=== RUN TestCheckSum/subtest7
--- PASS: TestCheckSum (7.46s)
--- PASS: TestCheckSum/subtest1 (0.31s)
--- PASS: TestCheckSum/subtest2 (0.25s)
--- PASS: TestCheckSum/subtest3 (0.28s)
--- PASS: TestCheckSum/subtest4 (0.32s)
--- PASS: TestCheckSum/subtest5 (0.28s)
--- PASS: TestCheckSum/subtest6 (0.24s)
--- PASS: TestCheckSum/subtest7 (0.26s)
=== RUN TestCheckSumDownload
run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns"
2024/12/10 01:20:28 DEBUG : Creating backend with remote "TestS3R2:rclone-test-nutecod8gavi/data"
=== RUN TestCheckSumDownload/subtest1
=== RUN TestCheckSumDownload/subtest2
=== RUN TestCheckSumDownload/subtest3
=== RUN TestCheckSumDownload/subtest4
=== RUN TestCheckSumDownload/subtest5
=== RUN TestCheckSumDownload/subtest6
=== RUN TestCheckSumDownload/subtest7
--- PASS: TestCheckSumDownload (9.11s)
--- PASS: TestCheckSumDownload/subtest1 (0.46s)
--- PASS: TestCheckSumDownload/subtest2 (0.35s)
--- PASS: TestCheckSumDownload/subtest3 (0.68s)
--- PASS: TestCheckSumDownload/subtest4 (0.53s)
--- PASS: TestCheckSumDownload/subtest5 (0.39s)
--- PASS: TestCheckSumDownload/subtest6 (0.39s)
--- PASS: TestCheckSumDownload/subtest7 (0.43s)
=== RUN TestApplyTransforms
2024/12/10 01:20:37 DEBUG : Creating backend with remote "TestS3R2:rclone-test-jusipaz2qube"
2024/12/10 01:20:37 DEBUG : Using config file from "/home/rclone/.rclone.conf"
2024/12/10 01:20:37 DEBUG : Creating backend with remote "/tmp/rclone3365583250"
run.go:180: Remote "S3 bucket rclone-test-jusipaz2qube", Local "Local file system at /tmp/rclone3365583250", Modify Window "1ns"
2024/12/10 01:20:38 INFO : S3 bucket rclone-test-jusipaz2qube: Bucket "rclone-test-jusipaz2qube" created with ACL ""
upper checkfile vs. lower remote (without normalization)
2024/12/10 01:20:38 ERROR : hello, world!: sum not found
2024/12/10 01:20:38 ERROR : HELLO, WORLD!: file not in S3 bucket rclone-test-jusipaz2qube
2024/12/10 01:20:38 NOTICE: S3 bucket rclone-test-jusipaz2qube: 1 files missing
2024/12/10 01:20:38 NOTICE: 1 hashes missing
2024/12/10 01:20:38 NOTICE: S3 bucket rclone-test-jusipaz2qube: 2 differences found
2024/12/10 01:20:38 NOTICE: S3 bucket rclone-test-jusipaz2qube: 2 errors while checking
upper checkfile vs. lower remote (with normalization)
2024/12/10 01:20:38 DEBUG : hello, world!: md5 = 65a8e27d8879283831b664bd8b7f0ad4 OK
2024/12/10 01:20:38 NOTICE: S3 bucket rclone-test-jusipaz2qube: 0 differences found
2024/12/10 01:20:38 NOTICE: S3 bucket rclone-test-jusipaz2qube: 1 matching files
2024/12/10 01:20:38 DEBUG : Creating backend with remote "TestS3R2:rclone-test-noviqic8rucu"
2024/12/10 01:20:38 DEBUG : Using config file from "/home/rclone/.rclone.conf"
2024/12/10 01:20:38 DEBUG : Creating backend with remote "/tmp/rclone2740669707"
run.go:180: Remote "S3 bucket rclone-test-noviqic8rucu", Local "Local file system at /tmp/rclone2740669707", Modify Window "1ns"
2024/12/10 01:20:39 INFO : S3 bucket rclone-test-noviqic8rucu: Bucket "rclone-test-noviqic8rucu" created with ACL ""
lower checkfile vs. upper remote (without normalization)
2024/12/10 01:20:40 ERROR : HELLO, WORLD!: sum not found
2024/12/10 01:20:40 ERROR : hello, world!: file not in S3 bucket rclone-test-noviqic8rucu
2024/12/10 01:20:40 NOTICE: S3 bucket rclone-test-noviqic8rucu: 1 files missing
2024/12/10 01:20:40 NOTICE: 1 hashes missing
2024/12/10 01:20:40 NOTICE: S3 bucket rclone-test-noviqic8rucu: 2 differences found
2024/12/10 01:20:40 NOTICE: S3 bucket rclone-test-noviqic8rucu: 2 errors while checking
lower checkfile vs. upper remote (with normalization)
2024/12/10 01:20:40 DEBUG : HELLO, WORLD!: md5 = 65a8e27d8879283831b664bd8b7f0ad4 OK
2024/12/10 01:20:40 NOTICE: S3 bucket rclone-test-noviqic8rucu: 0 differences found
2024/12/10 01:20:40 NOTICE: S3 bucket rclone-test-noviqic8rucu: 1 matching files
2024/12/10 01:20:40 DEBUG : Creating backend with remote "TestS3R2:rclone-test-xupevij4rase"
2024/12/10 01:20:40 DEBUG : Using config file from "/home/rclone/.rclone.conf"
2024/12/10 01:20:40 DEBUG : Creating backend with remote "/tmp/rclone1039317675"
run.go:180: Remote "S3 bucket rclone-test-xupevij4rase", Local "Local file system at /tmp/rclone1039317675", Modify Window "1ns"
2024/12/10 01:20:41 INFO : S3 bucket rclone-test-xupevij4rase: Bucket "rclone-test-xupevij4rase" created with ACL ""
lower checkfile vs. upperlowermixed remote (without normalization)
2024/12/10 01:20:41 ERROR : HeLlO, wOrLd!: sum not found
2024/12/10 01:20:41 ERROR : hello, world!: file not in S3 bucket rclone-test-xupevij4rase
2024/12/10 01:20:41 NOTICE: S3 bucket rclone-test-xupevij4rase: 1 files missing
2024/12/10 01:20:41 NOTICE: 1 hashes missing
2024/12/10 01:20:41 NOTICE: S3 bucket rclone-test-xupevij4rase: 2 differences found
2024/12/10 01:20:41 NOTICE: S3 bucket rclone-test-xupevij4rase: 2 errors while checking
lower checkfile vs. upperlowermixed remote (with normalization)
2024/12/10 01:20:41 DEBUG : HeLlO, wOrLd!: md5 = 65a8e27d8879283831b664bd8b7f0ad4 OK
2024/12/10 01:20:41 NOTICE: S3 bucket rclone-test-xupevij4rase: 0 differences found
2024/12/10 01:20:41 NOTICE: S3 bucket rclone-test-xupevij4rase: 1 matching files
2024/12/10 01:20:41 DEBUG : Creating backend with remote "TestS3R2:rclone-test-rodomap5yeqo"
2024/12/10 01:20:41 DEBUG : Using config file from "/home/rclone/.rclone.conf"
2024/12/10 01:20:41 DEBUG : Creating backend with remote "/tmp/rclone3081467959"
run.go:180: Remote "S3 bucket rclone-test-rodomap5yeqo", Local "Local file system at /tmp/rclone3081467959", Modify Window "1ns"
2024/12/10 01:20:42 INFO : S3 bucket rclone-test-rodomap5yeqo: Bucket "rclone-test-rodomap5yeqo" created with ACL ""
upperlowermixed checkfile vs. upper remote (without normalization)
2024/12/10 01:20:42 ERROR : HELLO, WORLD!: sum not found
2024/12/10 01:20:42 ERROR : HeLlO, wOrLd!: file not in S3 bucket rclone-test-rodomap5yeqo
2024/12/10 01:20:42 NOTICE: S3 bucket rclone-test-rodomap5yeqo: 1 files missing
2024/12/10 01:20:42 NOTICE: 1 hashes missing
2024/12/10 01:20:42 NOTICE: S3 bucket rclone-test-rodomap5yeqo: 2 differences found
2024/12/10 01:20:42 NOTICE: S3 bucket rclone-test-rodomap5yeqo: 2 errors while checking
upperlowermixed checkfile vs. upper remote (with normalization)
2024/12/10 01:20:43 DEBUG : HELLO, WORLD!: md5 = 65a8e27d8879283831b664bd8b7f0ad4 OK
2024/12/10 01:20:43 NOTICE: S3 bucket rclone-test-rodomap5yeqo: 0 differences found
2024/12/10 01:20:43 NOTICE: S3 bucket rclone-test-rodomap5yeqo: 1 matching files
2024/12/10 01:20:43 DEBUG : Creating backend with remote "TestS3R2:rclone-test-ruqeraj6zobu"
2024/12/10 01:20:43 DEBUG : Using config file from "/home/rclone/.rclone.conf"
2024/12/10 01:20:43 DEBUG : Creating backend with remote "/tmp/rclone3822505833"
run.go:180: Remote "S3 bucket rclone-test-ruqeraj6zobu", Local "Local file system at /tmp/rclone3822505833", Modify Window "1ns"
2024/12/10 01:20:43 INFO : S3 bucket rclone-test-ruqeraj6zobu: Bucket "rclone-test-ruqeraj6zobu" created with ACL ""
NFD checkfile vs. NFC remote (without normalization)
2024/12/10 01:20:44 ERROR : 測試_Русский___ě_áñ: sum not found
2024/12/10 01:20:44 ERROR : 測試_Русский___ě_áñ: file not in S3 bucket rclone-test-ruqeraj6zobu
2024/12/10 01:20:44 NOTICE: S3 bucket rclone-test-ruqeraj6zobu: 1 files missing
2024/12/10 01:20:44 NOTICE: 1 hashes missing
2024/12/10 01:20:44 NOTICE: S3 bucket rclone-test-ruqeraj6zobu: 2 differences found
2024/12/10 01:20:44 NOTICE: S3 bucket rclone-test-ruqeraj6zobu: 2 errors while checking
NFD checkfile vs. NFC remote (with normalization)
2024/12/10 01:20:44 DEBUG : 測試_Русский___ě_áñ: md5 = 65a8e27d8879283831b664bd8b7f0ad4 OK
2024/12/10 01:20:44 NOTICE: S3 bucket rclone-test-ruqeraj6zobu: 0 differences found
2024/12/10 01:20:44 NOTICE: S3 bucket rclone-test-ruqeraj6zobu: 1 matching files
2024/12/10 01:20:44 DEBUG : Creating backend with remote "TestS3R2:rclone-test-wosatob6pamu"
2024/12/10 01:20:44 DEBUG : Using config file from "/home/rclone/.rclone.conf"
2024/12/10 01:20:44 DEBUG : Creating backend with remote "/tmp/rclone2621339361"
run.go:180: Remote "S3 bucket rclone-test-wosatob6pamu", Local "Local file system at /tmp/rclone2621339361", Modify Window "1ns"
2024/12/10 01:20:45 INFO : S3 bucket rclone-test-wosatob6pamu: Bucket "rclone-test-wosatob6pamu" created with ACL ""
NFC checkfile vs. NFD remote (without normalization)
2024/12/10 01:20:45 ERROR : 測試_Русский___ě_áñ: sum not found
2024/12/10 01:20:45 ERROR : 測試_Русский___ě_áñ: file not in S3 bucket rclone-test-wosatob6pamu
2024/12/10 01:20:45 NOTICE: S3 bucket rclone-test-wosatob6pamu: 1 files missing
2024/12/10 01:20:45 NOTICE: 1 hashes missing
2024/12/10 01:20:45 NOTICE: S3 bucket rclone-test-wosatob6pamu: 2 differences found
2024/12/10 01:20:45 NOTICE: S3 bucket rclone-test-wosatob6pamu: 2 errors while checking
NFC checkfile vs. NFD remote (with normalization)
2024/12/10 01:20:46 DEBUG : 測試_Русский___ě_áñ: md5 = 65a8e27d8879283831b664bd8b7f0ad4 OK
2024/12/10 01:20:46 NOTICE: S3 bucket rclone-test-wosatob6pamu: 0 differences found
2024/12/10 01:20:46 NOTICE: S3 bucket rclone-test-wosatob6pamu: 1 matching files
2024/12/10 01:20:46 DEBUG : Creating backend with remote "TestS3R2:rclone-test-nijogum2lavi"
2024/12/10 01:20:46 DEBUG : Using config file from "/home/rclone/.rclone.conf"
2024/12/10 01:20:46 DEBUG : Creating backend with remote "/tmp/rclone2729417370"
run.go:180: Remote "S3 bucket rclone-test-nijogum2lavi", Local "Local file system at /tmp/rclone2729417370", Modify Window "1ns"
2024/12/10 01:20:47 INFO : S3 bucket rclone-test-nijogum2lavi: Bucket "rclone-test-nijogum2lavi" created with ACL ""
NFDx2 checkfile vs. both remote (without normalization)
2024/12/10 01:20:47 ERROR : 測試_Русский___ě_áñ測試_Русский___ě_áñ: sum not found
2024/12/10 01:20:47 ERROR : 測試_Русский___ě_áñ測試_Русский___ě_áñ: file not in S3 bucket rclone-test-nijogum2lavi
2024/12/10 01:20:47 NOTICE: S3 bucket rclone-test-nijogum2lavi: 1 files missing
2024/12/10 01:20:47 NOTICE: 1 hashes missing
2024/12/10 01:20:47 NOTICE: S3 bucket rclone-test-nijogum2lavi: 2 differences found
2024/12/10 01:20:47 NOTICE: S3 bucket rclone-test-nijogum2lavi: 2 errors while checking
NFDx2 checkfile vs. both remote (with normalization)
2024/12/10 01:20:47 DEBUG : 測試_Русский___ě_áñ測試_Русский___ě_áñ: md5 = 65a8e27d8879283831b664bd8b7f0ad4 OK
2024/12/10 01:20:47 NOTICE: S3 bucket rclone-test-nijogum2lavi: 0 differences found
2024/12/10 01:20:47 NOTICE: S3 bucket rclone-test-nijogum2lavi: 1 matching files
2024/12/10 01:20:47 DEBUG : Creating backend with remote "TestS3R2:rclone-test-socitut5xipi"
2024/12/10 01:20:47 DEBUG : Using config file from "/home/rclone/.rclone.conf"
2024/12/10 01:20:47 DEBUG : Creating backend with remote "/tmp/rclone1151635221"
run.go:180: Remote "S3 bucket rclone-test-socitut5xipi", Local "Local file system at /tmp/rclone1151635221", Modify Window "1ns"
2024/12/10 01:20:48 INFO : S3 bucket rclone-test-socitut5xipi: Bucket "rclone-test-socitut5xipi" created with ACL ""
NFCx2 checkfile vs. both remote (without normalization)
2024/12/10 01:20:48 ERROR : 測試_Русский___ě_áñ測試_Русский___ě_áñ: sum not found
2024/12/10 01:20:48 ERROR : 測試_Русский___ě_áñ測試_Русский___ě_áñ: file not in S3 bucket rclone-test-socitut5xipi
2024/12/10 01:20:48 NOTICE: S3 bucket rclone-test-socitut5xipi: 1 files missing
2024/12/10 01:20:48 NOTICE: 1 hashes missing
2024/12/10 01:20:48 NOTICE: S3 bucket rclone-test-socitut5xipi: 2 differences found
2024/12/10 01:20:48 NOTICE: S3 bucket rclone-test-socitut5xipi: 2 errors while checking
NFCx2 checkfile vs. both remote (with normalization)
2024/12/10 01:20:49 DEBUG : 測試_Русский___ě_áñ測試_Русский___ě_áñ: md5 = 65a8e27d8879283831b664bd8b7f0ad4 OK
2024/12/10 01:20:49 NOTICE: S3 bucket rclone-test-socitut5xipi: 0 differences found
2024/12/10 01:20:49 NOTICE: S3 bucket rclone-test-socitut5xipi: 1 matching files
2024/12/10 01:20:49 DEBUG : Creating backend with remote "TestS3R2:rclone-test-guvokam4wudi"
2024/12/10 01:20:49 DEBUG : Using config file from "/home/rclone/.rclone.conf"
2024/12/10 01:20:49 DEBUG : Creating backend with remote "/tmp/rclone399014328"
run.go:180: Remote "S3 bucket rclone-test-guvokam4wudi", Local "Local file system at /tmp/rclone399014328", Modify Window "1ns"
2024/12/10 01:20:49 INFO : S3 bucket rclone-test-guvokam4wudi: Bucket "rclone-test-guvokam4wudi" created with ACL ""
both checkfile vs. NFDx2 remote (without normalization)
2024/12/10 01:20:50 ERROR : 測試_Русский___ě_áñ測試_Русский___ě_áñ: sum not found
2024/12/10 01:20:50 ERROR : 測試_Русский___ě_áñ測試_Русский___ě_áñ: file not in S3 bucket rclone-test-guvokam4wudi
2024/12/10 01:20:50 NOTICE: S3 bucket rclone-test-guvokam4wudi: 1 files missing
2024/12/10 01:20:50 NOTICE: 1 hashes missing
2024/12/10 01:20:50 NOTICE: S3 bucket rclone-test-guvokam4wudi: 2 differences found
2024/12/10 01:20:50 NOTICE: S3 bucket rclone-test-guvokam4wudi: 2 errors while checking
both checkfile vs. NFDx2 remote (with normalization)
2024/12/10 01:20:50 DEBUG : 測試_Русский___ě_áñ測試_Русский___ě_áñ: md5 = 65a8e27d8879283831b664bd8b7f0ad4 OK
2024/12/10 01:20:50 NOTICE: S3 bucket rclone-test-guvokam4wudi: 0 differences found
2024/12/10 01:20:50 NOTICE: S3 bucket rclone-test-guvokam4wudi: 1 matching files
2024/12/10 01:20:50 DEBUG : Creating backend with remote "TestS3R2:rclone-test-ruximas1yage"
2024/12/10 01:20:50 DEBUG : Using config file from "/home/rclone/.rclone.conf"
2024/12/10 01:20:50 DEBUG : Creating backend with remote "/tmp/rclone3635942596"
run.go:180: Remote "S3 bucket rclone-test-ruximas1yage", Local "Local file system at /tmp/rclone3635942596", Modify Window "1ns"
2024/12/10 01:20:51 INFO : S3 bucket rclone-test-ruximas1yage: Bucket "rclone-test-ruximas1yage" created with ACL ""
both checkfile vs. NFCx2 remote (without normalization)
2024/12/10 01:20:51 ERROR : 測試_Русский___ě_áñ測試_Русский___ě_áñ: sum not found
2024/12/10 01:20:51 ERROR : 測試_Русский___ě_áñ測試_Русский___ě_áñ: file not in S3 bucket rclone-test-ruximas1yage
2024/12/10 01:20:51 NOTICE: S3 bucket rclone-test-ruximas1yage: 1 files missing
2024/12/10 01:20:51 NOTICE: 1 hashes missing
2024/12/10 01:20:51 NOTICE: S3 bucket rclone-test-ruximas1yage: 2 differences found
2024/12/10 01:20:51 NOTICE: S3 bucket rclone-test-ruximas1yage: 2 errors while checking
both checkfile vs. NFCx2 remote (with normalization)
2024/12/10 01:20:51 DEBUG : 測試_Русский___ě_áñ測試_Русский___ě_áñ: md5 = 65a8e27d8879283831b664bd8b7f0ad4 OK
2024/12/10 01:20:51 NOTICE: S3 bucket rclone-test-ruximas1yage: 0 differences found
2024/12/10 01:20:51 NOTICE: S3 bucket rclone-test-ruximas1yage: 1 matching files
2024/12/10 01:20:51 DEBUG : S3 bucket rclone-test-ruximas1yage: Purge remote
2024/12/10 01:20:52 DEBUG : S3 bucket rclone-test-ruximas1yage: bucket is versioned: false
2024/12/10 01:20:52 DEBUG : Waiting for deletions to finish
2024/12/10 01:20:52 DEBUG : "測試_Русский___ě_áñ測試_Русский___ě_áñ" version false
2024/12/10 01:20:52 DEBUG : 測試_Русский___ě_áñ測試_Русский___ě_áñ: Deleting (id "")
2024/12/10 01:20:52 INFO : 測試_Русский___ě_áñ測試_Русский___ě_áñ: Deleted
2024/12/10 01:20:52 INFO : S3 bucket rclone-test-ruximas1yage: Bucket "rclone-test-ruximas1yage" deleted
2024/12/10 01:20:52 DEBUG : S3 bucket rclone-test-guvokam4wudi: Purge remote
2024/12/10 01:20:53 DEBUG : S3 bucket rclone-test-guvokam4wudi: bucket is versioned: false
2024/12/10 01:20:53 DEBUG : Waiting for deletions to finish
2024/12/10 01:20:53 DEBUG : "測試_Русский___ě_áñ測試_Русский___ě_áñ" version false
2024/12/10 01:20:53 DEBUG : 測試_Русский___ě_áñ測試_Русский___ě_áñ: Deleting (id "")
2024/12/10 01:20:53 INFO : 測試_Русский___ě_áñ測試_Русский___ě_áñ: Deleted
2024/12/10 01:20:54 INFO : S3 bucket rclone-test-guvokam4wudi: Bucket "rclone-test-guvokam4wudi" deleted
2024/12/10 01:20:54 DEBUG : S3 bucket rclone-test-socitut5xipi: Purge remote
2024/12/10 01:20:54 DEBUG : S3 bucket rclone-test-socitut5xipi: bucket is versioned: false
2024/12/10 01:20:54 DEBUG : Waiting for deletions to finish
2024/12/10 01:20:54 DEBUG : "測試_Русский___ě_áñ測試_Русский___ě_áñ" version false
2024/12/10 01:20:54 DEBUG : 測試_Русский___ě_áñ測試_Русский___ě_áñ: Deleting (id "")
2024/12/10 01:20:54 INFO : 測試_Русский___ě_áñ測試_Русский___ě_áñ: Deleted
2024/12/10 01:20:55 INFO : S3 bucket rclone-test-socitut5xipi: Bucket "rclone-test-socitut5xipi" deleted
2024/12/10 01:20:55 DEBUG : S3 bucket rclone-test-nijogum2lavi: Purge remote
2024/12/10 01:20:55 DEBUG : S3 bucket rclone-test-nijogum2lavi: bucket is versioned: false
2024/12/10 01:20:55 DEBUG : Waiting for deletions to finish
2024/12/10 01:20:55 DEBUG : "測試_Русский___ě_áñ測試_Русский___ě_áñ" version false
2024/12/10 01:20:55 DEBUG : 測試_Русский___ě_áñ測試_Русский___ě_áñ: Deleting (id "")
2024/12/10 01:20:55 INFO : 測試_Русский___ě_áñ測試_Русский___ě_áñ: Deleted
2024/12/10 01:20:56 INFO : S3 bucket rclone-test-nijogum2lavi: Bucket "rclone-test-nijogum2lavi" deleted
2024/12/10 01:20:56 DEBUG : S3 bucket rclone-test-wosatob6pamu: Purge remote
2024/12/10 01:20:56 DEBUG : S3 bucket rclone-test-wosatob6pamu: bucket is versioned: false
2024/12/10 01:20:56 DEBUG : Waiting for deletions to finish
2024/12/10 01:20:56 DEBUG : "測試_Русский___ě_áñ" version false
2024/12/10 01:20:56 DEBUG : 測試_Русский___ě_áñ: Deleting (id "")
2024/12/10 01:20:56 INFO : 測試_Русский___ě_áñ: Deleted
2024/12/10 01:20:57 INFO : S3 bucket rclone-test-wosatob6pamu: Bucket "rclone-test-wosatob6pamu" deleted
2024/12/10 01:20:57 DEBUG : S3 bucket rclone-test-ruqeraj6zobu: Purge remote
2024/12/10 01:20:57 DEBUG : S3 bucket rclone-test-ruqeraj6zobu: bucket is versioned: false
2024/12/10 01:20:57 DEBUG : Waiting for deletions to finish
2024/12/10 01:20:57 DEBUG : "測試_Русский___ě_áñ" version false
2024/12/10 01:20:57 DEBUG : 測試_Русский___ě_áñ: Deleting (id "")
2024/12/10 01:20:57 INFO : 測試_Русский___ě_áñ: Deleted
2024/12/10 01:20:58 INFO : S3 bucket rclone-test-ruqeraj6zobu: Bucket "rclone-test-ruqeraj6zobu" deleted
2024/12/10 01:20:58 DEBUG : S3 bucket rclone-test-rodomap5yeqo: Purge remote
2024/12/10 01:20:58 DEBUG : S3 bucket rclone-test-rodomap5yeqo: bucket is versioned: false
2024/12/10 01:20:58 DEBUG : Waiting for deletions to finish
2024/12/10 01:20:58 DEBUG : "HELLO, WORLD!" version false
2024/12/10 01:20:58 DEBUG : HELLO, WORLD!: Deleting (id "")
2024/12/10 01:20:58 INFO : HELLO, WORLD!: Deleted
2024/12/10 01:20:59 INFO : S3 bucket rclone-test-rodomap5yeqo: Bucket "rclone-test-rodomap5yeqo" deleted
2024/12/10 01:20:59 DEBUG : S3 bucket rclone-test-xupevij4rase: Purge remote
2024/12/10 01:20:59 DEBUG : S3 bucket rclone-test-xupevij4rase: bucket is versioned: false
2024/12/10 01:20:59 DEBUG : Waiting for deletions to finish
2024/12/10 01:20:59 DEBUG : "HeLlO, wOrLd!" version false
2024/12/10 01:20:59 DEBUG : HeLlO, wOrLd!: Deleting (id "")
2024/12/10 01:20:59 INFO : HeLlO, wOrLd!: Deleted
2024/12/10 01:21:00 INFO : S3 bucket rclone-test-xupevij4rase: Bucket "rclone-test-xupevij4rase" deleted
2024/12/10 01:21:00 DEBUG : S3 bucket rclone-test-noviqic8rucu: Purge remote
2024/12/10 01:21:00 DEBUG : S3 bucket rclone-test-noviqic8rucu: bucket is versioned: false
2024/12/10 01:21:00 DEBUG : Waiting for deletions to finish
2024/12/10 01:21:00 DEBUG : "HELLO, WORLD!" version false
2024/12/10 01:21:00 DEBUG : HELLO, WORLD!: Deleting (id "")
2024/12/10 01:21:00 INFO : HELLO, WORLD!: Deleted
2024/12/10 01:21:01 INFO : S3 bucket rclone-test-noviqic8rucu: Bucket "rclone-test-noviqic8rucu" deleted
2024/12/10 01:21:01 DEBUG : S3 bucket rclone-test-jusipaz2qube: Purge remote
2024/12/10 01:21:01 DEBUG : S3 bucket rclone-test-jusipaz2qube: bucket is versioned: false
2024/12/10 01:21:01 DEBUG : Waiting for deletions to finish
2024/12/10 01:21:01 DEBUG : "hello, world!" version false
2024/12/10 01:21:01 DEBUG : hello, world!: Deleting (id "")
2024/12/10 01:21:01 INFO : hello, world!: Deleted
2024/12/10 01:21:02 INFO : S3 bucket rclone-test-jusipaz2qube: Bucket "rclone-test-jusipaz2qube" deleted
--- PASS: TestApplyTransforms (24.91s)
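TestApplyTransforms above only matches the "without normalization" cases byte-for-byte, while the "with normalization" cases also match names that differ in case or in Unicode form (NFC vs NFD). A minimal sketch of that kind of normalised comparison, assuming NFC normalisation plus case folding as the transform (illustrative only; rclone's actual matching depends on backend features and is not shown here):

```go
package main

import (
	"fmt"
	"strings"

	"golang.org/x/text/unicode/norm"
)

// namesMatch compares two file names byte-for-byte first, then after
// converting both to NFC and folding case - the "with normalization" idea.
func namesMatch(a, b string) bool {
	if a == b {
		return true
	}
	return strings.EqualFold(norm.NFC.String(a), norm.NFC.String(b))
}

func main() {
	nfc := "é"       // U+00E9, precomposed
	nfd := "e\u0301" // 'e' + combining acute accent
	fmt.Println(nfc == nfd)                            // false - the "without normalization" check fails
	fmt.Println(namesMatch(nfc, nfd))                  // true  - the "with normalization" check matches
	fmt.Println(namesMatch("HELLO, WORLD!", "hello, world!")) // true - case-only difference
}
```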
not found at Destination 2024/12/10 01:21:10 DEBUG : two: Sizes differ (src 5 vs dst 3) 2024/12/10 01:21:10 DEBUG : two: md5 = 2379e4ce8c3380e996ab0509f17069ad OK 2024/12/10 01:21:10 INFO : two: Copied (new) --- PASS: TestCopyFileCompareDest (5.58s) === RUN TestCopyFileCopyDest run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:21:11 DEBUG : Creating backend with remote "TestS3R2:rclone-test-nutecod8gavi/dst" 2024/12/10 01:21:11 DEBUG : Creating backend with remote "TestS3R2:rclone-test-nutecod8gavi/CopyDest" 2024/12/10 01:21:11 DEBUG : one: Need to transfer - File not found at Destination 2024/12/10 01:21:12 DEBUG : one: md5 = f97c5d29941bfb1b2fdab0874906ab82 OK 2024/12/10 01:21:12 INFO : one: Copied (new) 2024/12/10 01:21:12 DEBUG : one: Sizes differ (src 5 vs dst 3) 2024/12/10 01:21:12 DEBUG : one: md5 = 07912d142f5d63ee918b34796b5a2432 OK 2024/12/10 01:21:12 INFO : one: Copied (replaced existing) 2024/12/10 01:21:13 DEBUG : Creating backend with remote "TestS3R2:rclone-test-nutecod8gavi/BackupDir" 2024/12/10 01:21:14 DEBUG : one: Sizes differ (src 5 vs dst 3) 2024/12/10 01:21:14 DEBUG : one: Size and modification time the same (differ by 0s, within tolerance 1ns) 2024/12/10 01:21:14 DEBUG : one: Sizes differ (src 5 vs dst 3) 2024/12/10 01:21:14 DEBUG : one: md5 = f97c5d29941bfb1b2fdab0874906ab82 OK 2024/12/10 01:21:14 INFO : one: Copied (server-side copy) 2024/12/10 01:21:14 INFO : one: Deleted 2024/12/10 01:21:15 DEBUG : one: md5 = 07912d142f5d63ee918b34796b5a2432 OK 2024/12/10 01:21:15 INFO : one: Copied (server-side copy) 2024/12/10 01:21:15 DEBUG : one: Destination found in --copy-dest, using server-side copy 2024/12/10 01:21:16 DEBUG : two: Need to transfer - File not found at Destination 2024/12/10 01:21:16 DEBUG : two: Size and modification time the same (differ by 0s, within tolerance 1ns) 2024/12/10 01:21:16 DEBUG : two: md5 = b8a9f715dbb64fd5c56e7783c6820a61 OK 2024/12/10 01:21:16 INFO : two: Copied (server-side copy) 2024/12/10 01:21:16 DEBUG : two: Destination found in --copy-dest, using server-side copy 2024/12/10 01:21:17 DEBUG : two: Size and modification time the same (differ by 0s, within tolerance 1ns) 2024/12/10 01:21:17 DEBUG : two: Unchanged skipping 2024/12/10 01:21:18 DEBUG : three: Need to transfer - File not found at Destination 2024/12/10 01:21:18 DEBUG : three: Sizes differ (src 7 vs dst 5) 2024/12/10 01:21:18 DEBUG : three: Destination not found in --copy-dest 2024/12/10 01:21:18 DEBUG : three: md5 = 1bccb9dccb3e9f6a3f9d2a8bdb54b7f5 OK 2024/12/10 01:21:18 INFO : three: Copied (new) --- PASS: TestCopyFileCopyDest (8.55s) === RUN TestCopyInplace run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" copy_test.go:370: Partial uploads not supported --- SKIP: TestCopyInplace (0.12s) === RUN TestCopyLongFileName run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" copy_test.go:403: Partial uploads not supported --- SKIP: TestCopyLongFileName (0.15s) === RUN TestCopyFileMaxTransfer run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:21:20 DEBUG : TestCopyFileMaxTransfer/file1: Need to transfer - File not found at Destination 2024/12/10 01:21:20 DEBUG : TestCopyFileMaxTransfer/file1: md5 = 0ef726ce9b1a7692357ff70dd321d595 OK 2024/12/10 
01:21:20 INFO : TestCopyFileMaxTransfer/file1: Copied (new) 2024/12/10 01:21:20 DEBUG : TestCopyFileMaxTransfer/file2: Need to transfer - File not found at Destination 2024/12/10 01:21:20 ERROR : TestCopyFileMaxTransfer/file2: Failed to copy: operation error S3: PutObject, exceeded maximum number of attempts, 1, https response error StatusCode: 0, RequestID: , HostID: , request send failed, Put "https://14aad7c9ed489151b51557e321b246cf.r2.cloudflarestorage.com/rclone-test-nutecod8gavi/TestCopyFileMaxTransfer/file2?x-id=PutObject": max transfer limit reached as set by --max-transfer 2024/12/10 01:21:21 DEBUG : TestCopyFileMaxTransfer/file3: Need to transfer - File not found at Destination 2024/12/10 01:21:21 DEBUG : TestCopyFileMaxTransfer/file4: Need to transfer - File not found at Destination 2024/12/10 01:21:21 DEBUG : TestCopyFileMaxTransfer/file4: md5 = f515873e16cbca09da05454da3dccf62 OK 2024/12/10 01:21:21 INFO : TestCopyFileMaxTransfer/file4: Copied (new) --- PASS: TestCopyFileMaxTransfer (1.84s) === RUN TestDeduplicateInteractive run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" dedupe_test.go:24: Can't test deduplicate - no duplicate files possible --- SKIP: TestDeduplicateInteractive (0.12s) === RUN TestDeduplicateSkip run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" dedupe_test.go:24: Can't test deduplicate - no duplicate files possible --- SKIP: TestDeduplicateSkip (0.14s) === RUN TestDeduplicateSizeOnly run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" dedupe_test.go:24: Can't test deduplicate - no duplicate files possible --- SKIP: TestDeduplicateSizeOnly (0.13s) === RUN TestDeduplicateFirst run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" dedupe_test.go:24: Can't test deduplicate - no duplicate files possible --- SKIP: TestDeduplicateFirst (0.14s) === RUN TestDeduplicateNewest run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" dedupe_test.go:24: Can't test deduplicate - no duplicate files possible --- SKIP: TestDeduplicateNewest (0.16s) === RUN TestDeduplicateNewestByHash run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:21:24 INFO : S3 bucket rclone-test-nutecod8gavi: Looking for duplicate md5 hashes using newest mode. 
2024/12/10 01:21:24 NOTICE: dd9947062a360bf86b1209a385dba002: Found 3 files with duplicate md5 hashes 2024/12/10 01:21:24 INFO : one: Deleted 2024/12/10 01:21:24 INFO : also/one: Deleted 2024/12/10 01:21:24 NOTICE: dd9947062a360bf86b1209a385dba002: Deleted 2 extra copies --- PASS: TestDeduplicateNewestByHash (2.37s) === RUN TestDeduplicateOldest run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" dedupe_test.go:24: Can't test deduplicate - no duplicate files possible --- SKIP: TestDeduplicateOldest (0.12s) === RUN TestDeduplicateLargest run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" dedupe_test.go:24: Can't test deduplicate - no duplicate files possible --- SKIP: TestDeduplicateLargest (0.13s) === RUN TestDeduplicateSmallest run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" dedupe_test.go:24: Can't test deduplicate - no duplicate files possible --- SKIP: TestDeduplicateSmallest (0.15s) === RUN TestDeduplicateRename run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" dedupe_test.go:24: Can't test deduplicate - no duplicate files possible --- SKIP: TestDeduplicateRename (0.12s) === RUN TestMergeDirs run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" dedupe_test.go:256: Can't merge directories --- SKIP: TestMergeDirs (0.15s) === RUN TestListDirSorted run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:21:28 DEBUG : a.txt: Excluded (Size Filter) 2024/12/10 01:21:28 DEBUG : a.txt: Excluded 2024/12/10 01:21:28 DEBUG : sub dir/hello world2: Excluded (Size Filter) 2024/12/10 01:21:28 DEBUG : sub dir/hello world2: Excluded 2024/12/10 01:21:28 DEBUG : sub dir/hello world: Excluded (Size Filter) 2024/12/10 01:21:28 DEBUG : sub dir/hello world: Excluded 2024/12/10 01:21:28 DEBUG : sub dir/ignore dir: Excluded 2024/12/10 01:21:28 DEBUG : sub dir/hello world2: Excluded (Size Filter) 2024/12/10 01:21:28 DEBUG : sub dir/hello world2: Excluded 2024/12/10 01:21:28 DEBUG : sub dir/hello world: Excluded (Size Filter) 2024/12/10 01:21:28 DEBUG : sub dir/hello world: Excluded 2024/12/10 01:21:28 DEBUG : sub dir/ignore dir: Excluded --- PASS: TestListDirSorted (3.94s) === RUN TestListJSON run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" === RUN TestListJSON/Default === RUN TestListJSON/FilesOnly === RUN TestListJSON/DirsOnly === RUN TestListJSON/Recurse === RUN TestListJSON/SubDir === RUN TestListJSON/NoModTime === RUN TestListJSON/NoMimeType === RUN TestListJSON/ShowHash === RUN TestListJSON/HashTypes === RUN TestListJSON/Metadata --- PASS: TestListJSON (2.24s) --- PASS: TestListJSON/Default (0.15s) --- PASS: TestListJSON/FilesOnly (0.13s) --- PASS: TestListJSON/DirsOnly (0.06s) --- PASS: TestListJSON/Recurse (0.18s) --- PASS: TestListJSON/SubDir (0.14s) --- PASS: TestListJSON/NoModTime (0.12s) --- PASS: TestListJSON/NoMimeType (0.13s) --- PASS: TestListJSON/ShowHash (0.15s) --- PASS: TestListJSON/HashTypes (0.12s) --- PASS: TestListJSON/Metadata (0.12s) === RUN TestStatJSON run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at 
/tmp/rclone2474542083", Modify Window "1ns" === RUN TestStatJSON/Root === RUN TestStatJSON/RootFilesOnly === RUN TestStatJSON/RootDirsOnly === RUN TestStatJSON/Dir === RUN TestStatJSON/DirWithTrailingSlash === RUN TestStatJSON/File === RUN TestStatJSON/NotFound === RUN TestStatJSON/DirFilesOnly === RUN TestStatJSON/FileFilesOnly === RUN TestStatJSON/NotFoundFilesOnly === RUN TestStatJSON/DirDirsOnly === RUN TestStatJSON/FileDirsOnly === RUN TestStatJSON/NotFoundDirsOnly === RUN TestStatJSON/RootNotFound 2024/12/10 01:21:33 DEBUG : Creating backend with remote "TestS3R2:rclone-test-nutecod8gavi/notfound" --- PASS: TestStatJSON (2.12s) --- PASS: TestStatJSON/Root (0.06s) --- PASS: TestStatJSON/RootFilesOnly (0.00s) --- PASS: TestStatJSON/RootDirsOnly (0.07s) --- PASS: TestStatJSON/Dir (0.13s) --- PASS: TestStatJSON/DirWithTrailingSlash (0.07s) --- PASS: TestStatJSON/File (0.07s) --- PASS: TestStatJSON/NotFound (0.14s) --- PASS: TestStatJSON/DirFilesOnly (0.06s) --- PASS: TestStatJSON/FileFilesOnly (0.06s) --- PASS: TestStatJSON/NotFoundFilesOnly (0.07s) --- PASS: TestStatJSON/DirDirsOnly (0.06s) --- PASS: TestStatJSON/FileDirsOnly (0.07s) --- PASS: TestStatJSON/NotFoundDirsOnly (0.06s) --- PASS: TestStatJSON/RootNotFound (0.19s) === RUN TestMkdir run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:21:34 DEBUG : S3 bucket rclone-test-nutecod8gavi: Making directory 2024/12/10 01:21:34 DEBUG : S3 bucket rclone-test-nutecod8gavi: Making directory --- PASS: TestMkdir (0.22s) === RUN TestLsd run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" --- PASS: TestLsd (0.75s) === RUN TestLs run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" --- PASS: TestLs (1.17s) === RUN TestLsWithFilesFrom run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:21:37 DEBUG : empty space: Excluded (FilesFrom Filter) --- PASS: TestLsWithFilesFrom (1.11s) === RUN TestLsLong run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" --- PASS: TestLsLong (1.20s) === RUN TestHashSums run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" === RUN TestHashSums/Md5 === RUN TestHashSums/Md5Download --- PASS: TestHashSums (1.26s) --- PASS: TestHashSums/Md5 (0.06s) --- PASS: TestHashSums/Md5Download (0.20s) === RUN TestHashSumsWithErrors 2024/12/10 01:21:39 DEBUG : Creating backend with remote ":memory:" 2024/12/10 01:21:39 ERROR : file1: hash unsupported: hash type not supported --- PASS: TestHashSumsWithErrors (0.00s) === RUN TestHashStream 2024/12/10 01:21:39 DEBUG : Creating md5 hash of 0 bytes read from input stream 2024/12/10 01:21:39 DEBUG : Creating md5 hash of 0 bytes read from input stream 2024/12/10 01:21:39 DEBUG : Creating sha1 hash of 0 bytes read from input stream 2024/12/10 01:21:39 DEBUG : Creating sha1 hash of 0 bytes read from input stream 2024/12/10 01:21:39 DEBUG : Creating md5 hash of 12 bytes read from input stream 2024/12/10 01:21:39 DEBUG : Creating md5 hash of 12 bytes read from input stream 2024/12/10 01:21:39 DEBUG : Creating sha1 hash of 12 bytes read from input stream 2024/12/10 01:21:39 DEBUG : Creating sha1 hash of 12 bytes read 
from input stream --- PASS: TestHashStream (0.00s) === RUN TestSuffixName --- PASS: TestSuffixName (0.00s) === RUN TestCount run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" --- PASS: TestCount (1.66s) === RUN TestDelete run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:21:42 DEBUG : Waiting for deletions to finish 2024/12/10 01:21:42 DEBUG : large: Excluded (Size Filter) 2024/12/10 01:21:42 INFO : small: Deleted 2024/12/10 01:21:42 INFO : medium: Deleted --- PASS: TestDelete (1.82s) === RUN TestMaxDelete run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:21:44 DEBUG : Waiting for deletions to finish 2024/12/10 01:21:44 ERROR : small: Got fatal error on delete: --max-delete threshold reached 2024/12/10 01:21:44 INFO : large: Deleted 2024/12/10 01:21:44 INFO : medium: Deleted --- PASS: TestMaxDelete (1.67s) === RUN TestMaxDeleteSizeLargeFile run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:21:46 DEBUG : Waiting for deletions to finish 2024/12/10 01:21:46 ERROR : large: Got fatal error on delete: --max-delete-size threshold reached 2024/12/10 01:21:46 INFO : medium: Deleted 2024/12/10 01:21:46 INFO : small: Deleted --- PASS: TestMaxDeleteSizeLargeFile (1.91s) === RUN TestMaxDeleteSize run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:21:47 DEBUG : Waiting for deletions to finish 2024/12/10 01:21:48 ERROR : small: Got fatal error on delete: --max-delete-size threshold reached 2024/12/10 01:21:48 INFO : medium: Deleted 2024/12/10 01:21:48 INFO : large: Deleted --- PASS: TestMaxDeleteSize (1.57s) === RUN TestReadFile run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" --- PASS: TestReadFile (0.91s) === RUN TestRetry 2024/12/10 01:21:49 DEBUG : Received error: Wrapped EOF is retriable: EOF - low level retry 1/5 2024/12/10 01:21:49 DEBUG : Received error: Wrapped EOF is retriable: EOF - low level retry 2/5 2024/12/10 01:21:49 DEBUG : Sleeping for 10ms (as indicated by the server) to obey Retry-After error: BANG 2024/12/10 01:21:49 DEBUG : Sleeping for 10ms (as indicated by the server) to obey Retry-After error: BANG 2024/12/10 01:21:49 DEBUG : Sleeping for 10ms (as indicated by the server) to obey Retry-After error: BANG 2024/12/10 01:21:49 DEBUG : Sleeping for 10ms (as indicated by the server) to obey Retry-After error: BANG 2024/12/10 01:21:49 DEBUG : Sleeping for 10ms (as indicated by the server) to obey Retry-After error: BANG --- PASS: TestRetry (0.05s) === RUN TestCat run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" --- PASS: TestCat (2.94s) === RUN TestPurge 2024/12/10 01:21:52 DEBUG : Creating backend with remote "TestS3R2:rclone-test-peyinit8yida" 2024/12/10 01:21:52 DEBUG : Using config file from "/home/rclone/.rclone.conf" 2024/12/10 01:21:52 DEBUG : Creating backend with remote "/tmp/rclone3116268994" run.go:180: Remote "S3 bucket rclone-test-peyinit8yida", Local "Local file system at /tmp/rclone3116268994", Modify Window "1ns" 2024/12/10 01:21:53 INFO : S3 bucket 
rclone-test-peyinit8yida: Bucket "rclone-test-peyinit8yida" created with ACL "" 2024/12/10 01:21:53 DEBUG : A2: Making directory 2024/12/10 01:21:53 DEBUG : A1/B2: Making directory 2024/12/10 01:21:53 DEBUG : A1/B2/C2: Making directory 2024/12/10 01:21:53 DEBUG : A1/B1/C3: Making directory 2024/12/10 01:21:53 DEBUG : A3: Making directory 2024/12/10 01:21:53 DEBUG : A3/B3: Making directory 2024/12/10 01:21:53 DEBUG : A3/B3/C4: Making directory fstest.go:244: Filtering empty directory "A2" fstest.go:244: Filtering empty directory "A1/B2" fstest.go:244: Filtering empty directory "A1/B2/C2" fstest.go:244: Filtering empty directory "A1/B1/C3" fstest.go:244: Filtering empty directory "A3" fstest.go:244: Filtering empty directory "A3/B3" fstest.go:244: Filtering empty directory "A3/B3/C4" 2024/12/10 01:21:54 DEBUG : S3 bucket rclone-test-peyinit8yida: bucket is versioned: false 2024/12/10 01:21:54 DEBUG : Waiting for deletions to finish 2024/12/10 01:21:54 DEBUG : "A1/B1/C1/one" version false 2024/12/10 01:21:54 DEBUG : A1/B1/C1/one: Deleting (id "") 2024/12/10 01:21:54 INFO : A1/B1/C1/one: Deleted fstest.go:244: Filtering empty directory "A2" fstest.go:244: Filtering empty directory "A1/B2" fstest.go:244: Filtering empty directory "A1/B2/C2" fstest.go:244: Filtering empty directory "A3" fstest.go:244: Filtering empty directory "A3/B3" fstest.go:244: Filtering empty directory "A3/B3/C4" 2024/12/10 01:21:54 DEBUG : Waiting for deletions to finish 2024/12/10 01:21:54 DEBUG : "A1/two" version false 2024/12/10 01:21:54 DEBUG : A1/two: Deleting (id "") 2024/12/10 01:21:54 INFO : A1/two: Deleted 2024/12/10 01:21:55 INFO : S3 bucket rclone-test-peyinit8yida: Bucket "rclone-test-peyinit8yida" deleted 2024/12/10 01:21:55 DEBUG : S3 bucket rclone-test-peyinit8yida: Purge remote 2024/12/10 01:21:55 DEBUG : Waiting for deletions to finish 2024/12/10 01:21:55 NOTICE: purge failed: directory not found --- PASS: TestPurge (3.39s) === RUN TestRmdirsNoLeaveRoot run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:21:56 DEBUG : A2: Making directory 2024/12/10 01:21:56 DEBUG : A1/B2: Making directory 2024/12/10 01:21:56 DEBUG : A1/B2/C2: Making directory 2024/12/10 01:21:56 DEBUG : A1/B1/C3: Making directory 2024/12/10 01:21:56 DEBUG : A3: Making directory 2024/12/10 01:21:56 DEBUG : A3/B3: Making directory 2024/12/10 01:21:56 DEBUG : A3/B3/C4: Making directory fstest.go:244: Filtering empty directory "A2" fstest.go:244: Filtering empty directory "A1/B2" fstest.go:244: Filtering empty directory "A1/B2/C2" fstest.go:244: Filtering empty directory "A1/B1/C3" fstest.go:244: Filtering empty directory "A3" fstest.go:244: Filtering empty directory "A3/B3" fstest.go:244: Filtering empty directory "A3/B3/C4" 2024/12/10 01:21:56 DEBUG : removing 1 level 3 directories 2024/12/10 01:21:56 INFO : A3/B3/C4: Removing directory fstest.go:244: Filtering empty directory "A2" fstest.go:244: Filtering empty directory "A1/B2" fstest.go:244: Filtering empty directory "A1/B2/C2" fstest.go:244: Filtering empty directory "A1/B1/C3" fstest.go:244: Filtering empty directory "A3" fstest.go:244: Filtering empty directory "A3/B3" 2024/12/10 01:21:57 DEBUG : removing 1 level 0 directories 2024/12/10 01:21:57 INFO : S3 bucket rclone-test-nutecod8gavi: Removing directory 2024/12/10 01:21:58 INFO : S3 bucket rclone-test-nutecod8gavi: Bucket "rclone-test-nutecod8gavi" deleted --- PASS: TestRmdirsNoLeaveRoot (2.71s) === RUN TestRmdirsLeaveRoot run.go:180: 
Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:21:58 INFO : S3 bucket rclone-test-nutecod8gavi: Bucket "rclone-test-nutecod8gavi" created with ACL "" 2024/12/10 01:21:58 DEBUG : A1: Making directory 2024/12/10 01:21:58 DEBUG : A1/B1: Making directory 2024/12/10 01:21:58 DEBUG : A1/B1/C1: Making directory fstest.go:244: Filtering empty directory "A1" fstest.go:244: Filtering empty directory "A1/B1" fstest.go:244: Filtering empty directory "A1/B1/C1" fstest.go:244: Filtering empty directory "A1" --- PASS: TestRmdirsLeaveRoot (0.67s) === RUN TestRmdirsWithFilter run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:21:59 DEBUG : A1: Making directory 2024/12/10 01:21:59 DEBUG : A1/B1: Making directory 2024/12/10 01:21:59 DEBUG : A1/B1/C1: Making directory fstest.go:244: Filtering empty directory "A1" fstest.go:244: Filtering empty directory "A1/B1" fstest.go:244: Filtering empty directory "A1/B1/C1" fstest.go:244: Filtering empty directory "A1" --- PASS: TestRmdirsWithFilter (0.30s) === RUN TestCopyURL run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:21:59 ERROR : file1: Post request put error: operation error S3: PutObject, https response error StatusCode: 409, RequestID: , HostID: , api error NotImplemented: Bucket recreated during request lifetime. Please try your request again. operations_test.go:843: Error Trace: /home/rclone/go/src/github.com/rclone/rclone/fs/operations/operations_test.go:843 Error: Received unexpected error: operation error S3: PutObject, https response error StatusCode: 409, RequestID: , HostID: , api error NotImplemented: Bucket recreated during request lifetime. Please try your request again. Test: TestCopyURL --- FAIL: TestCopyURL (0.38s) === RUN TestCopyURLToWriter --- PASS: TestCopyURLToWriter (0.00s) === RUN TestMoveFile run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:21:59 DEBUG : file1: Need to transfer - File not found at Destination 2024/12/10 01:22:00 ERROR : file1: Failed to copy: operation error S3: PutObject, https response error StatusCode: 409, RequestID: , HostID: , api error NotImplemented: Bucket recreated during request lifetime. Please try your request again. 2024/12/10 01:22:00 ERROR : file1: Not deleting source as copy failed: operation error S3: PutObject, https response error StatusCode: 409, RequestID: , HostID: , api error NotImplemented: Bucket recreated during request lifetime. Please try your request again. operations_test.go:940: Error Trace: /home/rclone/go/src/github.com/rclone/rclone/fs/operations/operations_test.go:940 Error: Received unexpected error: operation error S3: PutObject, https response error StatusCode: 409, RequestID: , HostID: , api error NotImplemented: Bucket recreated during request lifetime. Please try your request again. 
Test: TestMoveFile --- FAIL: TestMoveFile (0.36s) === RUN TestMoveFileWithIgnoreExisting run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:22:00 DEBUG : file1: Need to transfer - File not found at Destination 2024/12/10 01:22:00 DEBUG : file1: md5 = 0ef726ce9b1a7692357ff70dd321d595 OK 2024/12/10 01:22:00 INFO : file1: Copied (new) 2024/12/10 01:22:00 INFO : file1: Deleted 2024/12/10 01:22:00 DEBUG : file1: Destination exists, skipping 2024/12/10 01:22:00 DEBUG : file1: Not removing source file as destination file exists and --ignore-existing is set --- PASS: TestMoveFileWithIgnoreExisting (0.85s) === RUN TestCaseInsensitiveMoveFile run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" --- PASS: TestCaseInsensitiveMoveFile (0.14s) === RUN TestCaseInsensitiveMoveFileDryRun run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" --- PASS: TestCaseInsensitiveMoveFileDryRun (0.17s) === RUN TestMoveFileBackupDir run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:22:01 DEBUG : Creating backend with remote "TestS3R2:rclone-test-nutecod8gavi/backup" 2024/12/10 01:22:01 DEBUG : dst/file1: Sizes differ (src 14 vs dst 18) 2024/12/10 01:22:02 DEBUG : dst/file1: md5 = 05164b153084ba910184c26e561a7c18 OK 2024/12/10 01:22:02 INFO : dst/file1: Copied (server-side copy) 2024/12/10 01:22:02 INFO : dst/file1: Deleted 2024/12/10 01:22:02 ERROR : dst/file1: Failed to copy: operation error S3: PutObject, https response error StatusCode: 409, RequestID: , HostID: , api error NotImplemented: Bucket recreated during request lifetime. Please try your request again. 2024/12/10 01:22:02 ERROR : dst/file1: Not deleting source as copy failed: operation error S3: PutObject, https response error StatusCode: 409, RequestID: , HostID: , api error NotImplemented: Bucket recreated during request lifetime. Please try your request again. operations_test.go:1052: Error Trace: /home/rclone/go/src/github.com/rclone/rclone/fs/operations/operations_test.go:1052 Error: Received unexpected error: operation error S3: PutObject, https response error StatusCode: 409, RequestID: , HostID: , api error NotImplemented: Bucket recreated during request lifetime. Please try your request again. 
Test: TestMoveFileBackupDir --- FAIL: TestMoveFileBackupDir (1.72s) === RUN TestSameConfig --- PASS: TestSameConfig (0.00s) === RUN TestSame --- PASS: TestSame (0.00s) === RUN TestOverlappingFilterCheckWithoutFilter --- PASS: TestOverlappingFilterCheckWithoutFilter (0.00s) === RUN TestOverlappingFilterCheckWithFilter --- PASS: TestOverlappingFilterCheckWithFilter (0.00s) === RUN TestListFormat --- PASS: TestListFormat (0.00s) === RUN TestDirMove run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:22:04 DEBUG : A1/B2: Making directory 2024/12/10 01:22:04 DEBUG : A1/B1/C3: Making directory fstest.go:244: Filtering empty directory "A1/B2" fstest.go:244: Filtering empty directory "A1/B1/C3" 2024/12/10 01:22:05 DEBUG : A1/B1/C2/five: md5 = 30056e1cab7a61d256fc8edd970d14f5 OK 2024/12/10 01:22:05 INFO : A1/B1/C2/five: Copied (server-side copy) to: A2/B1/C2/five 2024/12/10 01:22:05 DEBUG : A1/two: md5 = b8a9f715dbb64fd5c56e7783c6820a61 OK 2024/12/10 01:22:05 INFO : A1/two: Copied (server-side copy) to: A2/two 2024/12/10 01:22:05 DEBUG : A1/B1/three: md5 = 35d6d33467aae9a2e3dccb4b6b027878 OK 2024/12/10 01:22:05 INFO : A1/B1/three: Copied (server-side copy) to: A2/B1/three 2024/12/10 01:22:05 INFO : A1/B1/C2/five: Deleted 2024/12/10 01:22:05 DEBUG : A1/one: md5 = f97c5d29941bfb1b2fdab0874906ab82 OK 2024/12/10 01:22:05 INFO : A1/one: Copied (server-side copy) to: A2/one 2024/12/10 01:22:05 INFO : A1/B1/three: Deleted 2024/12/10 01:22:05 INFO : A1/two: Deleted 2024/12/10 01:22:05 INFO : A1/one: Deleted 2024/12/10 01:22:05 DEBUG : A1/B1/C1/four: md5 = 8cbad96aced40b3838dd9f07f6ef5772 OK 2024/12/10 01:22:05 INFO : A1/B1/C1/four: Copied (server-side copy) to: A2/B1/C1/four 2024/12/10 01:22:06 INFO : A1/B1/C1/four: Deleted fstest.go:244: Filtering empty directory "A2/B2" fstest.go:244: Filtering empty directory "A2/B1/C3" 2024/12/10 01:22:07 DEBUG : A2/B1/C1/four: md5 = 8cbad96aced40b3838dd9f07f6ef5772 OK 2024/12/10 01:22:07 INFO : A2/B1/C1/four: Copied (server-side copy) to: A3/B1/C1/four 2024/12/10 01:22:07 DEBUG : A2/B1/three: md5 = 35d6d33467aae9a2e3dccb4b6b027878 OK 2024/12/10 01:22:07 INFO : A2/B1/three: Copied (server-side copy) to: A3/B1/three 2024/12/10 01:22:07 DEBUG : A2/B1/C2/five: md5 = 30056e1cab7a61d256fc8edd970d14f5 OK 2024/12/10 01:22:07 INFO : A2/B1/C2/five: Copied (server-side copy) to: A3/B1/C2/five 2024/12/10 01:22:07 INFO : A2/B1/C1/four: Deleted 2024/12/10 01:22:07 INFO : A2/B1/three: Deleted 2024/12/10 01:22:07 DEBUG : A2/two: md5 = b8a9f715dbb64fd5c56e7783c6820a61 OK 2024/12/10 01:22:07 INFO : A2/two: Copied (server-side copy) to: A3/two 2024/12/10 01:22:07 INFO : A2/B1/C2/five: Deleted 2024/12/10 01:22:07 INFO : A2/two: Deleted 2024/12/10 01:22:07 DEBUG : A2/one: md5 = f97c5d29941bfb1b2fdab0874906ab82 OK 2024/12/10 01:22:07 INFO : A2/one: Copied (server-side copy) to: A3/one 2024/12/10 01:22:07 INFO : A2/one: Deleted fstest.go:244: Filtering empty directory "A3/B2" fstest.go:244: Filtering empty directory "A3/B1/C3" 2024/12/10 01:22:08 INFO : S3 bucket rclone-test-nutecod8gavi: Can't DirMove - falling back to file moves: can't move directory - incompatible remotes 2024/12/10 01:22:08 DEBUG : A3/two: md5 = b8a9f715dbb64fd5c56e7783c6820a61 OK 2024/12/10 01:22:08 INFO : A3/two: Copied (server-side copy) to: A4/two 2024/12/10 01:22:08 DEBUG : A3/B1/C2/five: md5 = 30056e1cab7a61d256fc8edd970d14f5 OK 2024/12/10 01:22:08 INFO : A3/B1/C2/five: Copied (server-side copy) to: A4/B1/C2/five 2024/12/10 
01:22:08 DEBUG : A3/B1/three: md5 = 35d6d33467aae9a2e3dccb4b6b027878 OK 2024/12/10 01:22:08 INFO : A3/B1/three: Copied (server-side copy) to: A4/B1/three 2024/12/10 01:22:08 INFO : A3/two: Deleted 2024/12/10 01:22:08 INFO : A3/B1/C2/five: Deleted 2024/12/10 01:22:08 DEBUG : A3/B1/C1/four: md5 = 8cbad96aced40b3838dd9f07f6ef5772 OK 2024/12/10 01:22:08 INFO : A3/B1/C1/four: Copied (server-side copy) to: A4/B1/C1/four 2024/12/10 01:22:09 INFO : A3/B1/three: Deleted 2024/12/10 01:22:09 INFO : A3/B1/C1/four: Deleted 2024/12/10 01:22:10 DEBUG : A3/one: md5 = f97c5d29941bfb1b2fdab0874906ab82 OK 2024/12/10 01:22:10 INFO : A3/one: Copied (server-side copy) to: A4/one 2024/12/10 01:22:10 INFO : A3/one: Deleted fstest.go:244: Filtering empty directory "A4/B2" fstest.go:244: Filtering empty directory "A4/B1/C3" --- PASS: TestDirMove (8.55s) === RUN TestGetFsInfo run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" --- PASS: TestGetFsInfo (0.19s) === RUN TestRcat === RUN TestRcat/withChecksum=false,ignoreChecksum=false run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:22:11 DEBUG : S3 bucket rclone-test-nutecod8gavi: File to upload is small (34 bytes), uploading instead of streaming 2024/12/10 01:22:12 DEBUG : no_checksum_small_file_from_pipe: md5 = 4c762bc6ec18f21b23fcae426c7863b5 OK 2024/12/10 01:22:12 DEBUG : no_checksum_small_file_from_pipe: Size and md5 of src and dst objects identical 2024/12/10 01:22:12 NOTICE: S3 bucket rclone-test-nutecod8gavi: Streaming uploads using chunk size 5Mi will have maximum file size of 48.828Gi 2024/12/10 01:22:12 DEBUG : no_checksum_big_file_from_pipe: open chunk writer: started multipart upload: AGRszfi5F-OX_m1y1VKibr2tB7TTbWklc_Jzr54jgEAjJqf5ZNndO_yvrIEMalBV-KSNTD4ZxjM0hAblnp6iWmReIB0QnnMv2BuhNfvEZX3_xFWcD_TKXR2JVsGVIaAueklwjsyPqL9Yu353pNddOFRl4x1r4ZOL7G9jiZEzRuqrBfyzcwAGAcMI7j1DGnQfsjkPF2EaogGb4v0Kgepu7HBmnfTMGxy27R2iZD6Z1tHKTKufyNT3RJYglvVmN8sz4WfigXo8YP7ZUVzDSTSvwKb2_XLf4PextiiohzCEh6dfUIK9Kx2cV2TOkffjkicMGTwM5AL3D2BzQr1iBOHS7gI 2024/12/10 01:22:12 DEBUG : no_checksum_big_file_from_pipe: multipart upload: starting chunk 0 size 100.001Ki offset 0/off 2024/12/10 01:22:12 DEBUG : no_checksum_big_file_from_pipe: multipart upload wrote chunk 1 with 102401 bytes and etag "fffc7956ba9a7b58a63c01b6ce1ddc45" 2024/12/10 01:22:12 DEBUG : no_checksum_big_file_from_pipe: multipart upload "AGRszfi5F-OX_m1y1VKibr2tB7TTbWklc_Jzr54jgEAjJqf5ZNndO_yvrIEMalBV-KSNTD4ZxjM0hAblnp6iWmReIB0QnnMv2BuhNfvEZX3_xFWcD_TKXR2JVsGVIaAueklwjsyPqL9Yu353pNddOFRl4x1r4ZOL7G9jiZEzRuqrBfyzcwAGAcMI7j1DGnQfsjkPF2EaogGb4v0Kgepu7HBmnfTMGxy27R2iZD6Z1tHKTKufyNT3RJYglvVmN8sz4WfigXo8YP7ZUVzDSTSvwKb2_XLf4PextiiohzCEh6dfUIK9Kx2cV2TOkffjkicMGTwM5AL3D2BzQr1iBOHS7gI" finished 2024/12/10 01:22:12 DEBUG : no_checksum_big_file_from_pipe: Dst hash empty - aborting Src hash check 2024/12/10 01:22:12 DEBUG : no_checksum_big_file_from_pipe: Size of src and dst objects identical === RUN TestRcat/withChecksum=true,ignoreChecksum=false run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:22:13 DEBUG : S3 bucket rclone-test-nutecod8gavi: File to upload is small (34 bytes), uploading instead of streaming 2024/12/10 01:22:13 DEBUG : with_checksum_small_file_from_pipe: md5 = 4c762bc6ec18f21b23fcae426c7863b5 OK 2024/12/10 01:22:13 DEBUG : with_checksum_small_file_from_pipe: 
Size and md5 of src and dst objects identical 2024/12/10 01:22:13 DEBUG : with_checksum_big_file_from_pipe: open chunk writer: started multipart upload: AJtWaxhYgZhjae3rbJyUSodSeyrpRE9oe4N2Pv7Mi8oUDR7W7vJn9mrJhj9BaGKowantAa_ckzIg66uuZklkwcs0aPGcvIdhyp-ltJipq8YYC4WIHdJINP4ry3zMHG_w_36iBm_-_Ak5ojRmmwCLD20XhAcFkY30uzBb36LXQHggISyX78JcP1zbkHZetrirb3f4kl1zlX2i1Z1YMwaSY_6iacvwKMfQ3lB5xoKt8PyCRL_mRzwcMktyf1yJLX92usroZfCnJbFGGqtRTe55dheO5_GBazgTCDGvzIbkuzkFyNBTncM8DM4SiZV9vJBrKbc8JHQdazCVWnJuzHxS2Qg 2024/12/10 01:22:13 DEBUG : with_checksum_big_file_from_pipe: multipart upload: starting chunk 0 size 100.001Ki offset 0/off 2024/12/10 01:22:14 DEBUG : with_checksum_big_file_from_pipe: multipart upload wrote chunk 1 with 102401 bytes and etag "fffc7956ba9a7b58a63c01b6ce1ddc45" 2024/12/10 01:22:14 DEBUG : with_checksum_big_file_from_pipe: multipart upload "AJtWaxhYgZhjae3rbJyUSodSeyrpRE9oe4N2Pv7Mi8oUDR7W7vJn9mrJhj9BaGKowantAa_ckzIg66uuZklkwcs0aPGcvIdhyp-ltJipq8YYC4WIHdJINP4ry3zMHG_w_36iBm_-_Ak5ojRmmwCLD20XhAcFkY30uzBb36LXQHggISyX78JcP1zbkHZetrirb3f4kl1zlX2i1Z1YMwaSY_6iacvwKMfQ3lB5xoKt8PyCRL_mRzwcMktyf1yJLX92usroZfCnJbFGGqtRTe55dheO5_GBazgTCDGvzIbkuzkFyNBTncM8DM4SiZV9vJBrKbc8JHQdazCVWnJuzHxS2Qg" finished 2024/12/10 01:22:14 DEBUG : with_checksum_big_file_from_pipe: Dst hash empty - aborting Src hash check 2024/12/10 01:22:14 DEBUG : with_checksum_big_file_from_pipe: Size of src and dst objects identical === RUN TestRcat/withChecksum=false,ignoreChecksum=true run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:22:15 DEBUG : S3 bucket rclone-test-nutecod8gavi: File to upload is small (34 bytes), uploading instead of streaming 2024/12/10 01:22:15 DEBUG : ignore_checksum_small_file_from_pipe: Size and modification time the same (differ by 0s, within tolerance 1ns) 2024/12/10 01:22:15 DEBUG : ignore_checksum_big_file_from_pipe: open chunk writer: started multipart upload: AGYyzrlOfgy_JUXXB41lRq5F2SJjGG-Ac8j3tH0VAEeypqS5tnUffI1NaNtBF5MYoN8CbN7ZQKzoPwrIRQw6-faHLX36tnyvii30rCGNEW_ehZHntQ0__VLnKHFPBMvcHlYKrH-CllFVt6Edq88GrpBz8_OFQabcwD1fjGXfbh_ESoQsMW2XWvOvwElm25M0LXSRFW92xrrJNWqo0Pjs9kE-guaD2sbpScYHyDIddv7X-uZ82nCnwT7Lt-LeyW9uDC5-dBJKxp7gyIkzpyL9uB9O-mooYm41rRGbRU7T4RAtZhUpByXy3I0RXbY9CgLHCMaVvz0qokmqs2fJq9RHpd8 2024/12/10 01:22:15 DEBUG : ignore_checksum_big_file_from_pipe: multipart upload: starting chunk 0 size 100.001Ki offset 0/off 2024/12/10 01:22:15 DEBUG : ignore_checksum_big_file_from_pipe: multipart upload wrote chunk 1 with 102401 bytes and etag "fffc7956ba9a7b58a63c01b6ce1ddc45" 2024/12/10 01:22:16 DEBUG : ignore_checksum_big_file_from_pipe: multipart upload "AGYyzrlOfgy_JUXXB41lRq5F2SJjGG-Ac8j3tH0VAEeypqS5tnUffI1NaNtBF5MYoN8CbN7ZQKzoPwrIRQw6-faHLX36tnyvii30rCGNEW_ehZHntQ0__VLnKHFPBMvcHlYKrH-CllFVt6Edq88GrpBz8_OFQabcwD1fjGXfbh_ESoQsMW2XWvOvwElm25M0LXSRFW92xrrJNWqo0Pjs9kE-guaD2sbpScYHyDIddv7X-uZ82nCnwT7Lt-LeyW9uDC5-dBJKxp7gyIkzpyL9uB9O-mooYm41rRGbRU7T4RAtZhUpByXy3I0RXbY9CgLHCMaVvz0qokmqs2fJq9RHpd8" finished 2024/12/10 01:22:16 DEBUG : ignore_checksum_big_file_from_pipe: Size and modification time the same (differ by 0s, within tolerance 1ns) === RUN TestRcat/withChecksum=true,ignoreChecksum=true run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:22:16 DEBUG : S3 bucket rclone-test-nutecod8gavi: File to upload is small (34 bytes), uploading instead of streaming 2024/12/10 01:22:16 DEBUG : 
ignore_checksum_small_file_from_pipe: Src hash empty - aborting Dst hash check 2024/12/10 01:22:16 DEBUG : ignore_checksum_small_file_from_pipe: Size of src and dst objects identical 2024/12/10 01:22:17 DEBUG : ignore_checksum_big_file_from_pipe: open chunk writer: started multipart upload: AEwXXCxv0nHzKvQpDzltXMdeT9XTU_7JR4cC0tQxQ7BOvIe5Bl1MKNv2U3fxRMhXgrbpXOcZpetovfl2-vO3ckeLyehl1xD0si1CgfLcg6uCrI05VlbDhulgYq37bXcZoe09AirbPsxkiCxbWVkLRYO1pIX5HTId4ZhPGfeUpXCju_XEqi6SHjsg_uxBBB7awBxROEklBEDi2Hj9xUYBTXLc8ZCvMRmvRrOkw1R60Quo4bXOvxTEWpvKwoPw8rHMLbMM3Q8bsfh3OFe68aSG5gcdfyNxestjfXzr01U6grj0XjNV_yKL-VBEbHltguclht-Q2nco71R1ahpzWCLsJ0E 2024/12/10 01:22:17 DEBUG : ignore_checksum_big_file_from_pipe: multipart upload: starting chunk 0 size 100.001Ki offset 0/off 2024/12/10 01:22:17 DEBUG : ignore_checksum_big_file_from_pipe: multipart upload wrote chunk 1 with 102401 bytes and etag "fffc7956ba9a7b58a63c01b6ce1ddc45" 2024/12/10 01:22:17 DEBUG : ignore_checksum_big_file_from_pipe: multipart upload "AEwXXCxv0nHzKvQpDzltXMdeT9XTU_7JR4cC0tQxQ7BOvIe5Bl1MKNv2U3fxRMhXgrbpXOcZpetovfl2-vO3ckeLyehl1xD0si1CgfLcg6uCrI05VlbDhulgYq37bXcZoe09AirbPsxkiCxbWVkLRYO1pIX5HTId4ZhPGfeUpXCju_XEqi6SHjsg_uxBBB7awBxROEklBEDi2Hj9xUYBTXLc8ZCvMRmvRrOkw1R60Quo4bXOvxTEWpvKwoPw8rHMLbMM3Q8bsfh3OFe68aSG5gcdfyNxestjfXzr01U6grj0XjNV_yKL-VBEbHltguclht-Q2nco71R1ahpzWCLsJ0E" finished 2024/12/10 01:22:17 DEBUG : ignore_checksum_big_file_from_pipe: Dst hash empty - aborting Src hash check 2024/12/10 01:22:17 DEBUG : ignore_checksum_big_file_from_pipe: Src hash empty - aborting Dst hash check 2024/12/10 01:22:17 DEBUG : ignore_checksum_big_file_from_pipe: Size of src and dst objects identical --- PASS: TestRcat (6.27s) --- PASS: TestRcat/withChecksum=false,ignoreChecksum=false (1.52s) --- PASS: TestRcat/withChecksum=true,ignoreChecksum=false (1.71s) --- PASS: TestRcat/withChecksum=false,ignoreChecksum=true (1.59s) --- PASS: TestRcat/withChecksum=true,ignoreChecksum=true (1.46s) === RUN TestRcatMetadata run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" === RUN TestRcatMetadata/Normal 2024/12/10 01:22:18 DEBUG : S3 bucket rclone-test-nutecod8gavi: File to upload is small (48 bytes), uploading instead of streaming 2024/12/10 01:22:18 DEBUG : rcat_metadata: md5 = dee10c33b9fe5e2ac4eb8ad8467b7063 OK 2024/12/10 01:22:18 DEBUG : rcat_metadata: Size and md5 of src and dst objects identical === RUN TestRcatMetadata/ViaDisk 2024/12/10 01:22:18 DEBUG : rcat_metadata_uploadcutoff0: open chunk writer: started multipart upload: AFesGE_0JxVymFe60a5tWxcUON_EIQrABph05VPkO1fUOnHbV3KOZ8pYZ2V5fCRlSXkYE-p6-DYtCcXh0One2b6HorHN20KDWY5qiCP7J_JuUPMTvkL3M54n6hy6SFSOunJs8cd4TQ58WYEVEV0p3BUQXWdq_g6kGB5XWvgbGJhN-OzhD0CxodkfA7ToBprwP7kGGTxUn1oSnaIqtUZddthnOToipOe70gkTGmALbW6KyTtp3e1ISm1OLr1ejLa9-HiCod0o5yQ9OMev6y0jn5ccFBzBJVQDDEnIGbnec4iG9vNpIFrlg65VgPgiy-ABhPDmw34tpJtSG53IMkE7wOE 2024/12/10 01:22:18 DEBUG : rcat_metadata_uploadcutoff0: multipart upload: starting chunk 0 size 63 offset 0/off 2024/12/10 01:22:19 DEBUG : rcat_metadata_uploadcutoff0: multipart upload wrote chunk 1 with 63 bytes and etag "51ca8560f35b9b87370a65d72356b86c" 2024/12/10 01:22:19 DEBUG : rcat_metadata_uploadcutoff0: multipart upload 
"AFesGE_0JxVymFe60a5tWxcUON_EIQrABph05VPkO1fUOnHbV3KOZ8pYZ2V5fCRlSXkYE-p6-DYtCcXh0One2b6HorHN20KDWY5qiCP7J_JuUPMTvkL3M54n6hy6SFSOunJs8cd4TQ58WYEVEV0p3BUQXWdq_g6kGB5XWvgbGJhN-OzhD0CxodkfA7ToBprwP7kGGTxUn1oSnaIqtUZddthnOToipOe70gkTGmALbW6KyTtp3e1ISm1OLr1ejLa9-HiCod0o5yQ9OMev6y0jn5ccFBzBJVQDDEnIGbnec4iG9vNpIFrlg65VgPgiy-ABhPDmw34tpJtSG53IMkE7wOE" finished 2024/12/10 01:22:19 DEBUG : rcat_metadata_uploadcutoff0: Dst hash empty - aborting Src hash check 2024/12/10 01:22:19 DEBUG : rcat_metadata_uploadcutoff0: Size of src and dst objects identical --- PASS: TestRcatMetadata (1.80s) --- PASS: TestRcatMetadata/Normal (0.65s) --- PASS: TestRcatMetadata/ViaDisk (0.97s) === RUN TestRcatSize run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:22:20 DEBUG : S3 bucket rclone-test-nutecod8gavi: File to upload is small (60 bytes), uploading instead of streaming 2024/12/10 01:22:20 DEBUG : potato2: md5 = d6548b156ea68a4e003e786df99eee76 OK 2024/12/10 01:22:20 DEBUG : potato2: Size and md5 of src and dst objects identical --- PASS: TestRcatSize (1.05s) === RUN TestRcatSizeMetadata run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:22:21 DEBUG : S3 bucket rclone-test-nutecod8gavi: File to upload is small (60 bytes), uploading instead of streaming 2024/12/10 01:22:21 DEBUG : potato2: md5 = d6548b156ea68a4e003e786df99eee76 OK 2024/12/10 01:22:21 DEBUG : potato2: Size and md5 of src and dst objects identical --- PASS: TestRcatSizeMetadata (1.18s) === RUN TestTouchDir run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" 2024/12/10 01:22:23 DEBUG : S3 bucket rclone-test-nutecod8gavi: Touching "empty space" 2024/12/10 01:22:23 DEBUG : S3 bucket rclone-test-nutecod8gavi: Touching "potato2" 2024/12/10 01:22:23 DEBUG : S3 bucket rclone-test-nutecod8gavi: Touching "sub dir/potato3" --- PASS: TestTouchDir (2.50s) === RUN TestMkdirMetadata run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" operations_test.go:1703: Skipping test as remote does not support MkdirMetadata --- SKIP: TestMkdirMetadata (0.14s) === RUN TestMkdirModTime run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" operations_test.go:1722: Skipping test as remote does not support DirSetModTime or MkdirMetadata --- SKIP: TestMkdirModTime (0.78s) === RUN TestCopyDirMetadata run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" operations_test.go:1743: Skipping test as remote does not support WriteDirMetadata or MkdirMetadata --- SKIP: TestCopyDirMetadata (0.15s) === RUN TestSetDirModTime run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" operations_test.go:1779: Skipping test as remote does not support DirSetModTime or WriteDirSetModTime --- SKIP: TestSetDirModTime (0.12s) === RUN TestDirsEqual run.go:180: Remote "S3 bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" operations_test.go:1823: Skipping test as remote does not support WriteDirMetadata or MkdirMetadata --- SKIP: TestDirsEqual (0.13s) === RUN TestRemoveExisting run.go:180: Remote "S3 
bucket rclone-test-nutecod8gavi", Local "Local file system at /tmp/rclone2474542083", Modify Window "1ns" operations_test.go:1892: Skipping as remote can't Move --- SKIP: TestRemoveExisting (0.13s) === RUN TestRcAbout rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcAbout (0.00s) === RUN TestRcCleanup rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcCleanup (0.00s) === RUN TestRcCopyfile rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcCopyfile (0.00s) === RUN TestRcCopyurl rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcCopyurl (0.00s) === RUN TestRcDelete rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcDelete (0.00s) === RUN TestRcDeletefile rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcDeletefile (0.00s) === RUN TestRcList rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcList (0.00s) === RUN TestRcStat rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcStat (0.00s) === RUN TestRcSetTier rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcSetTier (0.00s) === RUN TestRcSetTierFile rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcSetTierFile (0.00s) === RUN TestRcMkdir rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcMkdir (0.00s) === RUN TestRcMovefile rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcMovefile (0.00s) === RUN TestRcPurge rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcPurge (0.00s) === RUN TestRcRmdir rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcRmdir (0.00s) === RUN TestRcRmdirs rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcRmdirs (0.00s) === RUN TestRcSize rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcSize (0.00s) === RUN TestRcPublicLink rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcPublicLink (0.00s) === RUN TestRcFsInfo rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcFsInfo (0.00s) === RUN TestUploadFile rc_test.go:30: Skipping test on non local remote --- SKIP: TestUploadFile (0.00s) === RUN TestRcCommand rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcCommand (0.00s) === RUN TestRcDu rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcDu (0.00s) === RUN TestRcCheck rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcCheck (0.00s) === RUN TestRcHashsum rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcHashsum (0.00s) === RUN TestRcHashsumFile rc_test.go:30: Skipping test on non local remote --- SKIP: TestRcHashsumFile (0.00s) FAIL 2024/12/10 01:22:26 DEBUG : S3 bucket rclone-test-nutecod8gavi: Purge remote 2024/12/10 01:22:26 DEBUG : S3 bucket rclone-test-nutecod8gavi: bucket is versioned: false 2024/12/10 01:22:26 DEBUG : Waiting for deletions to finish 2024/12/10 01:22:26 INFO : S3 bucket rclone-test-nutecod8gavi: Bucket "rclone-test-nutecod8gavi" deleted "./operations.test -test.v -test.timeout 1h0m0s -remote TestS3R2: -verbose -fast-list" - Finished ERROR in 2m45.630334077s (try 1/5): exit status 1: Failed [TestCopyURL TestMoveFile TestMoveFileBackupDir]
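The three tests in the closing failure list (TestCopyURL, TestMoveFile and TestMoveFileBackupDir) all fail with the same 409 from PutObject: api error NotImplemented: Bucket recreated during request lifetime. Please try your request again. The errors first appear immediately after TestRmdirsNoLeaveRoot deletes bucket "rclone-test-nutecod8gavi" and TestRmdirsLeaveRoot recreates it, which suggests Cloudflare R2 needs a short settling period after a bucket is deleted and recreated under the same name. A minimal Go sketch of a targeted retry for this one transient error follows; it assumes the public aws-sdk-go-v2 and smithy-go packages, and the helper name PutWithBucketRecreateRetry, its retry/backoff policy, and the newInput callback are illustrative assumptions, not rclone's actual error handling.

// Hypothetical sketch (not rclone code): retry PutObject only for the transient
// R2 error "NotImplemented: Bucket recreated during request lifetime".
package r2retry

import (
	"context"
	"errors"
	"strings"
	"time"

	"github.com/aws/aws-sdk-go-v2/service/s3"
	"github.com/aws/smithy-go"
)

// PutWithBucketRecreateRetry calls PutObject up to five times. newInput must
// build a fresh PutObjectInput (with a fresh Body reader) for every attempt,
// because an already-consumed Body cannot simply be resent.
func PutWithBucketRecreateRetry(ctx context.Context, client *s3.Client, newInput func() *s3.PutObjectInput) (*s3.PutObjectOutput, error) {
	var lastErr error
	for attempt := 1; attempt <= 5; attempt++ {
		out, err := client.PutObject(ctx, newInput())
		if err == nil {
			return out, nil
		}
		lastErr = err

		var apiErr smithy.APIError
		// Retry only the specific transient error; any other failure is returned immediately.
		if !errors.As(err, &apiErr) ||
			apiErr.ErrorCode() != "NotImplemented" ||
			!strings.Contains(apiErr.ErrorMessage(), "Bucket recreated during request lifetime") {
			return nil, err
		}

		// Simple linear backoff, honouring context cancellation.
		select {
		case <-ctx.Done():
			return nil, ctx.Err()
		case <-time.After(time.Duration(attempt) * time.Second):
		}
	}
	return nil, lastErr
}

For this run the harness-level retry plays the same role more coarsely: the final line records the attempt as try 1/5, so the whole operations.test binary is rerun after a failing attempt.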