"./operations.test -test.v -test.timeout 1h0m0s -remote TestS3R2: -verbose -fast-list -test.run '^TestCopyURL$|^TestRcat$/^withChecksum=false,ignoreChecksum=false$'" - Starting (try 2/5)
2025/12/27 01:04:01 DEBUG : Creating backend with remote "TestS3R2:rclone-test-gaviduc5lize"
2025/12/27 01:04:01 DEBUG : Using config file from "/home/rclone/.rclone.conf"
2025/12/27 01:04:01 DEBUG : Creating backend with remote "/tmp/rclone775903759"
=== RUN   TestCopyURL
    run.go:185: Remote "S3 bucket rclone-test-gaviduc5lize", Local "Local file system at /tmp/rclone775903759", Modify Window "1ns"
2025/12/27 01:04:02 INFO  : S3 bucket rclone-test-gaviduc5lize: Bucket "rclone-test-gaviduc5lize" created with ACL ""
2025/12/27 01:04:02 DEBUG : filename.txt: File name found in url
2025/12/27 01:04:03 DEBUG : headerfilename.txt: filename found in Content-Disposition header.
--- PASS: TestCopyURL (2.98s)
=== RUN   TestRcat
=== RUN   TestRcat/withChecksum=false,ignoreChecksum=false
    run.go:185: Remote "S3 bucket rclone-test-gaviduc5lize", Local "Local file system at /tmp/rclone775903759", Modify Window "1ns"
2025/12/27 01:04:04 DEBUG : S3 bucket rclone-test-gaviduc5lize: File to upload is small (34 bytes), uploading instead of streaming
2025/12/27 01:04:04 DEBUG : no_checksum_small_file_from_pipe: size = 34 OK
2025/12/27 01:04:04 DEBUG : no_checksum_small_file_from_pipe: md5 = 4c762bc6ec18f21b23fcae426c7863b5 OK
2025/12/27 01:04:04 DEBUG : no_checksum_small_file_from_pipe: Size and md5 of src and dst objects identical
2025/12/27 01:04:04 NOTICE: S3 bucket rclone-test-gaviduc5lize: Streaming uploads using chunk size 5Mi will have maximum file size of 48.828Gi
2025/12/27 01:04:05 DEBUG : no_checksum_big_file_from_pipe: open chunk writer: started multipart upload: ADFJWp3xRoiSnoVaDg3HT2YvdezUAT79-6JBrUh7m-kuUu1oGHh4NdmDCDZKIgLzSNVz9nSxwUCIJqmYjFpbhBTNs8Ojrh8KOT3CEywB-91FQ_gwOau2xdUN8Wx-GqNomY3-1_cFouGS2iJPhY4Nj9JQPDqrKWAkoxlZ3M1DjtGSEBJTqSn-qeLWSmyZYJQPrJIsPV3O0bVuNnbOxYPb6mIiwb_HY2kWdgn-9AsJQY6YllSS4scsMlgbhKembimSaANU2ZXl_mhYIgdEMqsROhGa0aoDVTYHusmSTj5IxEhIbxNQA28lsoZZRdMm-BGlgv6VduhOZHqmQaBAfHPU3-g
2025/12/27 01:04:05 DEBUG : no_checksum_big_file_from_pipe: multipart upload: starting chunk 0 size 100.001Ki offset 0/off
2025/12/27 01:04:05 DEBUG : no_checksum_big_file_from_pipe: multipart upload wrote chunk 1 with 102401 bytes and etag "fffc7956ba9a7b58a63c01b6ce1ddc45"
2025/12/27 01:04:05 DEBUG : no_checksum_big_file_from_pipe: multipart upload "ADFJWp3xRoiSnoVaDg3HT2YvdezUAT79-6JBrUh7m-kuUu1oGHh4NdmDCDZKIgLzSNVz9nSxwUCIJqmYjFpbhBTNs8Ojrh8KOT3CEywB-91FQ_gwOau2xdUN8Wx-GqNomY3-1_cFouGS2iJPhY4Nj9JQPDqrKWAkoxlZ3M1DjtGSEBJTqSn-qeLWSmyZYJQPrJIsPV3O0bVuNnbOxYPb6mIiwb_HY2kWdgn-9AsJQY6YllSS4scsMlgbhKembimSaANU2ZXl_mhYIgdEMqsROhGa0aoDVTYHusmSTj5IxEhIbxNQA28lsoZZRdMm-BGlgv6VduhOZHqmQaBAfHPU3-g" finished
2025/12/27 01:04:05 DEBUG : no_checksum_big_file_from_pipe: size = 102401 OK
2025/12/27 01:04:05 DEBUG : no_checksum_big_file_from_pipe: Dst hash empty - aborting Src hash check
2025/12/27 01:04:05 DEBUG : no_checksum_big_file_from_pipe: Size of src and dst objects identical
--- PASS: TestRcat (1.10s)
    --- PASS: TestRcat/withChecksum=false,ignoreChecksum=false (1.10s)
PASS
2025/12/27 01:04:05 DEBUG : S3 bucket rclone-test-gaviduc5lize: Purge remote
2025/12/27 01:04:05 DEBUG : S3 bucket rclone-test-gaviduc5lize: bucket is versioned: false
2025/12/27 01:04:05 DEBUG : Waiting for deletions to finish
2025/12/27 01:04:06 INFO  : S3 bucket rclone-test-gaviduc5lize: Bucket "rclone-test-gaviduc5lize" deleted
"./operations.test -test.v -test.timeout 1h0m0s -remote TestS3R2: -verbose -fast-list -test.run '^TestCopyURL$|^TestRcat$/^withChecksum=false,ignoreChecksum=false$'" - Finished OK in 4.728752843s (try 2/5)
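A note on the NOTICE line above: the 48.828Gi figure is not arbitrary. S3 caps a multipart upload at 10,000 parts, so with a fixed streaming chunk size of 5Mi the largest file a streamed upload can hold is 5Mi × 10,000. A quick sketch of that arithmetic (the 10,000-part limit is S3's documented maximum; the 5Mi chunk size is taken from the log):

```python
# Maximum streamed upload size = chunk size x S3's multipart part-count limit.
# Chunk size (5Mi) comes from the NOTICE line in the log above;
# 10,000 is S3's documented maximum number of parts per multipart upload.
MAX_PARTS = 10_000
chunk_size = 5 * 1024**2          # 5 MiB in bytes

max_size_bytes = chunk_size * MAX_PARTS
max_size_gib = max_size_bytes / 1024**3
print(f"{max_size_gib:.3f}Gi")    # 48.828Gi, matching the NOTICE line
```

This is also why raising `--s3-chunk-size` raises the maximum size of a streamed upload: with an unknown input size (note the `offset 0/off` chunk line, where `off` means the total size is not known up front), rclone cannot size the parts to fit and the part-count cap becomes the binding limit.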