"./operations.test -test.v -test.timeout 1h0m0s -remote TestDrive: -verbose" - Starting (try 1/5) 2021/12/13 05:49:52 DEBUG : Creating backend with remote "TestDrive:rclone-test-volomal9xowupum7bugeyat4" 2021/12/13 05:49:52 DEBUG : Using config file from "/home/rclone/.rclone.conf" 2021/12/13 05:49:53 DEBUG : Creating backend with remote "/tmp/rclone884050330" === RUN TestDoMultiThreadCopy --- PASS: TestDoMultiThreadCopy (0.00s) === RUN TestMultithreadCalculateChunks === RUN TestMultithreadCalculateChunks/{size:1_streams:10_wantPartSize:65536_wantStreams:1} === RUN TestMultithreadCalculateChunks/{size:1048576_streams:1_wantPartSize:1048576_wantStreams:1} === RUN TestMultithreadCalculateChunks/{size:1048576_streams:2_wantPartSize:524288_wantStreams:2} === RUN TestMultithreadCalculateChunks/{size:1048577_streams:2_wantPartSize:589824_wantStreams:2} === RUN TestMultithreadCalculateChunks/{size:1048575_streams:2_wantPartSize:524288_wantStreams:2} --- PASS: TestMultithreadCalculateChunks (0.00s) --- PASS: TestMultithreadCalculateChunks/{size:1_streams:10_wantPartSize:65536_wantStreams:1} (0.00s) --- PASS: TestMultithreadCalculateChunks/{size:1048576_streams:1_wantPartSize:1048576_wantStreams:1} (0.00s) --- PASS: TestMultithreadCalculateChunks/{size:1048576_streams:2_wantPartSize:524288_wantStreams:2} (0.00s) --- PASS: TestMultithreadCalculateChunks/{size:1048577_streams:2_wantPartSize:589824_wantStreams:2} (0.00s) --- PASS: TestMultithreadCalculateChunks/{size:1048575_streams:2_wantPartSize:524288_wantStreams:2} (0.00s) === RUN TestMultithreadCopy run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms" === RUN TestMultithreadCopy/{size:131071_streams:2} 2021/12/13 05:49:56 DEBUG : file1: Starting multi-thread copy with 2 parts of size 64Ki 2021/12/13 05:49:56 DEBUG : file1: multi-thread copy: stream 2/2 (65536-131071) size 63.999Ki starting 2021/12/13 05:49:56 DEBUG : file1: multi-thread copy: stream 1/2 (0-65536) size 64Ki starting 2021/12/13 05:49:56 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded) 2021/12/13 05:49:56 DEBUG : pacer: Rate limited, increasing sleep to 1.633764057s 2021/12/13 05:49:57 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. 
=== RUN   TestMultithreadCopy
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
=== RUN   TestMultithreadCopy/{size:131071_streams:2}
2021/12/13 05:49:56 DEBUG : file1: Starting multi-thread copy with 2 parts of size 64Ki
2021/12/13 05:49:56 DEBUG : file1: multi-thread copy: stream 2/2 (65536-131071) size 63.999Ki starting
2021/12/13 05:49:56 DEBUG : file1: multi-thread copy: stream 1/2 (0-65536) size 64Ki starting
2021/12/13 05:49:56 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:49:56 DEBUG : pacer: Rate limited, increasing sleep to 1.633764057s
2021/12/13 05:49:57 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:49:57 DEBUG : pacer: Rate limited, increasing sleep to 2.291647569s
2021/12/13 05:49:57 DEBUG : pacer: Reducing sleep to 0s
2021/12/13 05:49:58 DEBUG : file1: multi-thread copy: stream 2/2 (65536-131071) size 63.999Ki finished
2021/12/13 05:49:59 DEBUG : file1: multi-thread copy: stream 1/2 (0-65536) size 64Ki finished
2021/12/13 05:49:59 DEBUG : file1: Finished multi-thread copy with 2 parts of size 64Ki
=== RUN   TestMultithreadCopy/{size:131072_streams:2}
2021/12/13 05:50:01 DEBUG : file1: Starting multi-thread copy with 2 parts of size 64Ki
2021/12/13 05:50:01 DEBUG : file1: multi-thread copy: stream 2/2 (65536-131072) size 64Ki starting
2021/12/13 05:50:01 DEBUG : file1: multi-thread copy: stream 1/2 (0-65536) size 64Ki starting
2021/12/13 05:50:01 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:50:01 DEBUG : pacer: Rate limited, increasing sleep to 1.007348516s
2021/12/13 05:50:02 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:50:02 DEBUG : pacer: Rate limited, increasing sleep to 2.983506859s
2021/12/13 05:50:02 DEBUG : pacer: Reducing sleep to 0s
2021/12/13 05:50:03 DEBUG : file1: multi-thread copy: stream 1/2 (0-65536) size 64Ki finished
2021/12/13 05:50:03 DEBUG : file1: multi-thread copy: stream 2/2 (65536-131072) size 64Ki finished
2021/12/13 05:50:03 DEBUG : file1: Finished multi-thread copy with 2 parts of size 64Ki
=== RUN   TestMultithreadCopy/{size:131073_streams:2}
2021/12/13 05:50:08 DEBUG : file1: Starting multi-thread copy with 2 parts of size 128Ki
2021/12/13 05:50:08 DEBUG : file1: multi-thread copy: stream 2/2 (131072-131073) size 1 starting
2021/12/13 05:50:08 DEBUG : file1: multi-thread copy: stream 1/2 (0-131072) size 128Ki starting
2021/12/13 05:50:09 DEBUG : file1: multi-thread copy: stream 2/2 (131072-131073) size 1 finished
2021/12/13 05:50:09 DEBUG : file1: multi-thread copy: stream 1/2 (0-131072) size 128Ki finished
2021/12/13 05:50:09 DEBUG : file1: Finished multi-thread copy with 2 parts of size 128Ki
--- PASS: TestMultithreadCopy (17.24s)
--- PASS: TestMultithreadCopy/{size:131071_streams:2} (6.17s)
--- PASS: TestMultithreadCopy/{size:131072_streams:2} (4.57s)
--- PASS: TestMultithreadCopy/{size:131073_streams:2} (5.38s)
=== RUN   TestSizeDiffers
--- PASS: TestSizeDiffers (0.00s)
=== RUN   TestReOpen
=== RUN   TestReOpen/Seek
=== RUN   TestReOpen/Seek/Basics
=== RUN   TestReOpen/Seek/ErrorAtStart
=== RUN   TestReOpen/Seek/WithErrors
2021/12/13 05:50:10 DEBUG : potato: Reopening on read failure after 2 bytes: retry 1/10: test error
2021/12/13 05:50:10 DEBUG : potato: Reopening on read failure after 3 bytes: retry 2/10: test error
2021/12/13 05:50:10 DEBUG : potato: Reopening on read failure after 6 bytes: retry 3/10: test error
=== RUN   TestReOpen/Seek/TooManyErrors
2021/12/13 05:50:10 DEBUG : potato: Reopening on read failure after 2 bytes: retry 1/3: test error
2021/12/13 05:50:10 DEBUG : potato: Reopening on read failure after 3 bytes: retry 2/3: test error
2021/12/13 05:50:10 DEBUG : potato: Reopening on read failure after 6 bytes: retry 3/3: test error
2021/12/13 05:50:10 DEBUG : potato: Reopen failed after 6 bytes read: failed to reopen: too many retries
=== RUN   TestReOpen/Range
=== RUN   TestReOpen/Range/Basics
=== RUN   TestReOpen/Range/ErrorAtStart
=== RUN   TestReOpen/Range/WithErrors
2021/12/13 05:50:10 DEBUG : potato: Reopening on read failure after 2 bytes: retry 1/10: test error
2021/12/13 05:50:10 DEBUG : potato: Reopening on read failure after 3 bytes: retry 2/10: test error
2021/12/13 05:50:10 DEBUG : potato: Reopening on read failure after 6 bytes: retry 3/10: test error
=== RUN   TestReOpen/Range/TooManyErrors
2021/12/13 05:50:10 DEBUG : potato: Reopening on read failure after 2 bytes: retry 1/3: test error
2021/12/13 05:50:10 DEBUG : potato: Reopening on read failure after 3 bytes: retry 2/3: test error
2021/12/13 05:50:10 DEBUG : potato: Reopening on read failure after 6 bytes: retry 3/3: test error
2021/12/13 05:50:10 DEBUG : potato: Reopen failed after 6 bytes read: failed to reopen: too many retries
--- PASS: TestReOpen (0.00s)
--- PASS: TestReOpen/Seek (0.00s)
--- PASS: TestReOpen/Seek/Basics (0.00s)
--- PASS: TestReOpen/Seek/ErrorAtStart (0.00s)
--- PASS: TestReOpen/Seek/WithErrors (0.00s)
--- PASS: TestReOpen/Seek/TooManyErrors (0.00s)
--- PASS: TestReOpen/Range (0.00s)
--- PASS: TestReOpen/Range/Basics (0.00s)
--- PASS: TestReOpen/Range/ErrorAtStart (0.00s)
--- PASS: TestReOpen/Range/WithErrors (0.00s)
--- PASS: TestReOpen/Range/TooManyErrors (0.00s)
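[Editor's note] The TestReOpen results above exercise a reader that, on a read failure, reopens the source at the byte offset already read (via seek or a range request) and retries, up to a limit, then gives up with "too many retries". Below is a minimal sketch of that pattern under the assumption of a hypothetical open-at-offset callback; it mirrors only the logged behaviour and is not rclone's implementation.

    package main

    import (
        "errors"
        "fmt"
        "io"
        "strings"
    )

    // reOpenReader retries reads by reopening the source at the current offset,
    // as in the "Reopening on read failure after N bytes: retry i/max" lines above.
    type reOpenReader struct {
        open     func(offset int64) (io.Reader, error) // hypothetical reopen-at-offset callback
        r        io.Reader
        offset   int64
        tries    int
        maxTries int
    }

    func (f *reOpenReader) Read(p []byte) (int, error) {
        for {
            n, err := f.r.Read(p)
            f.offset += int64(n)
            if err == nil || errors.Is(err, io.EOF) {
                return n, err
            }
            if n > 0 {
                return n, nil // hand back what we got; the failure resurfaces on the next call
            }
            f.tries++
            if f.tries > f.maxTries {
                return 0, fmt.Errorf("failed to reopen: too many retries: %w", err)
            }
            fmt.Printf("Reopening on read failure after %d bytes: retry %d/%d: %v\n", f.offset, f.tries, f.maxTries, err)
            nr, openErr := f.open(f.offset)
            if openErr != nil {
                return 0, openErr
            }
            f.r = nr
        }
    }

    type errReader struct{}

    func (errReader) Read([]byte) (int, error) { return 0, errors.New("test error") }

    func main() {
        const data = "potato data"
        failures := 2 // the first two opens return a reader that fails after 2 bytes
        open := func(offset int64) (io.Reader, error) {
            r := strings.NewReader(data[offset:])
            if failures > 0 {
                failures--
                return io.MultiReader(io.LimitReader(r, 2), errReader{}), nil
            }
            return r, nil
        }
        first, _ := open(0)
        rd := &reOpenReader{open: open, r: first, maxTries: 10}
        out, err := io.ReadAll(rd)
        fmt.Printf("read %q err=%v\n", out, err)
    }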
=== RUN   TestCheck
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
=== RUN   TestCheck/1
=== RUN   TestCheck/2
=== RUN   TestCheck/3
=== RUN   TestCheck/4
=== RUN   TestCheck/5
=== RUN   TestCheck/6
=== RUN   TestCheck/7
--- PASS: TestCheck (17.44s)
--- PASS: TestCheck/1 (0.32s)
--- PASS: TestCheck/2 (0.32s)
--- PASS: TestCheck/3 (0.32s)
--- PASS: TestCheck/4 (0.33s)
--- PASS: TestCheck/5 (2.34s)
--- PASS: TestCheck/6 (0.27s)
--- PASS: TestCheck/7 (0.28s)
=== RUN   TestCheckFsError
2021/12/13 05:50:27 DEBUG : Creating backend with remote "non-existent"
2021/12/13 05:50:27 DEBUG : Creating backend with remote "non-existent"
2021/12/13 05:50:27 DEBUG : Local file system at /home/rclone/go/src/github.com/rclone/rclone/fs/operations/non-existent: Waiting for checks to finish
2021/12/13 05:50:27 ERROR : Local file system at /home/rclone/go/src/github.com/rclone/rclone/fs/operations/non-existent: error reading source root directory: directory not found
2021/12/13 05:50:27 NOTICE: Local file system at /home/rclone/go/src/github.com/rclone/rclone/fs/operations/non-existent: 2 differences found
2021/12/13 05:50:27 NOTICE: Local file system at /home/rclone/go/src/github.com/rclone/rclone/fs/operations/non-existent: 2 errors while checking
--- PASS: TestCheckFsError (0.00s)
=== RUN   TestCheckDownload
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
=== RUN   TestCheckDownload/1
=== RUN   TestCheckDownload/2
=== RUN   TestCheckDownload/3
=== RUN   TestCheckDownload/4
=== RUN   TestCheckDownload/5
=== RUN   TestCheckDownload/6
=== RUN   TestCheckDownload/7
--- PASS: TestCheckDownload (17.56s)
--- PASS: TestCheckDownload/1 (0.84s)
--- PASS: TestCheckDownload/2 (0.83s)
--- PASS: TestCheckDownload/3 (0.85s)
--- PASS: TestCheckDownload/4 (1.34s)
--- PASS: TestCheckDownload/5 (1.20s)
--- PASS: TestCheckDownload/6 (1.19s)
--- PASS: TestCheckDownload/7 (1.08s)
=== RUN   TestCheckSizeOnly
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
=== RUN   TestCheckSizeOnly/1
=== RUN   TestCheckSizeOnly/2
=== RUN   TestCheckSizeOnly/3
=== RUN   TestCheckSizeOnly/4
=== RUN   TestCheckSizeOnly/5
=== RUN   TestCheckSizeOnly/6
=== RUN   TestCheckSizeOnly/7
--- PASS: TestCheckSizeOnly (12.23s)
--- PASS: TestCheckSizeOnly/1 (0.26s)
--- PASS: TestCheckSizeOnly/2 (0.26s)
--- PASS: TestCheckSizeOnly/3 (0.26s)
--- PASS: TestCheckSizeOnly/4 (0.27s)
--- PASS: TestCheckSizeOnly/5 (0.28s)
--- PASS: TestCheckSizeOnly/6 (0.26s)
--- PASS: TestCheckSizeOnly/7 (0.27s)
=== RUN   TestCheckEqualReaders
--- PASS: TestCheckEqualReaders (0.00s)
=== RUN   TestParseSumFile
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:51:00 NOTICE: test.sum: improperly formatted checksum line 4
2021/12/13 05:51:00 NOTICE: test.sum: improperly formatted checksum line 5
2021/12/13 05:51:00 NOTICE: test.sum: improperly formatted checksum line 6
2021/12/13 05:51:00 NOTICE: test.sum: 2 warning(s) suppressed...
2021/12/13 05:51:02 NOTICE: test.sum: improperly formatted checksum line 4
2021/12/13 05:51:02 NOTICE: test.sum: improperly formatted checksum line 5
2021/12/13 05:51:02 NOTICE: test.sum: improperly formatted checksum line 6
2021/12/13 05:51:02 NOTICE: test.sum: 2 warning(s) suppressed...
--- PASS: TestParseSumFile (6.52s)
=== RUN   TestCheckSum
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:51:04 DEBUG : Creating backend with remote "TestDrive:rclone-test-volomal9xowupum7bugeyat4/data"
=== RUN   TestCheckSum/subtest1
=== RUN   TestCheckSum/subtest2
=== RUN   TestCheckSum/subtest3
=== RUN   TestCheckSum/subtest4
=== RUN   TestCheckSum/subtest5
=== RUN   TestCheckSum/subtest6
=== RUN   TestCheckSum/subtest7
2021/12/13 05:51:39 DEBUG : data: Rmdir: contains trashed file: "banana"
2021/12/13 05:51:39 DEBUG : data: Rmdir: contains trashed file: "potato"
--- PASS: TestCheckSum (36.27s)
--- PASS: TestCheckSum/subtest1 (1.88s)
--- PASS: TestCheckSum/subtest2 (1.61s)
--- PASS: TestCheckSum/subtest3 (1.15s)
--- PASS: TestCheckSum/subtest4 (1.25s)
--- PASS: TestCheckSum/subtest5 (1.24s)
--- PASS: TestCheckSum/subtest6 (1.21s)
--- PASS: TestCheckSum/subtest7 (1.23s)
=== RUN   TestCheckSumDownload
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:51:40 DEBUG : Creating backend with remote "TestDrive:rclone-test-volomal9xowupum7bugeyat4/data"
=== RUN   TestCheckSumDownload/subtest1
=== RUN   TestCheckSumDownload/subtest2
=== RUN   TestCheckSumDownload/subtest3
=== RUN   TestCheckSumDownload/subtest4
=== RUN   TestCheckSumDownload/subtest5
=== RUN   TestCheckSumDownload/subtest6
=== RUN   TestCheckSumDownload/subtest7
2021/12/13 05:52:22 DEBUG : data: Rmdir: contains trashed file: "potato"
2021/12/13 05:52:22 DEBUG : data: Rmdir: contains trashed file: "banana"
--- PASS: TestCheckSumDownload (42.46s)
--- PASS: TestCheckSumDownload/subtest1 (2.83s)
--- PASS: TestCheckSumDownload/subtest2 (2.38s)
--- PASS: TestCheckSumDownload/subtest3 (2.20s)
--- PASS: TestCheckSumDownload/subtest4 (1.96s)
--- PASS: TestCheckSumDownload/subtest5 (1.95s)
--- PASS: TestCheckSumDownload/subtest6 (2.04s)
--- PASS: TestCheckSumDownload/subtest7 (2.23s)
=== RUN   TestDeduplicateInteractive
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:52:27 INFO : Google drive root 'rclone-test-volomal9xowupum7bugeyat4': Looking for duplicate names using interactive mode.
2021/12/13 05:52:27 NOTICE: one: Found 3 files with duplicate names
2021/12/13 05:52:27 NOTICE: one: Deleting 2/3 identical duplicates (md5 0f6d3e59c45e54d0f2cb30fb594620a3)
2021/12/13 05:52:28 INFO : one: Deleted
2021/12/13 05:52:29 INFO : one: Deleted
2021/12/13 05:52:29 NOTICE: one: All duplicates removed
--- PASS: TestDeduplicateInteractive (7.54s)
=== RUN   TestDeduplicateSkip
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:52:34 INFO : Google drive root 'rclone-test-volomal9xowupum7bugeyat4': Looking for duplicate names using skip mode.
2021/12/13 05:52:35 NOTICE: one: Found 3 files with duplicate names
2021/12/13 05:52:35 NOTICE: one: Deleting 1/2 identical duplicates (md5 0f6d3e59c45e54d0f2cb30fb594620a3)
2021/12/13 05:52:35 INFO : one: Deleted
2021/12/13 05:52:35 NOTICE: one: Skipping 2 files with duplicate names
--- PASS: TestDeduplicateSkip (7.34s)
=== RUN   TestDeduplicateSizeOnly
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:52:41 INFO : Google drive root 'rclone-test-volomal9xowupum7bugeyat4': Looking for duplicate names using skip mode.
2021/12/13 05:52:42 NOTICE: one: Found 3 files with duplicate names
2021/12/13 05:52:42 NOTICE: one: Deleting 1/2 identical duplicates (size 11)
2021/12/13 05:52:43 INFO : one: Deleted
2021/12/13 05:52:43 NOTICE: one: Skipping 2 files with duplicate names
--- PASS: TestDeduplicateSizeOnly (7.44s)
=== RUN   TestDeduplicateFirst
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:52:49 INFO : Google drive root 'rclone-test-volomal9xowupum7bugeyat4': Looking for duplicate names using first mode.
2021/12/13 05:52:50 NOTICE: one: Found 3 files with duplicate names
2021/12/13 05:52:50 INFO : one: Deleted
2021/12/13 05:52:51 INFO : one: Deleted
2021/12/13 05:52:51 NOTICE: one: Deleted 2 extra copies
--- PASS: TestDeduplicateFirst (7.48s)
=== RUN   TestDeduplicateNewest
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:52:56 INFO : Google drive root 'rclone-test-volomal9xowupum7bugeyat4': Looking for duplicate names using newest mode.
2021/12/13 05:52:57 NOTICE: one: Found 3 files with duplicate names
2021/12/13 05:52:58 INFO : one: Deleted
2021/12/13 05:52:58 INFO : one: Deleted
2021/12/13 05:52:58 NOTICE: one: Deleted 2 extra copies
--- PASS: TestDeduplicateNewest (7.32s)
=== RUN   TestDeduplicateNewestByHash
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:53:07 INFO : Google drive root 'rclone-test-volomal9xowupum7bugeyat4': Looking for duplicate md5 hashes using newest mode.
2021/12/13 05:53:08 NOTICE: 149f868e8478b73ad9103c9815605215: Found 3 files with duplicate md5 hashes
2021/12/13 05:53:08 INFO : one: Deleted
2021/12/13 05:53:09 INFO : also/one: Deleted
2021/12/13 05:53:09 NOTICE: 149f868e8478b73ad9103c9815605215: Deleted 2 extra copies
2021/12/13 05:53:12 DEBUG : also: Rmdir: contains trashed file: "one"
--- PASS: TestDeduplicateNewestByHash (13.13s)
=== RUN   TestDeduplicateOldest
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:53:17 INFO : Google drive root 'rclone-test-volomal9xowupum7bugeyat4': Looking for duplicate names using oldest mode.
2021/12/13 05:53:17 NOTICE: one: Found 3 files with duplicate names
2021/12/13 05:53:18 INFO : one: Deleted
2021/12/13 05:53:19 INFO : one: Deleted
2021/12/13 05:53:19 NOTICE: one: Deleted 2 extra copies
--- PASS: TestDeduplicateOldest (7.47s)
=== RUN   TestDeduplicateLargest
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:53:24 INFO : Google drive root 'rclone-test-volomal9xowupum7bugeyat4': Looking for duplicate names using largest mode.
2021/12/13 05:53:25 NOTICE: one: Found 3 files with duplicate names
2021/12/13 05:53:26 INFO : one: Deleted
2021/12/13 05:53:26 INFO : one: Deleted
2021/12/13 05:53:26 NOTICE: one: Deleted 2 extra copies
--- PASS: TestDeduplicateLargest (7.45s)
=== RUN   TestDeduplicateSmallest
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:53:32 INFO : Google drive root 'rclone-test-volomal9xowupum7bugeyat4': Looking for duplicate names using smallest mode.
2021/12/13 05:53:32 NOTICE: one: Found 3 files with duplicate names
2021/12/13 05:53:33 INFO : one: Deleted
2021/12/13 05:53:34 INFO : one: Deleted
2021/12/13 05:53:34 NOTICE: one: Deleted 2 extra copies
--- PASS: TestDeduplicateSmallest (7.75s)
=== RUN   TestDeduplicateRename
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:53:41 INFO : Google drive root 'rclone-test-volomal9xowupum7bugeyat4': Looking for duplicate names using rename mode.
2021/12/13 05:53:41 NOTICE: one.txt: Found 3 files with duplicate names
2021/12/13 05:53:43 INFO : one-2.txt: renamed from: one.txt
2021/12/13 05:53:45 INFO : one-3.txt: renamed from: one.txt
2021/12/13 05:53:46 INFO : one-4.txt: renamed from: one.txt
--- PASS: TestDeduplicateRename (14.28s)
=== RUN   TestMergeDirs
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:53:58 INFO : dupe2: merging "two.txt"
2021/12/13 05:53:59 INFO : dupe2: removing empty directory
2021/12/13 05:54:00 INFO : dupe3: merging "three.txt"
2021/12/13 05:54:00 INFO : dupe3: removing empty directory
2021/12/13 05:54:05 DEBUG : dupe1: Rmdir: contains trashed file: "three.txt"
2021/12/13 05:54:05 DEBUG : dupe1: Rmdir: contains trashed file: "two.txt"
2021/12/13 05:54:05 DEBUG : dupe1: Rmdir: contains trashed file: "one.txt"
--- PASS: TestMergeDirs (16.17s)
=== RUN   TestListDirSorted
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:54:21 DEBUG : a.txt: Excluded
2021/12/13 05:54:22 DEBUG : sub dir/hello world2: Excluded
2021/12/13 05:54:22 DEBUG : sub dir/hello world: Excluded
2021/12/13 05:54:23 DEBUG : sub dir/ignore dir: Excluded
2021/12/13 05:54:23 DEBUG : sub dir/hello world2: Excluded
2021/12/13 05:54:23 DEBUG : sub dir/hello world: Excluded
2021/12/13 05:54:23 DEBUG : sub dir/ignore dir: Excluded
2021/12/13 05:54:30 DEBUG : sub dir/sub sub dir: Rmdir: contains trashed file: "hello world3"
2021/12/13 05:54:31 DEBUG : sub dir/ignore dir: Rmdir: contains trashed file: "should be ignored"
2021/12/13 05:54:31 DEBUG : sub dir/ignore dir: Rmdir: contains trashed file: ".ignore"
2021/12/13 05:54:32 DEBUG : sub dir: Rmdir: contains trashed file: "sub sub dir"
2021/12/13 05:54:32 DEBUG : sub dir: Rmdir: contains trashed file: "ignore dir"
2021/12/13 05:54:32 DEBUG : sub dir: Rmdir: contains trashed file: "hello world2"
2021/12/13 05:54:32 DEBUG : sub dir: Rmdir: contains trashed file: "hello world"
--- PASS: TestListDirSorted (26.88s)
=== RUN   TestListJSON
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
=== RUN   TestListJSON/Default
=== RUN   TestListJSON/FilesOnly
=== RUN   TestListJSON/DirsOnly
=== RUN   TestListJSON/Recurse
=== RUN   TestListJSON/SubDir
=== RUN   TestListJSON/NoModTime
=== RUN   TestListJSON/NoMimeType
=== RUN   TestListJSON/ShowHash
=== RUN   TestListJSON/HashTypes
2021/12/13 05:54:43 DEBUG : sub: Rmdir: contains trashed file: "file2"
--- PASS: TestListJSON (11.37s)
--- PASS: TestListJSON/Default (0.41s)
--- PASS: TestListJSON/FilesOnly (0.35s)
--- PASS: TestListJSON/DirsOnly (0.29s)
--- PASS: TestListJSON/Recurse (0.63s)
--- PASS: TestListJSON/SubDir (0.34s)
--- PASS: TestListJSON/NoModTime (0.30s)
--- PASS: TestListJSON/NoMimeType (0.33s)
--- PASS: TestListJSON/ShowHash (0.30s)
--- PASS: TestListJSON/HashTypes (0.31s)
=== RUN   TestStatJSON
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
=== RUN   TestStatJSON/Root
=== RUN   TestStatJSON/RootFilesOnly
=== RUN   TestStatJSON/RootDirsOnly
=== RUN   TestStatJSON/Dir
=== RUN   TestStatJSON/File
=== RUN   TestStatJSON/NotFound
=== RUN   TestStatJSON/DirFilesOnly
=== RUN   TestStatJSON/FileFilesOnly
=== RUN   TestStatJSON/NotFoundFilesOnly
=== RUN   TestStatJSON/DirDirsOnly
=== RUN   TestStatJSON/FileDirsOnly
=== RUN   TestStatJSON/NotFoundDirsOnly
=== RUN   TestStatJSON/RootNotFound
2021/12/13 05:54:54 DEBUG : Creating backend with remote "TestDrive:rclone-test-volomal9xowupum7bugeyat4/notfound"
2021/12/13 05:54:58 DEBUG : sub: Rmdir: contains trashed file: "file2"
--- PASS: TestStatJSON (14.44s)
--- PASS: TestStatJSON/Root (0.29s)
--- PASS: TestStatJSON/RootFilesOnly (0.00s)
--- PASS: TestStatJSON/RootDirsOnly (0.30s)
--- PASS: TestStatJSON/Dir (0.61s)
--- PASS: TestStatJSON/File (0.30s)
--- PASS: TestStatJSON/NotFound (0.56s)
--- PASS: TestStatJSON/DirFilesOnly (0.29s)
--- PASS: TestStatJSON/FileFilesOnly (0.29s)
--- PASS: TestStatJSON/NotFoundFilesOnly (1.14s)
--- PASS: TestStatJSON/DirDirsOnly (0.33s)
--- PASS: TestStatJSON/FileDirsOnly (0.31s)
--- PASS: TestStatJSON/NotFoundDirsOnly (0.28s)
--- PASS: TestStatJSON/RootNotFound (1.58s)
=== RUN   TestMkdir
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:54:59 DEBUG : Google drive root 'rclone-test-volomal9xowupum7bugeyat4': Making directory
2021/12/13 05:54:59 DEBUG : Google drive root 'rclone-test-volomal9xowupum7bugeyat4': Making directory
--- PASS: TestMkdir (0.86s)
=== RUN   TestLsd
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:55:04 DEBUG : sub dir: Rmdir: contains trashed file: "hello world"
--- PASS: TestLsd (5.97s)
=== RUN   TestLs
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
--- PASS: TestLs (5.66s)
=== RUN   TestLsWithFilesFrom
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:55:15 DEBUG : empty space: Excluded
--- PASS: TestLsWithFilesFrom (5.69s)
=== RUN   TestLsLong
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
--- PASS: TestLsLong (5.54s)
=== RUN   TestHashSums
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
=== RUN   TestHashSums/Md5
=== RUN   TestHashSums/Md5Download
--- PASS: TestHashSums (6.96s)
--- PASS: TestHashSums/Md5 (0.36s)
--- PASS: TestHashSums/Md5Download (1.25s)
=== RUN   TestHashSumsWithErrors
2021/12/13 05:55:29 DEBUG : Creating backend with remote ":memory:"
2021/12/13 05:55:29 ERROR : file1: hash unsupported: hash type not supported
--- PASS: TestHashSumsWithErrors (0.00s)
=== RUN   TestHashStream
2021/12/13 05:55:29 DEBUG : Creating md5 hash of 0 bytes read from input stream
2021/12/13 05:55:29 DEBUG : Creating md5 hash of 0 bytes read from input stream
2021/12/13 05:55:29 DEBUG : Creating sha1 hash of 0 bytes read from input stream
2021/12/13 05:55:29 DEBUG : Creating sha1 hash of 0 bytes read from input stream
2021/12/13 05:55:29 DEBUG : Creating md5 hash of 12 bytes read from input stream
2021/12/13 05:55:29 DEBUG : Creating md5 hash of 12 bytes read from input stream
2021/12/13 05:55:29 DEBUG : Creating sha1 hash of 12 bytes read from input stream
2021/12/13 05:55:29 DEBUG : Creating sha1 hash of 12 bytes read from input stream
--- PASS: TestHashStream (0.00s)
=== RUN   TestSuffixName
--- PASS: TestSuffixName (0.00s)
=== RUN   TestCount
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:55:39 DEBUG : sub dir: Rmdir: contains trashed file: "potato3"
--- PASS: TestCount (10.25s)
=== RUN   TestDelete
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:55:44 DEBUG : Waiting for deletions to finish
2021/12/13 05:55:45 DEBUG : large: Excluded from sync (and deletion)
2021/12/13 05:55:45 INFO : small: Deleted
2021/12/13 05:55:45 INFO : medium: Deleted
--- PASS: TestDelete (7.61s)
=== RUN   TestRetry
2021/12/13 05:55:47 DEBUG : Received error: EOF - low level retry 1/5
2021/12/13 05:55:47 DEBUG : Received error: EOF - low level retry 2/5
2021/12/13 05:55:47 DEBUG : Received error: EOF - low level retry 1/5
2021/12/13 05:55:47 DEBUG : Received error: EOF - low level retry 2/5
2021/12/13 05:55:47 DEBUG : Received error: EOF - low level retry 3/5
2021/12/13 05:55:47 DEBUG : Received error: EOF - low level retry 4/5
2021/12/13 05:55:47 DEBUG : Received error: EOF - low level retry 5/5
--- PASS: TestRetry (0.00s)
=== RUN   TestCat
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
--- PASS: TestCat (11.78s)
=== RUN   TestPurge
2021/12/13 05:55:59 DEBUG : Creating backend with remote "TestDrive:rclone-test-fegaqib0legajuf7dolayel0"
2021/12/13 05:55:59 DEBUG : Using config file from "/home/rclone/.rclone.conf"
2021/12/13 05:55:59 DEBUG : Creating backend with remote "/tmp/rclone3959911728"
run.go:181: Remote "Google drive root 'rclone-test-fegaqib0legajuf7dolayel0'", Local "Local file system at /tmp/rclone3959911728", Modify Window "1ms"
2021/12/13 05:56:05 DEBUG : A2: Making directory
2021/12/13 05:56:06 DEBUG : A1/B2: Making directory
2021/12/13 05:56:07 DEBUG : A1/B2/C2: Making directory
2021/12/13 05:56:08 DEBUG : A1/B1/C3: Making directory
2021/12/13 05:56:09 DEBUG : A3: Making directory
2021/12/13 05:56:10 DEBUG : A3/B3: Making directory
2021/12/13 05:56:11 DEBUG : A3/B3/C4: Making directory
2021/12/13 05:56:15 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:56:15 DEBUG : pacer: Rate limited, increasing sleep to 1.716042288s
2021/12/13 05:56:15 DEBUG : pacer: Reducing sleep to 0s
2021/12/13 05:56:17 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:56:17 DEBUG : pacer: Rate limited, increasing sleep to 1.336566009s
2021/12/13 05:56:17 DEBUG : pacer: Reducing sleep to 0s
2021/12/13 05:56:18 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:56:18 DEBUG : pacer: Rate limited, increasing sleep to 1.812422869s
2021/12/13 05:56:19 DEBUG : pacer: Reducing sleep to 0s
2021/12/13 05:56:22 DEBUG : Google drive root 'rclone-test-fegaqib0legajuf7dolayel0': Purge remote
2021/12/13 05:56:22 purge failed: directory not found
--- PASS: TestPurge (23.41s)
=== RUN   TestRmdirsNoLeaveRoot
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:56:27 DEBUG : A2: Making directory
2021/12/13 05:56:28 DEBUG : A1/B2: Making directory
2021/12/13 05:56:29 DEBUG : A1/B2/C2: Making directory
2021/12/13 05:56:30 DEBUG : A1/B1/C3: Making directory
2021/12/13 05:56:31 DEBUG : A3: Making directory
2021/12/13 05:56:32 DEBUG : A3/B3: Making directory
2021/12/13 05:56:33 DEBUG : A3/B3/C4: Making directory
2021/12/13 05:56:37 INFO : A3/B3/C4: Removing directory
2021/12/13 05:56:37 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:56:37 DEBUG : pacer: Rate limited, increasing sleep to 1.568410414s
2021/12/13 05:56:38 DEBUG : pacer: Reducing sleep to 0s
2021/12/13 05:56:42 INFO : A3/B3: Removing directory
2021/12/13 05:56:43 DEBUG : A3/B3: Rmdir: contains trashed file: "C4"
2021/12/13 05:56:43 INFO : A3: Removing directory
2021/12/13 05:56:44 DEBUG : A3: Rmdir: contains trashed file: "B3"
2021/12/13 05:56:44 INFO : A2: Removing directory
2021/12/13 05:56:45 INFO : A1/B2/C2: Removing directory
2021/12/13 05:56:46 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:56:46 DEBUG : pacer: Rate limited, increasing sleep to 1.080513464s
2021/12/13 05:56:46 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:56:46 DEBUG : pacer: Rate limited, increasing sleep to 2.922874954s
2021/12/13 05:56:47 DEBUG : pacer: low level retry 3/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:56:47 DEBUG : pacer: Rate limited, increasing sleep to 4.035983335s
2021/12/13 05:56:50 DEBUG : pacer: Reducing sleep to 0s
2021/12/13 05:56:50 INFO : A1/B2: Removing directory
2021/12/13 05:56:54 DEBUG : A1/B2: Rmdir: contains trashed file: "C2"
2021/12/13 05:56:55 INFO : A1/B1/C3: Removing directory
2021/12/13 05:57:00 DEBUG : A1/B1/C1: Rmdir: contains trashed file: "one"
2021/12/13 05:57:01 DEBUG : A1/B1: Rmdir: contains trashed file: "C3"
2021/12/13 05:57:01 DEBUG : A1/B1: Rmdir: contains trashed file: "C1"
2021/12/13 05:57:02 DEBUG : A1: Rmdir: contains trashed file: "B2"
2021/12/13 05:57:02 DEBUG : A1: Rmdir: contains trashed file: "B1"
2021/12/13 05:57:02 DEBUG : A1: Rmdir: contains trashed file: "two"
--- PASS: TestRmdirsNoLeaveRoot (40.62s)
=== RUN   TestRmdirsLeaveRoot
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:57:03 DEBUG : A1: Making directory
2021/12/13 05:57:04 DEBUG : A1/B1: Making directory
2021/12/13 05:57:05 DEBUG : A1/B1/C1: Making directory
2021/12/13 05:57:07 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:57:07 DEBUG : pacer: Rate limited, increasing sleep to 1.247212554s
2021/12/13 05:57:07 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:57:07 DEBUG : pacer: Rate limited, increasing sleep to 2.585069199s
2021/12/13 05:57:08 DEBUG : pacer: Reducing sleep to 0s
2021/12/13 05:57:12 INFO : A1/B1/C1: Removing directory
2021/12/13 05:57:13 INFO : A1/B1: Removing directory
2021/12/13 05:57:13 DEBUG : A1/B1: Rmdir: contains trashed file: "C1"
2021/12/13 05:57:15 DEBUG : A1: Rmdir: contains trashed file: "B1"
--- PASS: TestRmdirsLeaveRoot (13.45s)
=== RUN   TestRmdirsWithFilter
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:57:16 DEBUG : A1: Making directory
2021/12/13 05:57:17 DEBUG : A1/B1: Making directory
2021/12/13 05:57:19 DEBUG : A1/B1/C1: Making directory
2021/12/13 05:57:22 INFO : A1/B1/C1: Removing directory
2021/12/13 05:57:23 INFO : A1/B1: Removing directory
2021/12/13 05:57:24 DEBUG : A1/B1: Rmdir: contains trashed file: "C1"
2021/12/13 05:57:26 DEBUG : A1: Rmdir: contains trashed file: "B1"
--- PASS: TestRmdirsWithFilter (10.41s)
=== RUN   TestCopyURL
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:57:27 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:57:27 DEBUG : pacer: Rate limited, increasing sleep to 1.410792415s
2021/12/13 05:57:27 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:57:27 DEBUG : pacer: Rate limited, increasing sleep to 2.013752768s
2021/12/13 05:57:29 DEBUG : pacer: Reducing sleep to 0s
2021/12/13 05:57:32 DEBUG : filename.txt: File name found in url
--- PASS: TestCopyURL (12.00s)
=== RUN   TestCopyURLToWriter
--- PASS: TestCopyURLToWriter (0.00s)
=== RUN   TestMoveFile
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:57:39 DEBUG : file1: Need to transfer - File not found at Destination
2021/12/13 05:57:42 DEBUG : file1: md5 = 0ef726ce9b1a7692357ff70dd321d595 OK
2021/12/13 05:57:42 INFO : file1: Copied (new) to: sub/file2
2021/12/13 05:57:42 INFO : file1: Deleted
2021/12/13 05:57:43 DEBUG : file1: Size and modification time the same (differ by -999.999µs, within tolerance 1ms)
2021/12/13 05:57:43 DEBUG : file1: Unchanged skipping
2021/12/13 05:57:43 INFO : file1: Deleted
2021/12/13 05:57:43 DEBUG : Google drive root 'rclone-test-volomal9xowupum7bugeyat4': don't need to copy/move sub/file2, it is already at target location
2021/12/13 05:57:46 DEBUG : sub: Rmdir: contains trashed file: "file2"
--- PASS: TestMoveFile (7.90s)
=== RUN   TestMoveFileWithIgnoreExisting
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:57:47 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:57:47 DEBUG : pacer: Rate limited, increasing sleep to 1.741210381s
2021/12/13 05:57:47 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:57:47 DEBUG : pacer: Rate limited, increasing sleep to 2.055561582s
2021/12/13 05:57:49 DEBUG : pacer: Reducing sleep to 0s
2021/12/13 05:57:49 DEBUG : file1: Need to transfer - File not found at Destination
2021/12/13 05:57:52 DEBUG : file1: md5 = 0ef726ce9b1a7692357ff70dd321d595 OK
2021/12/13 05:57:52 INFO : file1: Copied (new)
2021/12/13 05:57:52 INFO : file1: Deleted
2021/12/13 05:57:53 DEBUG : file1: Destination exists, skipping
2021/12/13 05:57:53 DEBUG : file1: Not removing source file as destination file exists and --ignore-existing is set
--- PASS: TestMoveFileWithIgnoreExisting (7.84s)
=== RUN   TestCaseInsensitiveMoveFile
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
--- PASS: TestCaseInsensitiveMoveFile (0.57s)
=== RUN   TestMoveFileBackupDir
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:57:59 DEBUG : Creating backend with remote "TestDrive:rclone-test-volomal9xowupum7bugeyat4/backup"
2021/12/13 05:58:00 DEBUG : dst/file1: Sizes differ (src 14 vs dst 18)
2021/12/13 05:58:03 INFO : dst/file1: Moved (server-side)
2021/12/13 05:58:05 DEBUG : dst/file1: md5 = 0ef726ce9b1a7692357ff70dd321d595 OK
2021/12/13 05:58:05 INFO : dst/file1: Copied (new)
2021/12/13 05:58:05 INFO : dst/file1: Deleted
2021/12/13 05:58:08 DEBUG : dst: Rmdir: contains trashed file: "file1"
2021/12/13 05:58:09 DEBUG : backup/dst: Rmdir: contains trashed file: "file1"
2021/12/13 05:58:10 DEBUG : backup: Rmdir: contains trashed file: "dst"
--- PASS: TestMoveFileBackupDir (16.10s)
=== RUN   TestCopyFile
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:58:11 DEBUG : file1: Need to transfer - File not found at Destination
2021/12/13 05:58:14 DEBUG : file1: md5 = 0ef726ce9b1a7692357ff70dd321d595 OK
2021/12/13 05:58:14 INFO : file1: Copied (new) to: sub/file2
2021/12/13 05:58:15 DEBUG : file1: Size and modification time the same (differ by -999.999µs, within tolerance 1ms)
2021/12/13 05:58:15 DEBUG : file1: Unchanged skipping
2021/12/13 05:58:16 DEBUG : Google drive root 'rclone-test-volomal9xowupum7bugeyat4': don't need to copy/move sub/file2, it is already at target location
2021/12/13 05:58:18 DEBUG : sub: Rmdir: contains trashed file: "file2"
--- PASS: TestCopyFile (7.70s)
=== RUN   TestCopyFileBackupDir
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:58:22 DEBUG : Creating backend with remote "TestDrive:rclone-test-volomal9xowupum7bugeyat4/backup"
2021/12/13 05:58:24 DEBUG : dst/file1: Sizes differ (src 14 vs dst 18)
2021/12/13 05:58:27 INFO : dst/file1: Moved (server-side)
2021/12/13 05:58:28 DEBUG : dst/file1: md5 = 0ef726ce9b1a7692357ff70dd321d595 OK
2021/12/13 05:58:28 INFO : dst/file1: Copied (new)
2021/12/13 05:58:31 DEBUG : dst: Rmdir: contains trashed file: "file1"
2021/12/13 05:58:32 DEBUG : backup/dst: Rmdir: contains trashed file: "file1"
2021/12/13 05:58:33 DEBUG : backup: Rmdir: contains trashed file: "dst"
--- PASS: TestCopyFileBackupDir (15.57s)
=== RUN   TestCopyFileCompareDest
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:58:34 DEBUG : Creating backend with remote "TestDrive:rclone-test-volomal9xowupum7bugeyat4/dst"
2021/12/13 05:58:36 DEBUG : Creating backend with remote "TestDrive:rclone-test-volomal9xowupum7bugeyat4/CompareDest"
2021/12/13 05:58:36 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:58:36 DEBUG : pacer: Rate limited, increasing sleep to 1.446704112s
2021/12/13 05:58:36 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:58:36 DEBUG : pacer: Rate limited, increasing sleep to 2.499319925s
2021/12/13 05:58:38 DEBUG : pacer: Reducing sleep to 0s
2021/12/13 05:58:41 DEBUG : one: Need to transfer - File not found at Destination
2021/12/13 05:58:43 DEBUG : one: md5 = f97c5d29941bfb1b2fdab0874906ab82 OK
2021/12/13 05:58:43 INFO : one: Copied (new)
2021/12/13 05:58:45 DEBUG : one: Sizes differ (src 5 vs dst 3)
2021/12/13 05:58:47 DEBUG : one: md5 = 07912d142f5d63ee918b34796b5a2432 OK
2021/12/13 05:58:47 INFO : one: Copied (replaced existing)
2021/12/13 05:58:47 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:58:47 DEBUG : pacer: Rate limited, increasing sleep to 1.904950717s
2021/12/13 05:58:47 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:58:47 DEBUG : pacer: Rate limited, increasing sleep to 2.401252915s
2021/12/13 05:58:49 DEBUG : pacer: Reducing sleep to 0s
2021/12/13 05:58:57 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:58:57 DEBUG : pacer: Rate limited, increasing sleep to 1.974622367s
2021/12/13 05:58:57 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:58:57 DEBUG : pacer: Rate limited, increasing sleep to 2.097117225s
2021/12/13 05:58:59 DEBUG : pacer: Reducing sleep to 0s
2021/12/13 05:58:59 DEBUG : one: Size and modification time the same (differ by -456.789µs, within tolerance 1ms)
2021/12/13 05:58:59 DEBUG : one: Destination found in --compare-dest, skipping
2021/12/13 05:59:02 DEBUG : two: Size and modification time the same (differ by -456.789µs, within tolerance 1ms)
2021/12/13 05:59:02 DEBUG : two: Destination found in --compare-dest, skipping
2021/12/13 05:59:03 DEBUG : two: Size and modification time the same (differ by -456.789µs, within tolerance 1ms)
2021/12/13 05:59:03 DEBUG : two: Destination found in --compare-dest, skipping
2021/12/13 05:59:05 DEBUG : two: Sizes differ (src 5 vs dst 3)
2021/12/13 05:59:05 DEBUG : two: Need to transfer - File not found at Destination
2021/12/13 05:59:06 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:59:06 DEBUG : pacer: Rate limited, increasing sleep to 1.815613846s
2021/12/13 05:59:06 DEBUG : pacer: Reducing sleep to 0s
2021/12/13 05:59:09 DEBUG : two: md5 = 2379e4ce8c3380e996ab0509f17069ad OK
2021/12/13 05:59:09 INFO : two: Copied (new)
2021/12/13 05:59:13 DEBUG : dst: Rmdir: contains trashed file: "two"
2021/12/13 05:59:13 DEBUG : dst: Rmdir: contains trashed file: "one"
2021/12/13 05:59:14 DEBUG : CompareDest: Rmdir: contains trashed file: "two"
2021/12/13 05:59:14 DEBUG : CompareDest: Rmdir: contains trashed file: "one"
--- PASS: TestCopyFileCompareDest (40.34s)
=== RUN   TestCopyFileCopyDest
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 05:59:15 DEBUG : Creating backend with remote "TestDrive:rclone-test-volomal9xowupum7bugeyat4/dst"
2021/12/13 05:59:16 DEBUG : Creating backend with remote "TestDrive:rclone-test-volomal9xowupum7bugeyat4/CopyDest"
2021/12/13 05:59:18 DEBUG : one: Need to transfer - File not found at Destination
2021/12/13 05:59:21 DEBUG : one: md5 = f97c5d29941bfb1b2fdab0874906ab82 OK
2021/12/13 05:59:21 INFO : one: Copied (new)
2021/12/13 05:59:22 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:59:22 DEBUG : pacer: Rate limited, increasing sleep to 1.442426456s
2021/12/13 05:59:22 DEBUG : pacer: Reducing sleep to 0s
2021/12/13 05:59:22 DEBUG : one: Sizes differ (src 5 vs dst 3)
2021/12/13 05:59:24 DEBUG : one: md5 = 07912d142f5d63ee918b34796b5a2432 OK
2021/12/13 05:59:24 INFO : one: Copied (replaced existing)
2021/12/13 05:59:24 DEBUG : pacer: low level retry 1/1 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:59:24 DEBUG : pacer: Rate limited, increasing sleep to 1.271527394s
run.go:283: Retry Put of "dst/one" to Google drive root 'rclone-test-volomal9xowupum7bugeyat4': 1/10 (googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:59:27 DEBUG : pacer: Reducing sleep to 0s
2021/12/13 05:59:32 DEBUG : Creating backend with remote "TestDrive:rclone-test-volomal9xowupum7bugeyat4/BackupDir"
2021/12/13 05:59:33 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:59:33 DEBUG : pacer: Rate limited, increasing sleep to 1.423062626s
2021/12/13 05:59:33 DEBUG : pacer: Reducing sleep to 0s
2021/12/13 05:59:35 DEBUG : one: Size and modification time the same (differ by -456.789µs, within tolerance 1ms)
2021/12/13 05:59:35 DEBUG : one: Sizes differ (src 5 vs dst 3)
2021/12/13 05:59:38 INFO : one: Moved (server-side)
2021/12/13 05:59:39 DEBUG : one: md5 = 07912d142f5d63ee918b34796b5a2432 OK
2021/12/13 05:59:39 INFO : one: Copied (server-side copy)
2021/12/13 05:59:39 DEBUG : one: Destination found in --copy-dest, using server-side copy
2021/12/13 05:59:43 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 05:59:43 DEBUG : pacer: Rate limited, increasing sleep to 1.337862783s
2021/12/13 05:59:43 DEBUG : pacer: Reducing sleep to 0s
2021/12/13 05:59:43 DEBUG : two: Size and modification time the same (differ by -456.789µs, within tolerance 1ms)
2021/12/13 05:59:44 DEBUG : two: md5 = b8a9f715dbb64fd5c56e7783c6820a61 OK
2021/12/13 05:59:44 INFO : two: Copied (server-side copy)
2021/12/13 05:59:44 DEBUG : two: Destination found in --copy-dest, using server-side copy
2021/12/13 05:59:46 DEBUG : two: Size and modification time the same (differ by -456.789µs, within tolerance 1ms)
2021/12/13 05:59:46 DEBUG : two: Size and modification time the same (differ by -456.789µs, within tolerance 1ms)
2021/12/13 05:59:46 DEBUG : two: Unchanged skipping
2021/12/13 05:59:49 DEBUG : three: Sizes differ (src 7 vs dst 5)
2021/12/13 05:59:49 DEBUG : three: Destination not found in --copy-dest
2021/12/13 05:59:49 DEBUG : three: Need to transfer - File not found at Destination
2021/12/13 05:59:51 DEBUG : three: md5 = 1bccb9dccb3e9f6a3f9d2a8bdb54b7f5 OK
2021/12/13 05:59:51 INFO : three: Copied (new)
2021/12/13 05:59:57 DEBUG : dst: Rmdir: contains trashed file: "three"
2021/12/13 05:59:57 DEBUG : dst: Rmdir: contains trashed file: "one"
2021/12/13 05:59:57 DEBUG : dst: Rmdir: contains trashed file: "two"
2021/12/13 05:59:58 DEBUG : CopyDest: Rmdir: contains trashed file: "one"
2021/12/13 05:59:58 DEBUG : CopyDest: Rmdir: contains trashed file: "two"
2021/12/13 05:59:58 DEBUG : CopyDest: Rmdir: contains trashed file: "three"
2021/12/13 05:59:59 DEBUG : BackupDir: Rmdir: contains trashed file: "one"
--- PASS: TestCopyFileCopyDest (45.74s)
=== RUN   TestSameConfig
--- PASS: TestSameConfig (0.00s)
=== RUN   TestSame
--- PASS: TestSame (0.00s)
=== RUN   TestOverlapping
--- PASS: TestOverlapping (0.00s)
=== RUN   TestListFormat
--- PASS: TestListFormat (0.00s)
=== RUN   TestDirMove
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 06:00:13 DEBUG : A1/B2: Making directory
2021/12/13 06:00:14 DEBUG : A1/B1/C3: Making directory
2021/12/13 06:00:19 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 06:00:19 DEBUG : pacer: Rate limited, increasing sleep to 1.323852444s
2021/12/13 06:00:19 DEBUG : pacer: Reducing sleep to 0s
2021/12/13 06:00:27 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 06:00:27 DEBUG : pacer: Rate limited, increasing sleep to 1.705215056s
2021/12/13 06:00:27 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=847840889997, userRateLimitExceeded)
2021/12/13 06:00:27 DEBUG : pacer: Rate limited, increasing sleep to 2.204904272s
2021/12/13 06:00:29 DEBUG : pacer: Reducing sleep to 0s
2021/12/13 06:00:32 INFO : A2/one: Moved (server-side) to: A3/one
2021/12/13 06:00:32 INFO : A2/B1/C1/four: Moved (server-side) to: A3/B1/C1/four
2021/12/13 06:00:32 INFO : A2/B1/C2/five: Moved (server-side) to: A3/B1/C2/five
2021/12/13 06:00:32 INFO : A2/B1/three: Moved (server-side) to: A3/B1/three
2021/12/13 06:00:33 INFO : A2/two: Moved (server-side) to: A3/two
2021/12/13 06:00:37 DEBUG : A2/B1: Rmdir: contains trashed file: "C3"
2021/12/13 06:00:37 DEBUG : A2/B1: Rmdir: contains trashed file: "C2"
2021/12/13 06:00:37 DEBUG : A2/B1: Rmdir: contains trashed file: "C1"
2021/12/13 06:00:38 DEBUG : A2: Rmdir: contains trashed file: "B2"
2021/12/13 06:00:38 DEBUG : A2: Rmdir: contains trashed file: "B1"
2021/12/13 06:00:47 DEBUG : A3/B1/C2: Rmdir: contains trashed file: "five"
2021/12/13 06:00:48 DEBUG : A3/B1/C1: Rmdir: contains trashed file: "four"
2021/12/13 06:00:49 DEBUG : A3/B1: Rmdir: contains trashed file: "C3"
2021/12/13 06:00:49 DEBUG : A3/B1: Rmdir: contains trashed file: "C2"
2021/12/13 06:00:49 DEBUG : A3/B1: Rmdir: contains trashed file: "C1"
2021/12/13 06:00:49 DEBUG : A3/B1: Rmdir: contains trashed file: "three"
2021/12/13 06:00:50 DEBUG : A3: Rmdir: contains trashed file: "B2"
2021/12/13 06:00:50 DEBUG : A3: Rmdir: contains trashed file: "B1"
2021/12/13 06:00:50 DEBUG : A3: Rmdir: contains trashed file: "two"
2021/12/13 06:00:50 DEBUG : A3: Rmdir: contains trashed file: "one"
--- PASS: TestDirMove (50.04s)
=== RUN   TestGetFsInfo
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
--- PASS: TestGetFsInfo (0.61s)
=== RUN   TestRcat
=== RUN   TestRcat/withChecksum=false,ignoreChecksum=false
=== CONT  TestRcat
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 06:00:51 DEBUG : Google drive root 'rclone-test-volomal9xowupum7bugeyat4': File to upload is small (34 bytes), uploading instead of streaming
2021/12/13 06:00:53 DEBUG : no_checksum_small_file_from_pipe: md5 = 4c762bc6ec18f21b23fcae426c7863b5 OK
2021/12/13 06:00:53 INFO : no_checksum_small_file_from_pipe: Copied (new)
2021/12/13 06:00:54 DEBUG : no_checksum_big_file_from_pipe: Sending chunk 0 length 102401
2021/12/13 06:00:55 DEBUG : no_checksum_big_file_from_pipe: Size and modification time the same (differ by -456.789µs, within tolerance 1ms)
=== RUN   TestRcat/withChecksum=true,ignoreChecksum=false
=== CONT  TestRcat
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 06:00:57 DEBUG : Google drive root 'rclone-test-volomal9xowupum7bugeyat4': File to upload is small (34 bytes), uploading instead of streaming
2021/12/13 06:00:59 DEBUG : with_checksum_small_file_from_pipe: md5 = 4c762bc6ec18f21b23fcae426c7863b5 OK
2021/12/13 06:00:59 INFO : with_checksum_small_file_from_pipe: Copied (new)
2021/12/13 06:00:59 DEBUG : with_checksum_big_file_from_pipe: Sending chunk 0 length 102401
2021/12/13 06:01:01 DEBUG : with_checksum_big_file_from_pipe: md5 = fffc7956ba9a7b58a63c01b6ce1ddc45 OK
2021/12/13 06:01:01 DEBUG : with_checksum_big_file_from_pipe: Size and md5 of src and dst objects identical
=== RUN   TestRcat/withChecksum=false,ignoreChecksum=true
=== CONT  TestRcat
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 06:01:03 DEBUG : Google drive root 'rclone-test-volomal9xowupum7bugeyat4': File to upload is small (34 bytes), uploading instead of streaming
2021/12/13 06:01:05 INFO : ignore_checksum_small_file_from_pipe: Copied (new)
2021/12/13 06:01:05 DEBUG : ignore_checksum_big_file_from_pipe: Sending chunk 0 length 102401
2021/12/13 06:01:06 DEBUG : ignore_checksum_big_file_from_pipe: Size and modification time the same (differ by -456.789µs, within tolerance 1ms)
2021/12/13 06:01:07 DEBUG : TestDrive: Loaded invalid token from config file - ignoring
2021/12/13 06:01:07 DEBUG : Saving config "token" in section "TestDrive" of the config file
2021/12/13 06:01:07 DEBUG : TestDrive: Saved new token in config file
=== RUN   TestRcat/withChecksum=true,ignoreChecksum=true
=== CONT  TestRcat
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 06:01:09 DEBUG : Google drive root 'rclone-test-volomal9xowupum7bugeyat4': File to upload is small (34 bytes), uploading instead of streaming
2021/12/13 06:01:11 INFO : ignore_checksum_small_file_from_pipe: Copied (new)
2021/12/13 06:01:11 DEBUG : ignore_checksum_big_file_from_pipe: Sending chunk 0 length 102401
2021/12/13 06:01:12 DEBUG : ignore_checksum_big_file_from_pipe: Size of src and dst objects identical
--- PASS: TestRcat (23.39s)
--- PASS: TestRcat/withChecksum=false,ignoreChecksum=false (5.78s)
--- PASS: TestRcat/withChecksum=true,ignoreChecksum=false (6.06s)
--- PASS: TestRcat/withChecksum=false,ignoreChecksum=true (5.89s)
--- PASS: TestRcat/withChecksum=true,ignoreChecksum=true (5.65s)
=== RUN   TestRcatSize
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 06:01:16 DEBUG : Google drive root 'rclone-test-volomal9xowupum7bugeyat4': File to upload is small (60 bytes), uploading instead of streaming
2021/12/13 06:01:18 DEBUG : potato2: md5 = d6548b156ea68a4e003e786df99eee76 OK
2021/12/13 06:01:18 INFO : potato2: Copied (new)
--- PASS: TestRcatSize (5.09s)
=== RUN   TestCopyFileMaxTransfer
run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms"
2021/12/13 06:01:20 DEBUG : TestCopyFileMaxTransfer/file1: Need to transfer - File not found at Destination
2021/12/13 06:01:22 DEBUG : TestCopyFileMaxTransfer/file1: md5 = 0ef726ce9b1a7692357ff70dd321d595 OK
2021/12/13 06:01:22 INFO : TestCopyFileMaxTransfer/file1: Copied (new)
2021/12/13 06:01:23 DEBUG : TestCopyFileMaxTransfer/file2: Need
to transfer - File not found at Destination 2021/12/13 06:01:23 ERROR : TestCopyFileMaxTransfer/file2: Failed to copy: Post "https://www.googleapis.com/upload/drive/v3/files?alt=json&fields=id%2Cname%2Csize%2Cmd5Checksum%2Ctrashed%2CexplicitlyTrashed%2CmodifiedTime%2CcreatedTime%2CmimeType%2Cparents%2CwebViewLink%2CshortcutDetails%2CexportLinks&keepRevisionForever=false&prettyPrint=false&supportsAllDrives=true&uploadType=multipart": googleapi: Copy failed: Max transfer limit reached as set by --max-transfer operations_test.go:1592: Error Trace: operations_test.go:1592 Error: Should be true Test: TestCopyFileMaxTransfer Messages: Not fatal error: Post "https://www.googleapis.com/upload/drive/v3/files?alt=json&fields=id%2Cname%2Csize%2Cmd5Checksum%2Ctrashed%2CexplicitlyTrashed%2CmodifiedTime%2CcreatedTime%2CmimeType%2Cparents%2CwebViewLink%2CshortcutDetails%2CexportLinks&keepRevisionForever=false&prettyPrint=false&supportsAllDrives=true&uploadType=multipart": googleapi: Copy failed: Max transfer limit reached as set by --max-transfer: &fserrors.wrappedCountableError{error:(*url.Error)(0xc0009e0780), isCounted:true}: 2021/12/13 06:01:24 DEBUG : TestCopyFileMaxTransfer/file3: Need to transfer - File not found at Destination 2021/12/13 06:01:25 DEBUG : TestCopyFileMaxTransfer/file4: Need to transfer - File not found at Destination 2021/12/13 06:01:27 DEBUG : TestCopyFileMaxTransfer/file4: md5 = d01d179e01469fa7e8f67588aeffd848 OK 2021/12/13 06:01:27 INFO : TestCopyFileMaxTransfer/file4: Copied (new) 2021/12/13 06:01:29 DEBUG : TestCopyFileMaxTransfer: Rmdir: contains trashed file: "file4" 2021/12/13 06:01:29 DEBUG : TestCopyFileMaxTransfer: Rmdir: contains trashed file: "file1" --- FAIL: TestCopyFileMaxTransfer (10.62s) === RUN TestTouchDir run.go:181: Remote "Google drive root 'rclone-test-volomal9xowupum7bugeyat4'", Local "Local file system at /tmp/rclone884050330", Modify Window "1ms" 2021/12/13 06:01:37 DEBUG : Google drive root 'rclone-test-volomal9xowupum7bugeyat4': Touching "empty space" 2021/12/13 06:01:38 DEBUG : Google drive root 'rclone-test-volomal9xowupum7bugeyat4': Touching "potato2" 2021/12/13 06:01:38 DEBUG : Google drive root 'rclone-test-volomal9xowupum7bugeyat4': Touching "sub dir/potato3" 2021/12/13 06:01:42 DEBUG : sub dir: Rmdir: contains trashed file: "potato3" --- PASS: TestTouchDir (12.92s) === RUN TestRcAbout rc_test.go:25: Skipping test on non local remote --- SKIP: TestRcAbout (0.00s) === RUN TestRcCleanup rc_test.go:25: Skipping test on non local remote --- SKIP: TestRcCleanup (0.00s) === RUN TestRcCopyfile rc_test.go:25: Skipping test on non local remote --- SKIP: TestRcCopyfile (0.00s) === RUN TestRcCopyurl rc_test.go:25: Skipping test on non local remote --- SKIP: TestRcCopyurl (0.00s) === RUN TestRcDelete rc_test.go:25: Skipping test on non local remote --- SKIP: TestRcDelete (0.00s) === RUN TestRcDeletefile rc_test.go:25: Skipping test on non local remote --- SKIP: TestRcDeletefile (0.00s) === RUN TestRcList rc_test.go:25: Skipping test on non local remote --- SKIP: TestRcList (0.00s) === RUN TestRcStat rc_test.go:25: Skipping test on non local remote --- SKIP: TestRcStat (0.00s) === RUN TestRcMkdir rc_test.go:25: Skipping test on non local remote --- SKIP: TestRcMkdir (0.00s) === RUN TestRcMovefile rc_test.go:25: Skipping test on non local remote --- SKIP: TestRcMovefile (0.00s) === RUN TestRcPurge rc_test.go:25: Skipping test on non local remote --- SKIP: TestRcPurge (0.00s) === RUN TestRcRmdir rc_test.go:25: Skipping test on non local remote --- SKIP: 
TestRcRmdir (0.00s) === RUN TestRcRmdirs rc_test.go:25: Skipping test on non local remote --- SKIP: TestRcRmdirs (0.00s) === RUN TestRcSize rc_test.go:25: Skipping test on non local remote --- SKIP: TestRcSize (0.00s) === RUN TestRcPublicLink rc_test.go:25: Skipping test on non local remote --- SKIP: TestRcPublicLink (0.00s) === RUN TestRcFsInfo rc_test.go:25: Skipping test on non local remote --- SKIP: TestRcFsInfo (0.00s) === RUN TestUploadFile rc_test.go:25: Skipping test on non local remote --- SKIP: TestUploadFile (0.00s) === RUN TestRcCommand rc_test.go:25: Skipping test on non local remote --- SKIP: TestRcCommand (0.00s) FAIL 2021/12/13 06:01:43 DEBUG : Google drive root 'rclone-test-volomal9xowupum7bugeyat4': Purge remote "./operations.test -test.v -test.timeout 1h0m0s -remote TestDrive: -verbose" - Finished OK in 11m51.636500129s (try 1/5)