Delete all files within folder #422
Is there a way to delete all files within a particular folder?

I'm running a cron on my server that creates a backup every 24 hours, with the file that is created being synced with Google Drive. I wish to delete the backups after a few days, however, and am unsure of the command required to delete all files within my allocated backup folder. I've found a way that would allow me to delete the folder, but I don't want that... just the files located within.
Comments
I'm still no closer to resolving this, unfortunately. Having used the help facility, I know I can use a command such as the first one below, where -r ensures files are deleted recursively, but my goal isn't to delete the folder, just the contents within it. A command such as the second one below will list the files in a particular folder, but is there something similar that would allow me to delete those files while keeping the folder intact?
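Something along these lines (with `<folderId>` standing in for the real folder ID):

```
# Deletes the folder and everything inside it - not what I want
gdrive delete -r <folderId>

# Lists the files inside the folder
gdrive list --query "'<folderId>' in parents"
```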
What you want can be achieved through a batch request to the API, but as of v2.1.0 gdrive doesn't support batch queries yet. A similar issue is mentioned in #381. Until a fix is introduced in a future version, you can work around it in two alternative ways, shown below.
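Sketches of the two options (`<folderId>` is a placeholder, and the folder name passed to mkdir is just an example):

1. Delete the whole folder recursively and recreate it. Note that the recreated folder gets a new ID:

```
gdrive delete -r <folderId>
gdrive mkdir backups
```

2. List the files inside the folder and delete them one by one:

```
gdrive list --query "'<folderId>' in parents" --no-header | cut -d" " -f1 | xargs -L 1 gdrive delete
```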
What the second one does is basically list all files within the folder, pull the file IDs out of the listing with cut, and execute gdrive delete on each of them.
Solution number 2 is along the lines of what I'm looking for and seems to work. It is preferable because solution number 1 would involve a different folder ID on each occasion, which only complicates things further, as I intend to run this script from cron. One further question, however: would it be possible to remove only the files that are older than, say, 24 hours? I was thinking of something along the lines of the query below, but it returns an Error 400 message, so it clearly doesn't work.
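Presumably the failing attempt looked something like this (a hypothetical reconstruction), with a relative date the Drive API doesn't accept:

```
# Returns Error 400: the Drive API doesn't understand relative dates like this
gdrive list --query "modifiedTime < 'yesterday'"
```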
@ChrisMann89 The Drive API expects dates in queries to be in ISO 8601 format; for that you need to use something like the following.
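A sketch assuming GNU date (the exact command may have differed):

```
gdrive list --query "modifiedTime < '$(date -d yesterday +%Y-%m-%dT%H:%M:%S)'"
```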
(Notice the single quotes and the sub-shell call to date in the query.) Before I finish, I want to point out two things about this kind of query: gdrive requests fail intermittently, so the call is worth wrapping in a retry helper, and once the whole command is passed around as a string, the quotes inside the query need escaping.
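A minimal sketch of such a retry helper (the function name and details are assumptions; the behaviour matches what is described below):

```
# Run the gdrive command given as a string; while it reports "Failed",
# wait 5 seconds and try again.
retry_gdrive() {
    local out
    out=$(eval "$1" 2>&1)
    while [[ $out == Failed* ]]; do
        sleep 5
        out=$(eval "$1" 2>&1)
    done
    printf '%s\n' "$out"
}
```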
Your query would then become:
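Along these lines, using the retry_gdrive sketch from above:

```
retry_gdrive "gdrive list --query \"modifiedTime < '$(date -d yesterday +%Y-%m-%dT%H:%M:%S)'\" --no-header"
```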
(Notice the extra quotes with backslashes.) In this form, using the helper above, if your query gets a "Failed" message from gdrive, it will try again in 5 seconds. (And if you check your logs, you'll realise you are getting "Failed" messages quite often.)
Fantastic! Problem solved. Thank you for your help, @mbenlioglu; I now consider this issue closed. To summarise: I have a cron job that creates a backup of files on my server each night and sends a .tar.gz over to Google Drive. Over time these build up, so this new script, added via another line in the cron list, will automatically remove any backups older than one week and free up space in Google Drive. To do this, I have changed 'yesterday' to 'last-week' in the script, meaning it now looks as follows:
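A sketch of that cron line, assuming the retry_gdrive helper from above, with `<folderId>` standing in for the backup folder's ID:

```
retry_gdrive "gdrive list --query \"modifiedTime < '$(date -d last-week +%Y-%m-%dT%H:%M:%S)' and '<folderId>' in parents\" --no-header" | cut -d" " -f1 | xargs -L 1 gdrive delete
```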
Glad to be of help
I would like to delete all files older than 1 month. It shows this error:

usage: date [-jnRu] [-d dst] [-r seconds] [-t west] [-v[+|-]val[ymwdHMS]] ...
            [-f fmt date | [[[mm]dd]HH]MM[[cc]yy][.ss]] [+format]
Failed to list files: googleapi: Error 400: Invalid Value, invalid

Please help me.
I'm testing on a Mac!
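That usage message is BSD date on macOS rejecting the GNU-style -d flag; BSD date adjusts dates with -v instead. A sketch of the equivalent query on macOS (minus one month):

```
gdrive list --query "modifiedTime < '$(date -v-1m +%Y-%m-%dT%H:%M:%S)'"
```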
Deleting files with