Sync / Clear bucket before uploading? #74
Comments
Honestly, when releasing for prod you usually want to use a hash at the end of your filenames to cache-bust. I'm fairly certain this will overwrite the old files, though.
Right, I am actually using hashing to invalidate caches, and that's exactly the issue. Since I'm generating unique filenames each time, the builds just keep adding more and more files to my S3 bucket. I can of course go through and manually clear them out from time to time and rebuild, but I'm curious whether there's an option to make it more automated.
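For context, the hashed-filename setup being discussed here typically comes from webpack's output configuration. A minimal sketch (webpack 4+ syntax with `[contenthash]`; older versions used `[chunkhash]` — check your webpack version):

```javascript
// webpack.config.js (sketch) — content-hashed filenames for cache busting.
// Each build emits e.g. main.3b7c1a9e.js; a new hash means a new S3 key,
// so old builds accumulate in the bucket unless something prunes them.
const path = require('path');

module.exports = {
  output: {
    filename: '[name].[contenthash].js',
    path: path.resolve(__dirname, 'dist'),
  },
};
```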
Sadly no, though I would totally accept a PR with one, since I could see this being useful! I can of course add it later on too.
Cool! I'll look into it. I know someone has a sync plugin out there.
@andrewmartin Out of curiosity, did you ever come up with a solution? I'm dealing with this exact same issue.
@wootencl I got sidetracked and ended up leaving it for now. Would love to hear if you or anyone else came up with a solution.
Does anyone have a solution for this issue?
Nothing yet, PRs welcome!
A napkin note for whoever decides to roll up their sleeves on this: to achieve zero downtime, deletion of the old/obsolete S3 objects needs to happen after the upload of the new ones, not before.
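The ordering above can be sketched with a small helper that decides what to delete only once the new build is fully uploaded (the function and variable names here are hypothetical illustrations, not part of the plugin):

```javascript
// Zero-downtime cleanup sketch: upload the new assets first, then delete
// only the keys the new build did NOT produce. Deleting before uploading
// would briefly 404 assets still referenced by the live HTML.
function staleKeys(existingKeys, uploadedKeys) {
  const keep = new Set(uploadedKeys);
  return existingKeys.filter((key) => !keep.has(key));
}

// Intended flow (actual S3 list/delete calls elided):
//   1. upload all newly hashed assets
//   2. const doomed = staleKeys(allBucketKeys, uploadedKeys);
//   3. batch-delete `doomed`
```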
Made an attempt at this here, and it seems to work just fine.
Spot on. We are using CircleCI, so this is dockerized and just runs after the deployment completes to ensure no downtime. Let me know if you are interested, and I can move this code inside the plugin behind a config option (defaulting to false).
@dejanvasic85 I would looooove to have this feature in the plugin!
@dejanvasic85 A PR is welcome!
2 years later, still no PR? |
Any news on this? |
This would be nice! |
Maybe a good opportunity to do some OSS work 😉 |
Hey there, I came across needing to use this tool again. I've got a branch that caters for a sync option, but when I push the branch I get access denied, @MikaAK.
I have a really quick question: is there an option to `sync` the assets up, e.g. clean the S3 bucket before I upload? Just curious how you might handle asset revving to clear out the old assets and keep the bucket clean. Sorry if I missed something obvious, and thanks again for your wonderful work on this. Using it in production with great success! Cheers.
Please complete these steps and check these boxes (by putting an `x` inside the brackets) before filing your issue:

- [ ] … version) I am using.
- [ ] … my issue.
- [ ] … that any help I receive is done on free time. I know I am not entitled to anything and will be polite and courteous.
- [ ] … actually perform all of these steps or the issue is not with the library itself.

Thank you for adhering to this process! This ensures that I can pay attention to issues that are relevant and answer questions faster.