Welcome to ShenZhenJia Knowledge Sharing Community for programmer and developer-Open, Learning and Share

In one of my local repositories I have a custom setup for uploading LFS files to an AWS S3 bucket. My latest push attempt failed for some reason, but only after the large files had already been uploaded to the bucket. The problem is that I now have to try out some configurations and test the push operation, and each time I push the repo, the objects from the failed push are stored in the bucket again and again, consuming more and more space.

The question is: how can I get rid of the objects left behind by the failed pushes? I could remove the files manually, but I don't know how that would affect the repository.
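For context, an object in the bucket is only needed if some commit still references its OID; a key referenced by no commit is an orphan and can be deleted without affecting the repository. Below is a minimal sketch of that check as a set difference. The OID lists are hard-coded stand-ins for what `git lfs ls-files --all --long` (referenced objects) and `aws s3 ls s3://<bucket> --recursive` (stored keys) would return; the bucket name and OIDs are hypothetical.

```python
def find_orphans(referenced_oids, stored_keys):
    """Return bucket keys that no commit references (safe-to-delete candidates)."""
    return sorted(set(stored_keys) - set(referenced_oids))

# Stand-in data: in practice, collect the OIDs from
#   git lfs ls-files --all --long        (objects referenced by any ref)
# and the keys from
#   aws s3 ls s3://<bucket> --recursive  (objects actually stored).
referenced = ["aaa111", "bbb222"]
stored = ["aaa111", "bbb222", "ccc333"]  # "ccc333" left behind by a failed push

print(find_orphans(referenced, stored))  # ['ccc333']
```

Anything the function returns could then be removed with `aws s3 rm` (or left in place, since unreferenced objects are merely wasted space, not a corruption risk).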


He who fights with dragons too long becomes a dragon himself; gaze too long into the abyss, and the abyss gazes back…
1 Answer

Waiting for an expert to reply.
