I have an upload system in my .NET Core site that lets users upload as many files as they want at a time. The files are uploaded directly to an S3 bucket and then processed. The problem is that when uploading, let's say, 1,000 files, the browser does not handle that many connections well, and files routinely fail to upload. Even with retries enabled, the retries tend to fail because the browser only allows a limited number of concurrent connections.
What I am looking to do is place the files into a queue and allow only 20 files to actually be uploading at any given time (think of how FileZilla queues items to upload). When a file completes, the next file in the queue starts, until the queue is exhausted. I already have `AutoUpload` set to `false` and can place the files into an array to process, but the `uploadSelectEvent.sender.upload()` method starts the upload for everything at once.
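For reference, the behavior I'm after looks roughly like the promise pool below. This is a framework-agnostic sketch, not the widget's API: `uploadFn` stands in for whatever actually sends one file to S3, and `maxConcurrent` is the cap (20 in my case).

```javascript
// Sketch of a concurrency-limited upload queue.
// `uploadFn` is a placeholder for a function that uploads one file
// and returns a Promise; it is NOT part of the upload widget's API.
async function processQueue(files, uploadFn, maxConcurrent = 20) {
  const results = [];
  let next = 0; // index of the next file to pull from the queue

  // Each worker repeatedly pulls the next file until none remain.
  async function worker() {
    while (next < files.length) {
      const i = next++;
      results[i] = await uploadFn(files[i]);
    }
  }

  // Start at most maxConcurrent workers; as each upload finishes,
  // its worker immediately picks up the next queued file.
  const workers = Array.from(
    { length: Math.min(maxConcurrent, files.length) },
    () => worker()
  );
  await Promise.all(workers);
  return results;
}
```

So the question is really how to wire the widget's per-file upload mechanism into something like `uploadFn` above, rather than firing everything at once.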
Is there a way to pause all the uploads before enabling them, so I can then resume them as needed? Or is there a better way to handle this?