Chunked Upload - "file in use by another process" Errors with larger files, using fallback Save() method results in incomplete corrupt files

Form Upload
Matt asked on 05 Jul 2022, 09:36 PM | edited on 05 Jul 2022, 09:41 PM

Using a controller based on the sample code and the chunked upload configuration, I find in local testing that I eventually hit an IO exception claiming another process is accessing the file when I upload larger files.

I suspect that each AppendToFile() call is somehow running into the previous AppendToFile() call before it has finished flushing to disk, but I have no proof.

In addition, when the fallback Save() call is used (if I forget to set a chunk size), the files turn out corrupt, only partially uploaded.

Antivirus is disabled for the relevant directories and my Git services are not running, so I'm not sure what else could be accessing the file. Is there a way to solve this?

I can try the vanilla version of the sample code if I must, but I feel there may be something else going on here regardless.
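One mitigation I've been toying with, though it is only an untested sketch and not part of the controller below, is to retry the append a few times when the file is briefly locked, on the theory that whatever holds the handle releases it quickly:

// Untested sketch: retry the append when the target file is transiently locked.
// Assumes the competing handle is released within roughly a second.
public void AppendToFileWithRetry(string fullPath, IFormFile content, int maxAttempts = 5)
{
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            using FileStream stream = new(fullPath, FileMode.Append, FileAccess.Write, FileShare.ReadWrite);
            content.CopyTo(stream);
            return;
        }
        catch (IOException) when (attempt < maxAttempts)
        {
            // Sharing violation; back off briefly and try again.
            System.Threading.Thread.Sleep(200 * attempt);
        }
    }
}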

Thank you!

My Controller:


using Csla;
using Finbuckle.MultiTenant;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.StaticFiles;
using Newtonsoft.Json;
using Portal.Library;
using System.Net;
using System.Net.Http.Headers;
using System.Security.Claims;
using System.Text;

namespace WebUI.Controllers
{
    public class FileStore : Csla.Web.Mvc.Controller
    {
        private string UploadBasePath = "wwwroot\\uploads";
        private string UploadStoragePath = "wwwroot\\storage";

        public WebUI.TenantInfo? TenantInfo { get; private set; }

        public FileStore(IMultiTenantContextAccessor<WebUI.TenantInfo> mtaccessor)
        {
            TenantInfo = mtaccessor.MultiTenantContext.TenantInfo;
        }

        public class ChunkMetaData
        {
            public string UploadUid { get; set; }
            public string FileName { get; set; }
            public string RelativePath { get; set; }
            public string ContentType { get; set; }
            public long ChunkIndex { get; set; }
            public long TotalChunks { get; set; }
            public long TotalFileSize { get; set; }
        }

        public class FileResult
        {
            // Because the chunks are sent in a specific order,
            // the server is expected to send back a response
            // with the meta data of the chunk that is uploaded.
            public bool uploaded { get; set; }

            public string fileUid { get; set; }

            public string fileStorageGuid { get; set; } // internal GUID in DB per file not per session

            public string selector { get; set; } // because the UID isn't used or sent for the fallback Save()
        }

        public void AppendToFile(string fullPath, IFormFile content)
        {
            try
            {
                using (FileStream stream = new(fullPath, FileMode.Append, FileAccess.Write, FileShare.ReadWrite))
                {
                    content.CopyTo(stream);
                }
            }
            catch (IOException)
            {
                // Rethrow with "throw;" rather than "throw ex;" to preserve the original stack trace.
                throw;
            }
        }

        public ActionResult ChunkSave(IEnumerable<IFormFile> files, string metaData)
        {
            if (metaData == null)
            {
                return Save(files);
            }

            MemoryStream ms = new(Encoding.UTF8.GetBytes(metaData));

            JsonSerializer serializer = new();
            ChunkMetaData chunkData;
            using (StreamReader streamReader = new(ms))
            {
                chunkData = (ChunkMetaData)serializer.Deserialize(streamReader, typeof(ChunkMetaData));
            }

            string path = Path.Combine(UploadBasePath, chunkData.FileName);
            // The Name of the Upload component is "files".
            if (files != null)
            {
                foreach (var file in files)
                {
                    AppendToFile(path, file);
                }
            }

            var isUploaded = chunkData.TotalChunks - 1 <= chunkData.ChunkIndex;
            FileResult fileBlob = new()
            {
                uploaded = isUploaded,
                fileUid = chunkData.UploadUid
            };

            if (isUploaded)
            {
                var bytes = Encoding.UTF8.GetBytes(chunkData.FileName);

                fileBlob.fileStorageGuid = CatalogAndMoveFileAsync(chunkData.FileName, path).ToString();
                fileBlob.selector = Convert.ToBase64String(bytes);
            }

            return Json(fileBlob);
        }

        private Guid CatalogAndMoveFileAsync(string FileName, string FilePath)
        {
            //sanity
            if (System.IO.File.Exists(FilePath))
            {
                // step 1 - generate guid in files table by inserting the requisite information
                var _user = Csla.ApplicationContext.User as ClaimsPrincipal;
                int userID = int.Parse(_user.Claims.FirstOrDefault(S => S.Type == "UserID").Value.ToString());

                var finfo = new FileInfo(FilePath);

                var obj = DataPortal.Create<FileStorage>();
                obj.UserID = userID;
                obj.Extension = Path.GetExtension(FileName);
                obj.OrigName = FileName;
                obj.KBSize = finfo.Length / 1024;
                obj.SiteID = TenantInfo.SiteID;

                // TODO: revisit and see if SaveObjectAsync updates our object v6
                // use the object's own save function to update the object
                obj = obj.Save();

                // do we have to re-fetch or is the object guid updated automatically?
                if (obj.ID != Guid.Empty)
                {
                    // step 2 - determine new name, location, move to it
                    string StorageSubfolders = Path.Combine(UploadStoragePath, obj.RelPath);

                    try
                    {
                        System.IO.Directory.CreateDirectory(StorageSubfolders); // so glad this func does it in one go

                        var newFilePath = Path.Combine(StorageSubfolders, obj.InternalName);
                        System.IO.File.Move(FilePath, newFilePath);

                        if (System.IO.File.Exists(newFilePath))
                        {
                            // step 3 - return the guid
                            return obj.ID;
                        }
                    }
                    catch (Exception ex)
                    {
                        if (obj.ID != Guid.Empty)
                        {
                            DataPortal.Delete<FileStorage>(obj.ID, userID);
                        }

                        throw new FileNotFoundException("FileStorage moving file", ex);
                    }
                }
            }

            return Guid.Empty;
        }

        public ActionResult RemoveAsync(string[] fileNames, string[] FileStorageUIDs)
        {
            // we don't actually care about fileNames, telerik always sends *something* related to that though
            // we rely on our file UID(s) that were sent back and hopefully captured in the onSuccess event

            // The parameter of the Remove action must be called "fileNames"
            if (FileStorageUIDs.Length > 0)
            {
                foreach (var fUID in FileStorageUIDs)
                {
                    // get the current UID to be 100% sure we're deleting only our own file
                    var _user = Csla.ApplicationContext.User as ClaimsPrincipal;
                    int userID = int.Parse(_user.Claims.FirstOrDefault(S => S.Type == "UserID").Value.ToString());

                    var obj = DataPortal.Fetch<FileStorage>(new Guid(fUID), userID);

                    if (obj.ID != Guid.Empty)
                    {
                        var basePath = Path.Combine(UploadStoragePath, obj.RelPath);
                        var physicalPath = Path.Combine(basePath, obj.InternalName);
                        if (System.IO.File.Exists(physicalPath))
                        {
                            System.IO.File.Delete(physicalPath);

                            // TODO: maybe try/catch?
                            if (Directory.EnumerateFiles(basePath).Any() == false)
                            {
                                Directory.Delete(basePath, true);
                            }

                            // we already fetched with UID
                            DataPortal.Delete<FileStorage>(obj.ID, userID);
                        }
                    }
                }
            }

            // Return an empty string to signify success
            return Content("");
        }

        // NOTE: files came out corrupt whenever we fell back to this method; the unawaited CopyToAsync call below was the likely cause
        public ActionResult Save(IEnumerable<IFormFile> files)
        {
            List<string> fileNames = new();
            List<FileResult> results = new();
            
            // The Name of the Upload component is "files".
            if (files != null)
            {
                foreach (var file in files)
                {
                    var fileContent = ContentDispositionHeaderValue.Parse(file.ContentDisposition);

                    // Some browsers send file names with full path.
                    // The demo is interested only in the file name.
                    var fileName = Path.GetFileName(fileContent.FileName.ToString().Trim('"'));
                    var physicalPath = Path.Combine(UploadBasePath, fileName);

                    // Copy synchronously. The original code called CopyToAsync without awaiting it,
                    // so the FileStream could be disposed before the copy finished, truncating the file.
                    using var fileStream = new FileStream(physicalPath, FileMode.Create);
                    file.CopyTo(fileStream);

                    fileNames.Add(fileName);
                }

                // since we can't delete while in the scope of the loop there, we have to do a second one
                foreach (var fileName in fileNames)
                {
                    var physicalPath = Path.Combine(UploadBasePath, fileName);

                    var fileStorageGuid = CatalogAndMoveFileAsync(fileName, physicalPath);
                    string guidString = fileStorageGuid.ToString();

                    var bytes = Encoding.UTF8.GetBytes(fileName);

                    results.Add(new FileResult()
                    {
                        uploaded = true,
                        fileStorageGuid = guidString,
                        selector = Convert.ToBase64String(bytes)
                    });
                }

                return Json(results);
            }

            return Content("");
        }

        public async Task GetFile(string FileStorageUID)
        {
            var _user = Csla.ApplicationContext.User as ClaimsPrincipal;
            int userID = int.Parse(_user.Claims.FirstOrDefault(S => S.Type == "UserID").Value.ToString());

            // TODO: maybe add a func to the user dal to fetch hierarchy?
            // TODO: check role here, if UID doesn't match, don't pass userID

            var obj = await DataPortal.FetchAsync<FileStorage>(new Guid(FileStorageUID), 0); // uid is second param if needed

            if (obj.ID != Guid.Empty)
            {
                var basePath = Path.Combine(UploadStoragePath, obj.RelPath);
                var physicalPath = Path.Combine(basePath, obj.InternalName);
                if (System.IO.File.Exists(physicalPath))
                {
                    var provider = new FileExtensionContentTypeProvider();
                    if (!provider.TryGetContentType(physicalPath, out string contentType))
                    {
                        contentType = "application/octet-stream";
                    }

                    Response.Clear();
                    Response.Headers.Add("Content-Disposition", "inline;filename=" + obj.OrigName);
                    Response.ContentType = contentType;

                    await Response.SendFileAsync(physicalPath);
                }
            }
        }
    }
}

My View:


                        @(Html.Kendo().Upload()
                            .Name("files")
                            .Async(a => a
                                .Save("ChunkSave", "FileStore")
                                .Remove("Remove", "FileStore")
                                .AutoUpload(true)
                                .ChunkSize(10240) 
                            )
                            .Validation(validation => validation
                                //.AllowedExtensions(new string[] { ".gif", ".jpg", ".png" })
                                .MaxFileSize(200000000) // 200 megs
                               //.MinFileSize(512)
                            )
                            .Events(events => events
                                .Success("onSuccess")
                                .Remove("onRemove")
                                .Upload("onUpload")
                            ).Multiple(false) // one file and one file only
                        ) 
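As an aside (not something I have changed yet), the chunk size value is in bytes, so 10240 means 10 KB per request and a 48 MB file becomes roughly 4,800 append calls. A larger chunk, for example 1 MB, would cut that down considerably:

                            .Async(a => a
                                .Save("ChunkSave", "FileStore")
                                .Remove("Remove", "FileStore")
                                .AutoUpload(true)
                                .ChunkSize(1048576) // 1 MB per chunk instead of 10 KB
                            )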

Mihaela
Telerik team
commented on 08 Jul 2022, 02:07 PM

Thank you for the provided code snippets.

I have created a demo project to replicate the issue, and it appears that the file uploads successfully at my end (I have tested it with a 6MB file). It is attached for your reference.

It would be of great help if you could modify the application based on the behavior you are experiencing to pinpoint what is causing it.

Thank you for your cooperation.

Matt
commented on 08 Jul 2022, 04:33 PM

Thank you for taking the time to look at this. I opened a help ticket yesterday as well (1571835) and failed to disclose that I'm using .NET Core, which complicates things slightly.

That said, even when running the solution you created and tested without problems, I still get:

"System.IO.IOException: 'The process cannot access the file 'C:\Users\matt\Downloads\TelerikAspNetCoreApp57\TelerikAspNetCoreApp57\TelerikAspNetCoreApp57\wwwroot\uploads\Git-2.36.0-64-bit.exe' because it is being used by another process.'"

Matt
commented on 08 Jul 2022, 04:35 PM | edited

I was uploading the Git installer because it's around 48 MB, but the error occurs with smaller files as well...

Not sure if I have some random process running on my machine causing this or what.

Matt
commented on 08 Jul 2022, 04:43 PM

That is, my solution is using .NET 6, which I believe is the point at which Core and non-Core merged?
Matt
commented on 08 Jul 2022, 04:53 PM

For whatever reason, a more vanilla version of the upload controller, shared in a ticket by one of your reps, appears to work for me; I'll attach it:


using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using System;
using System.Collections.Generic;
using System.IO;
using Newtonsoft.Json;
using System.Text;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Hosting;

namespace Core_Uploader_Chunks.Controllers { 
    public class UploadController : Controller
    {
        private IWebHostEnvironment WebHostEnvironment;

        public UploadController(IWebHostEnvironment _environment)
        {
            WebHostEnvironment = _environment;
        }
        public class ChunkMetaData
        {
            public string UploadUid { get; set; }
            public string FileName { get; set; }
            public string RelativePath { get; set; }
            public string ContentType { get; set; }
            public long ChunkIndex { get; set; }
            public long TotalChunks { get; set; }
            public long TotalFileSize { get; set; }
        }

        public class FileResult
        {
            public bool uploaded { get; set; }
            public string fileUid { get; set; }
        }
        public void AppendToFile(string fullPath, IFormFile content)
        {
            try
            {
                using (FileStream stream = new FileStream(fullPath, FileMode.Append, FileAccess.Write, FileShare.ReadWrite))
                {
                    content.CopyTo(stream);
                }
            }
            catch (IOException ex)
            {
                throw ex;
            }
        }

        public async Task<ActionResult> Chunk_Upload_Save(IEnumerable<IFormFile> files, string metaData)
        {
            if (metaData == null)
            {
                return await Save(files);
            }

            MemoryStream ms = new MemoryStream(Encoding.UTF8.GetBytes(metaData));

            JsonSerializer serializer = new JsonSerializer();
            ChunkMetaData chunkData;
            using (StreamReader streamReader = new StreamReader(ms))
            {
                chunkData = (ChunkMetaData)serializer.Deserialize(streamReader, typeof(ChunkMetaData));
            }

            string path = String.Empty;
            // The Name of the Upload component is "files"
            if (files != null)
            {
                foreach (var file in files)
                {
                    path = Path.Combine(WebHostEnvironment.WebRootPath, "App_Data", chunkData.FileName);

                    //AppendToFile(path, file);
                }
            }

            FileResult fileBlob = new FileResult();
            fileBlob.uploaded = chunkData.TotalChunks - 1 <= chunkData.ChunkIndex;
            fileBlob.fileUid = chunkData.UploadUid;

            return Json(fileBlob);
        }

        public ActionResult Chunk_Upload_Remove(string[] fileNames)
        {
            // The parameter of the Remove action must be called "fileNames"

            if (fileNames != null)
            {
                foreach (var fullName in fileNames)
                {
                    var fileName = Path.GetFileName(fullName);
                    var physicalPath = Path.Combine(WebHostEnvironment.WebRootPath, "App_Data", fileName);

                    // TODO: Verify user permissions

                    if (System.IO.File.Exists(physicalPath))
                    {
                        // The files are not actually removed in this demo
                        // System.IO.File.Delete(physicalPath);
                    }
                }
            }

            // Return an empty string to signify success
            return Content("");
        }

        public async Task<ActionResult> Save(IEnumerable<IFormFile> files)
        {
            // The Name of the Upload component is "files"
            if (files != null)
            {
                foreach (var file in files)
                {
                    var fileContent = ContentDispositionHeaderValue.Parse(file.ContentDisposition);

                    // Some browsers send file names with full path.
                    // We are only interested in the file name.
                    var fileName = Path.GetFileName(fileContent.FileName.ToString().Trim('"'));
                    var physicalPath = Path.Combine(WebHostEnvironment.WebRootPath, "App_Data", fileName);

                    // The files are not actually saved in this demo
                    //using (var fileStream = new FileStream(physicalPath, FileMode.Create))
                    //{
                    //    await file.CopyToAsync(fileStream);
                    //}
                }
            }

            // Return an empty string to signify success
            return Content("");
        }
    }
}

Matt
commented on 08 Jul 2022, 05:49 PM

I take that back; that example is a bad comparison because the Telerik rep on my ticket commented out the AppendToFile call, negating the whole point of my question.
Matt
commented on 08 Jul 2022, 05:59 PM

After changing the file location to save outside the user directory, it seems to work. Security context, perhaps?
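Roughly, the kind of change I mean in the controller's constructor (a sketch only; the folder name and the extra IWebHostEnvironment parameter are illustrative, not my final code):

// Sketch: resolve the upload folder from the hosting environment instead of a
// relative "wwwroot\\uploads" string, and make sure it exists before any append.
// Requires IWebHostEnvironment (Microsoft.AspNetCore.Hosting).
public FileStore(IMultiTenantContextAccessor<WebUI.TenantInfo> mtaccessor, IWebHostEnvironment env)
{
    TenantInfo = mtaccessor.MultiTenantContext.TenantInfo;
    UploadBasePath = Path.Combine(env.ContentRootPath, "uploads"); // outside wwwroot
    Directory.CreateDirectory(UploadBasePath); // no-op if the folder already exists
}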
Stoyan
Telerik team
commented on 13 Jul 2022, 11:09 AM

Hello Matt,

I noticed this thread is a duplicate of the private ticket thread you have opened. For this reason I will copy my answer from there so it is available to the community.

Without a runnable sample that showcases the behavior you are experiencing, I suspect the most likely cause of the error is the path configuration: the path string refers to a directory that doesn't exist or that is not configured in Startup.cs/Program.cs. It is also possible that the reason is security related.

That being said, the Upload component doesn't impose specific requirements on the controller that saves the files; the controller can be customized according to your needs.

For more information on the topic, I recommend Microsoft's "Upload files" documentation article, or our Professional Services, who can help you achieve custom scenarios that fall outside our support scope.

If more questions related to the configuration of the Telerik UI Components arise, please consider continuing the discussion here to avoid additional overhead. Thank you.
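For context, the startup-side path configuration referred to above can be as small as creating the upload directory when the app starts; the snippet below is an illustrative sketch (not code from the ticket), with the folder name chosen arbitrarily:

// Illustrative Program.cs sketch: ensure the upload directory exists at startup
// so the save path can never point at a missing folder. Service and middleware
// registration is omitted for brevity.
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

var uploadPath = Path.Combine(app.Environment.ContentRootPath, "uploads"); // example folder name
Directory.CreateDirectory(uploadPath);

app.Run();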
