This is a migrated thread and some comments may be shown as answers.

Memory leaks. OutOfMemoryException

5 Answers 132 Views
General Discussions
This question is locked. New answers and comments are not allowed.
Anton Sharov asked on 06 May 2010, 07:42 AM
Hello everybody!

I've got a question about memory leaks.
I've got the following code:
[OperationBehavior(ReleaseInstanceMode = ReleaseInstanceMode.AfterCall)]
public void AddDataFile(DataFileMessage userDataFile)
{
    Console.WriteLine("Allocated memory before is = {0}M", GC.GetTotalMemory(false) / 1000000);

    if (userDataFile == null) throw new ArgumentNullException("userDataFile");
    if (userDataFile.FileToStore == null) throw new ArgumentNullException("stream is null");

    IObjectScope scope = ArchivariusModelProvider.GetNewObjectScope();

    if (scope != null)
    {
        var query = from appgroup in scope.Extent<ApplicationGroup>()
                    where appgroup.GroupId == userDataFile.GroupId
                    select appgroup;

        Debug.Assert(query.Count() == 1);
        DF dFile = new DF
                       {
                           FileName = userDataFile.FileName,
                           DataFiles = userDataFile.FileToStore.ReadFullyStream(),
                           ApplicationGroup = query.First()
                       };
        scope.Transaction.Begin();
        scope.Add(dFile);
        scope.Transaction.Commit();
        scope.Dispose();
    }
}
 

The DF class (an alias for DataFile) is the following:
    public partial class DataFile 
    { 
         
 
        //The 'no-args' constructor required by OpenAccess.  
        public DataFile() 
        { 
            DataFilesId = Guid.NewGuid(); 
        } 
 
        [Telerik.OpenAccess.FieldAlias("dataFilesId")] 
        public Guid DataFilesId 
        { 
            get { return dataFilesId; } 
            set { dataFilesId = value; } 
        } 
        
        [Telerik.OpenAccess.FieldAlias("dataFiles")] 
        public byte[] DataFiles 
        { 
            get { return dataFiles; } 
            set { this.dataFiles = value; } 
        } 
 
        [Telerik.OpenAccess.FieldAlias("groupId")] 
        public int GroupId 
        { 
            get { return groupId; } 
            set { this.groupId = value; } 
        } 
 
        [Telerik.OpenAccess.FieldAlias("applicationGroup")] 
        public ApplicationGroup ApplicationGroup 
        { 
            get { return applicationGroup; } 
            set { this.applicationGroup = value; } 
        } 
        [Telerik.OpenAccess.FieldAlias("fileName")] 
        public string FileName 
        { 
            get { return fileName; } 
            set { fileName = value; } 
        } 
         
    } 
ReadFullyStream is the following:
public static byte[] ReadFullyStream(this Stream stream)
{
    if (stream == null) throw new ArgumentNullException("stream");
    byte[] buffer = new byte[32768];
    using (MemoryStream ms = new MemoryStream())
    {
        while (true)
        {
            int read = stream.Read(buffer, 0, buffer.Length);
            if (read <= 0)
            {
                return ms.ToArray();
            }
            ms.Write(buffer, 0, read);
        }
    }
}

After trying to upload a 100 MB file several times, I get an OutOfMemoryException. I profiled with CLRProfiler and it seems that OA is pooling something which holds a reference to this huge byte[] and prevents the GC from collecting it. What am I doing wrong?
If needed I will provide more info.

Thanks in advance.

5 Answers, 1 is accepted

PetarP
Telerik team
answered on 11 May 2010, 04:20 PM
Hello Anton Sharov,

The problem does not originate in OpenAccess itself. If you strip out all the ORM code and just use your read extension method, you will still get an out of memory exception. To avoid the exception you should consider another way of reading large files. I believe you will find the following topic useful.
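A minimal sketch of the chunked approach Petar suggests (this example is an assumption, not part of the original reply, and the names StreamingUpload and SaveToFile are illustrative): copy the incoming stream to disk in fixed-size pieces, so only one small buffer is ever held in managed memory, regardless of file size.

```csharp
using System;
using System.IO;

static class StreamingUpload
{
    // Copies the input stream to disk in 32 KB chunks. Peak managed
    // memory stays at one buffer no matter how large the file is.
    public static long SaveToFile(Stream input, string path)
    {
        if (input == null) throw new ArgumentNullException("input");

        var buffer = new byte[32768];
        long total = 0;
        using (var output = new FileStream(path, FileMode.Create, FileAccess.Write))
        {
            int read;
            while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
            {
                output.Write(buffer, 0, read);
                total += read;
            }
        }
        return total;
    }
}
```

Storing only a file path (or a streamed BLOB) in the database, rather than a fully materialized byte[], sidesteps the large-object-heap pressure described in the question.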

All the best,
Petar
the Telerik team

Do you want to have your say when we set our development plans? Do you want to know when a feature you care about is added or when a bug is fixed? Explore the Telerik Public Issue Tracking system and vote to affect the priority of the items.
Anton Sharov
answered on 12 May 2010, 06:40 AM
Hello, Petar!

Thank you for reply.

I tried it without OA and it seems all right. Of course, if I transform big files of 1 GB or greater I will get an OutOfMemoryException (because of the transform to byte[]), but when I upload a 100 MB file ten times using OA I get the same exception, because memory keeps increasing in that case. This is not the desired behavior. I profiled and saw some OA objects holding references to the byte[].
So this seems strange to me. I suspect I have some settings wrong.

PetarP
Telerik team
answered on 27 May 2010, 04:46 PM
Hi Anton Sharov,

We are still unable to reproduce the problem on our side. If your project allows it, it would be perfect if you could send us a sample project that reproduces the issue. If that is possible, please let us know and we will convert the thread to one that allows attachments.

Sincerely yours,
Petar
the Telerik team

Shahbaz
answered on 13 Dec 2015, 04:25 AM

Hello,

I'm facing the same issue. Was there any resolution for this?

I'm using the latest DLL, 2015 Q1.

Thomas
Telerik team
answered on 17 Dec 2015, 08:38 AM
In general, OpenAccess does not hold references longer than needed. For this to work you will need to follow the Dispose pattern: OpenAccess Context instances, as well as IObjectScope and Database, implement it. This ensures that persistent instances become collectable by the GC in time and memory is quickly available again.
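As a generic illustration of the pattern Thomas describes (plain .NET only, no Telerik types; TrackedResource is a hypothetical stand-in for an IObjectScope), a using block guarantees Dispose runs even when the body throws, so the resource never leaks past the method:

```csharp
using System;

// A toy disposable that counts live instances; it stands in for an
// IObjectScope or OpenAccess Context (the names here are illustrative).
sealed class TrackedResource : IDisposable
{
    public static int LiveCount;
    public TrackedResource() { LiveCount++; }
    public void Dispose() { LiveCount--; }
}

static class DisposeDemo
{
    public static void WorkWithUsing(bool fail)
    {
        try
        {
            using (var scope = new TrackedResource())
            {
                if (fail) throw new InvalidOperationException("work failed");
                // ... normal work with the scope ...
            } // Dispose is guaranteed here, on success or failure
        }
        catch (InvalidOperationException)
        {
            // Swallowed for the demo; the scope was still disposed.
        }
    }
}
```

In the original post, `scope.Dispose()` is only reached when `Commit()` succeeds; wrapping the scope in a using block would make the disposal unconditional.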
However, in the original post the code that fills a large stream seems suboptimal: the target MemoryStream does not know how big it will be in the end, so it has to resize often, which is an expensive operation. I think using stream.Length would be better:


public static byte[] ReadFullyStream(this Stream stream)
{
    if (stream == null) throw new ArgumentNullException("stream");

    // Note: stream.Length requires a seekable stream.
    var len = (int)stream.Length;
    var buffer = new byte[len];
    var offset = 0;
    while (offset < len)
    {
        // Read may return fewer bytes than requested, so loop until full.
        int read = stream.Read(buffer, offset, len - offset);
        if (read <= 0) break;
        offset += read;
    }
    return buffer;
}


Regards,
Thomas
Telerik
 
Check out the latest announcement about Telerik Data Access vNext as a powerful framework able to solve core development problems.