This is a migrated thread and some comments may be shown as answers.

slow insertion/deletion of objects

General Discussions
This question is locked. New answers and comments are not allowed.
Kyaw Zayar Han asked on 07 Oct 2009, 02:32 AM
Hi. Our application requires persisting a large number of objects.

We are using Oracle 11g over a gigabit Ethernet connection. Both the application and database servers have decent configurations.

With L2 caching disabled, my test application takes 28 seconds to insert 10,000 objects of the following implementation, and 24 seconds to delete them. When caching is enabled, deletion performance improves (to less than 2 seconds), but insertion still takes the same 28 seconds.

On the other hand, if I run the same test against a local SQL Express 2005 instance, both insertion and deletion take approximately 3 seconds. However, it is our requirement to use Oracle.

Is such performance expected? What can we do about it? Thanks in advance.

    [Telerik.OpenAccess.Persistent]
    public class TestClass
    {
        public int x, y;
    }

7 Answers, 1 is accepted

Kyaw Zayar Han answered on 07 Oct 2009, 02:39 AM
Clarification: when the cache is not enabled, deletion takes 24 seconds.
IT-Als answered on 07 Oct 2009, 07:20 AM
Hi Kyaw,

It sounds like it is Oracle related, but to rule out other causes, can you post the code that does the inserts and deletes, from the point where you retrieve the object scope until you commit?

Thanks

/HG
Calvin Zhang answered on 07 Oct 2009, 07:43 AM
Thanks for the reply. Here's the code used:

static void Main(string[] args)
{
    const int recordCount = 10000;
    try
    {
        IObjectScope scope = ObjectScopeProvider1.GetNewObjectScope();
        var query = scope.Extent<TestClass>();
        ObjectContainer container = new ObjectContainer();
        IObjectCollector collector = new FetchGroupCollector(FetchGroupCollector.DeepFetchGroup);

        // copy the existing TestClass instances from the scope into the container
        ShowMsg("reading existing records ...");
        container.CopyFrom(scope, query.GetType().Name, query, collector);
        List<TestClass> lst = container.Extent<TestClass>().ToList<TestClass>();
        ShowMsg("\tOK " + lst.Count + " records found");

        // add the new objects to the container
        ShowMsg("adding " + recordCount + " records...");
        container.Transaction.Begin();

        for (int i = 0; i < recordCount; i++)
            container.Add(new TestClass());

        container.Transaction.Commit();
        ShowMsg("\tOK");

        // push the container changes back into the scope (this is where the database is hit)
        ShowMsg("committing changes...");
        ObjectContainer.CommitChanges(container.GetChanges(ObjectContainer.Verify.Changed), ObjectContainer.Verify.Changed, scope, true, true);
        ShowMsg("\tOK");

        container = new ObjectContainer();
        ShowMsg("reading existing records ...");
        container.CopyFrom(scope, query.GetType().Name, query, collector);
        lst = container.Extent<TestClass>().ToList<TestClass>();

        ShowMsg("\tOK " + lst.Count + " records found");

        // remove everything again and commit the deletions
        ShowMsg("deleting existing records ...");
        container.Transaction.Begin();
        container.Remove(lst);
        container.Transaction.Commit();

        ObjectContainer.CommitChanges(container.GetChanges(ObjectContainer.Verify.Changed), ObjectContainer.Verify.Changed, scope, true, true);
        ShowMsg("\tOK");

        ShowMsg("completed\n");
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.ToString());
    }
}

private static void ShowMsg(string str)
{
    Console.WriteLine(DateTime.Now.ToString("mm:ss.fff") + "\t" + str);
}
IT-Als answered on 07 Oct 2009, 08:38 AM
Hi Calvin,

Thanks for the code post.

First of all: Do you have any specific reasons for using the ObjectContainer? And what key generator do you have configured for your class?

You could also try to use q.ForwardsOnly and scope.FetchPlan.Limit to reduce the round trips to the database server, which I think are the problem.
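Off the top of my head it would look something like the snippet below. I am writing it from memory in the forum editor, so the exact property names and signatures may be slightly off - treat it as an untested sketch of the idea rather than working code:

// Untested sketch - member names written from memory, the exact API may differ.
// The idea: read the extent forward-only and tune the fetch plan to cut down
// on round trips, instead of materializing the whole result set at once.
var q = scope.Extent<TestClass>();
q.ForwardsOnly = true;           // read the results forward-only
scope.FetchPlan.Limit = 1000;    // illustrative value - tune it for your setup

foreach (TestClass tc in q)
{
    // work with each object without holding the whole result set in memory
}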

Here's your code rewritten to use the scope directly:

static void Main(string[] args)
{
    const int recordCount = 10000;
    try
    {
        IObjectScope scope = ObjectScopeProvider1.GetNewObjectScope();
        var q = scope.Extent<TestClass>();

        ShowMsg("reading existing");
        // q.ToList() executes the query
        List<TestClass> lst = q.ToList();
        ShowMsg("\tOK " + lst.Count + " records found");

        lst.Clear();
        ShowMsg("Adding " + recordCount + " records to list");
        for (int i = 0; i < recordCount; i++)
            lst.Add(new TestClass());

        ShowMsg("Persisting list");
        scope.Transaction.Begin();
        scope.Add(lst);
        scope.Transaction.Commit();
        ShowMsg("Persisted");

        ShowMsg("reading existing");
        // q.ToList() executes the query
        lst.Clear();
        lst = q.ToList();
        ShowMsg("\tOK " + lst.Count + " records found");

        ShowMsg("deleting existing records ...");
        scope.Transaction.Begin();
        scope.Remove(lst);
        scope.Transaction.Commit();

        ShowMsg("completed\n");
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.ToString());
    }
}

private static void ShowMsg(string str)
{
    Console.WriteLine(DateTime.Now.ToString("mm:ss.fff") + "\t" + str);
}


Regards

HG


Calvin Zhang answered on 07 Oct 2009, 09:58 AM
Thank you very much, HG. You even wrote the code for me!

There's a minor mistake in the code, though. We can't add the list of persistable objects directly to the scope and persist it; doing so causes a runtime error. I changed it as follows and it works fine:
                ShowMsg("Persisting list");
                scope.Transaction.Begin();
                for (int i = 0; i < lst.Count; i++)
                    scope.Add(lst[i]);
                scope.Transaction.Commit();
                ShowMsg("Persisted");

Coming to the performance part: removing the container object does improve performance significantly. It now takes 4.5 seconds to insert and 0.7 seconds to delete. However, our application has a client/server architecture. We are actually maintaining the container object on the client side and synchronizing it by passing a ChangeSet through WCF between client and server. Do you have a better way to do this if performance is a priority?

By the way, the local SQL Server instance can do both insertion and deletion within a second if the container is not used. There is still a gap compared to Oracle, but I think that's related to the network.

Thanks again.

IT-Als answered on 07 Oct 2009, 10:40 AM
Ok, that's a performance boost :-)

Sorry for the typo - but you managed to fix it - I was writing the code in the forum editor :-)

I think you are right about the local / remote database. The difference is the network traffic to the remote box.

Regarding the ObjectContainer part:
I haven't been using the ObjectContainer that much, but I have been using WCF in combination with OpenAccess as the persistence model on the server side. In our scenario we just serialize WCF data contract objects over the wire to the client. This approach has some advantages and disadvantages.

Advantages:
- No need for ObjectContainer implementation on the client side
- The interface for the service is more intuitive (that is, it has methods like CreateCustomer that take a DataContract.Customer as a parameter)
- Your services are consumable by other client types (non .NET)... naturally depending on the on-the-wire format you use.
- Nice performance... we have not encountered any issues yet.

Disadvantages:
- You need to do some kind of change tracking yourself (it is not that hard to write)
- You possibly need to do a mapping between your persistent model and your data contract model (see the sketch below)
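To make the mapping part a bit more concrete, here is a rough sketch of what such a service can look like. The Customer / CustomerDto / ICustomerService names and members are made up for illustration, and the persistence part just reuses the same ObjectScopeProvider1 / object scope pattern as the code above:

using System.Runtime.Serialization;
using System.ServiceModel;
using Telerik.OpenAccess;

// The data contract that travels over the wire - the client only ever sees this.
[DataContract]
public class CustomerDto
{
    [DataMember]
    public string Name { get; set; }
}

[ServiceContract]
public interface ICustomerService
{
    [OperationContract]
    void CreateCustomer(CustomerDto customer);
}

// A persistent class on the server side (stands in for your real model).
[Telerik.OpenAccess.Persistent]
public class Customer
{
    public string name;
}

// Server-side implementation: map the incoming data contract onto the
// persistent model and let OpenAccess persist it through the object scope.
public class CustomerService : ICustomerService
{
    public void CreateCustomer(CustomerDto dto)
    {
        IObjectScope scope = ObjectScopeProvider1.GetNewObjectScope();
        scope.Transaction.Begin();

        Customer persistent = new Customer();
        persistent.name = dto.Name;   // the manual mapping step
        scope.Add(persistent);

        scope.Transaction.Commit();
    }
}

Updates go the other way around: load the matching persistent object in a transaction, copy the changed members from the data contract onto it, and commit.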

PS:
I am about to prepare a series of blog posts / articles on creating a reference architecture for using WCF as the communication between server and client and OA as the persistence model on the server side. By reference architecture I mean: it shows how it can be done... not necessarily the only or best way to do it - that's up to the reader... :-)




Calvin Zhang answered on 08 Oct 2009, 03:01 AM
Thanks again for all the suggestions and recommendations... We shall probably reconsider our architecture in terms of integrating WCF and OA.

I am looking forward to reading about the reference architecture. Would you mind emailing me your blog address? My email address is calvin.zhang@consistel.com
