If I understand things correctly, you have so far used RemObjects SDK – a product from a company that has successfully developed frameworks for distributed application development for a long time.
The alternative you are considering is Open Access – a product that is basically an ORM with no proven track record; for all practical purposes it is still a product under development.
As can be seen from the examples it is certainly possible to develop “toy” demos using Open Access – but will it work in the real world? Given a “real world” database design, is Open Access usable at all? By “real world” I mean a database with more than, let’s say, 50 tables and more than 10 million records.
Take a look at:
Using RemObjects DataAbstract, I can get a barely usable server up and running in about half an hour, and a decent one in a couple of days. Combine DataAbstract with the Direct Oracle Access components from allroundautomations.com and you have the ingredients of a stable and efficient distributed multi-tier Delphi solution that interoperates with Silverlight, WCF, Windows CE, Java and a number of other scenarios.
Exactly why you would want to throw away a working solution, if I understand you correctly, beats me…
Now, I haven’t really used RemObjects for about a year – I’ve changed jobs – but at that time the documentation was on the poor side, as such things go.
With Open Access, Telerik has done a wonderful job on the documentation front. If you go by what they write, you can fairly easily be misled into thinking that Open Access is a great product providing loads of functionality found nowhere else.
So while using RemObjects DataAbstract will surprise you when it comes to what the product actually can do, Open Access will surprise you when it comes to what the product can’t do.
You can also look forward to sometimes more than 200000 % performance overhead – and no, that’s not a typo – compared to executing queries directly with System.Data.SqlClient and reading the results yourself; I expect the same holds for Oracle ODP.Net. This was measured against a relatively small database with less than a million records, evaluating a LINQ query against four tables. Obviously, your mileage will vary.

Picking the query apart and executing it against one table at a time reduces the overhead to about 10000 %, most of which probably comes from the XML-based mapping of columns to entity properties. And this is with MS SQL Server and the test application running on the same computer – move the database to another machine and everything gets much worse. Before you know it, you have successfully written an application ready to consume all available network bandwidth and then some. That might make Cisco happy, but I’m not so sure about your customers.
The basic reason for this is that when you strip away the ANTLR-generated code, together with the ports of various open source Java projects, there isn’t all that much left of Open Access – a bit of state information and caching. Open Access evaluates LINQ expressions on its own, using the database as a dumb data store. It doesn’t matter how efficiently it actually evaluates the queries; all the data has to be loaded from the database first, and they are a long, long way from evaluating queries at a performance level comparable to engines like MS SQL Server or Oracle RDBMS.
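For contrast, the direct route that the overhead was measured against looks roughly like this – a minimal sketch, with hypothetical table and column names, whose point is that the four-way join is evaluated entirely by the database server rather than by the client:

```csharp
using System;
using System.Data.SqlClient;

static class DirectQuery
{
    // Hypothetical four-table join; SQL Server does all the work,
    // and only the result rows cross the wire.
    public const string Sql = @"
        SELECT o.OrderId, c.Name, p.Description, d.Quantity
        FROM Orders o
        JOIN Customers c ON c.CustomerId = o.CustomerId
        JOIN OrderDetails d ON d.OrderId = o.OrderId
        JOIN Products p ON p.ProductId = d.ProductId
        WHERE o.OrderDate >= @since";

    public static void Run(string connectionString, DateTime since)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(Sql, conn))
        {
            cmd.Parameters.AddWithValue("@since", since);
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine("{0}: {1}", reader.GetInt32(0), reader.GetString(1));
            }
        }
    }
}
```

An ORM that instead pulls each of the four tables to the client and evaluates the join there has already lost before any mapping overhead is counted.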
Fetch plans are just a way to reduce the number of round trips required to load all that data – it still gets loaded. And if you want to avoid loading it, the mechanisms Open Access provides for doing so make programming against DataSet and DataTable seem relatively painless and straightforward by comparison.
RemObjects DataAbstract and Developer Express XPO are products that support Silverlight on their own, and I’m sure there are others out there too. With Open Access, you have to rely on something else, like WCF RIA Services or plain WCF, to get things across the network. This is partially true for XPO too, but XPO provides a Silverlight integration that only requires you to implement a predefined interface – and voilà, you have intelligent updates based on delta packets.
WCF services generated along the Open Access guidelines lose, in translation, most of what could actually be interesting about the product – the state information. IEditableObject and INotifyPropertyChanged are left as an exercise for the programmer, so in most cases, when you actually want to edit a piece of data, you end up writing significantly more code on top of Open Access than you would using System.Data.SqlClient directly and mapping operations to a well-defined WCF service that you plug into the view model on the Silverlight side.
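That "exercise for the programmer" is not trivial boilerplate, either. A minimal sketch of the kind of editable, change-notifying entity you end up hand-writing per type (the class and property names here are made up for illustration):

```csharp
using System;
using System.ComponentModel;

// A minimal editable, change-notifying DTO – the plumbing Open Access
// leaves you to write yourself for Silverlight data binding.
public class CustomerDto : INotifyPropertyChanged, IEditableObject
{
    private string _name;
    private string _nameBackup;
    private bool _editing;

    public event PropertyChangedEventHandler PropertyChanged;

    public string Name
    {
        get { return _name; }
        set
        {
            if (_name == value) return;
            _name = value;
            var handler = PropertyChanged;
            if (handler != null)
                handler(this, new PropertyChangedEventArgs("Name"));
        }
    }

    public void BeginEdit()
    {
        if (_editing) return;
        _nameBackup = _name;   // snapshot so CancelEdit can roll back
        _editing = true;
    }

    public void CancelEdit()
    {
        if (!_editing) return;
        Name = _nameBackup;    // restore the snapshot, raising PropertyChanged
        _editing = false;
    }

    public void EndEdit()
    {
        _editing = false;      // commit: simply keep the current values
    }
}
```

Multiply this by every editable property of every entity and the "significantly more code" above stops being an abstract complaint.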
Open Access is an alternative to NHibernate, MS Entity Framework, MS LINQ to SQL, and what you find in the System.Data.* namespaces. If you have the full .Net framework available, DataSet and DataTable will usually take you further than Open Access, with less work and better integration with the rest of the .Net framework.
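For instance, change tracking and delta extraction – the things the WCF guidance above loses – come for free with a plain DataTable. A small self-contained sketch (table and column names invented for the example):

```csharp
using System;
using System.Data;

static class DataTableDemo
{
    // Simulate a table loaded from the database, edit one row,
    // and extract the delta – no mapping layer, no hand-written plumbing.
    public static DataTable BuildDelta()
    {
        var customers = new DataTable("Customers");
        customers.Columns.Add("Id", typeof(int));
        customers.Columns.Add("Name", typeof(string));

        customers.Rows.Add(1, "Acme");
        customers.AcceptChanges();                // mark the row as "loaded", not "added"

        customers.Rows[0]["Name"] = "Acme Ltd";   // the edit is tracked automatically
        return customers.GetChanges(DataRowState.Modified);
    }

    static void Main()
    {
        DataTable delta = BuildDelta();
        Console.WriteLine(delta.Rows.Count);                                // 1
        Console.WriteLine(delta.Rows[0]["Name", DataRowVersion.Original]);  // Acme
        Console.WriteLine(delta.Rows[0]["Name", DataRowVersion.Current]);   // Acme Ltd
    }
}
```

The delta table is exactly what you would hand to a SqlDataAdapter – or ship across a service boundary – to apply just the changed rows.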
If you really want strongly typed objects and performance is an issue, consider hand-coding the data access layer and rolling your own entities; if performance is not an issue, take a look at MS Entity Framework.
Take this together with the ramifications of SOX (http://en.wikipedia.org/wiki/Sarbanes%E2%80%93Oxley_Act) and the related ongoing adjustments to laws, regulations and standardization efforts that increasingly govern architecture, design and implementation details across the IT industry at large.
If you want to be really sure that the product you are developing fits into this picture of mandated control, auditing, security and so on, you’ll need every bit of performance you can squeeze out of the available hardware and software resources.
The average database is not going to get smaller, and while there are obvious ways to provide Open Access with only a limited amount of data – requiring you to do so kind of defeats the purpose of an ORM.
I really hope that, going forward, Telerik will rectify most of the shortcomings of Open Access – either that or scrap the product.