I'm having problems with Telerik ORM when trying to import a large amount of data (16,252 records, 45 columns) from a text file into a SQL Server table.
Here is a snippet of the code I am using:
using (EntitiesModel dataContext = new EntitiesModel())
{
    foreach (JITDump_Model item in jitRecords)
    {
        JIT_Dump dataRecord = new JIT_Dump();
        dataRecord.AddChangeDeleteFlag = item.AddChangeDeleteFlag;
        dataRecord.CarrierArrivalTime = item.CarrierArrival;
        dataRecord.CarrierCode = item.CarrierCode;
        ...
        dataContext.Add(dataRecord);
    }
    dataContext.SaveChanges();
}
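One workaround I've been considering is flushing in batches instead of a single huge SaveChanges at the end. A sketch of what I mean is below; the batch size of 500 is an arbitrary value I picked, and the rest reuses the same EntitiesModel/JIT_Dump types from the snippet above:

```csharp
// Sketch: call SaveChanges every `batchSize` records rather than once for all
// 16,000+ rows, and recreate the context between batches so tracked objects
// are released. batchSize is an arbitrary assumption, not a Telerik setting.
const int batchSize = 500;
int pending = 0;

EntitiesModel dataContext = new EntitiesModel();
try
{
    foreach (JITDump_Model item in jitRecords)
    {
        JIT_Dump dataRecord = new JIT_Dump();
        dataRecord.AddChangeDeleteFlag = item.AddChangeDeleteFlag;
        // ... copy the remaining columns as in the snippet above ...
        dataContext.Add(dataRecord);

        if (++pending % batchSize == 0)
        {
            dataContext.SaveChanges();   // flush this batch to SQL Server
            dataContext.Dispose();       // drop tracked objects
            dataContext = new EntitiesModel();
        }
    }
    dataContext.SaveChanges();           // flush the remainder
}
finally
{
    dataContext.Dispose();
}
```

I don't know whether this avoids the closed-reader exception, but it at least keeps each unit of work small.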
After adding the 16,000+ records to the data context, I call dataContext.SaveChanges(). The call blocks for about 4 minutes before throwing the following exception:
System.InvalidOperationException: Invalid attempt to call Read when reader is closed.
at Telerik.OpenAccess.RT.Adonet2Generic.Impl.ResultSetImp.next()
at OpenAccessRuntime.Relational.sql.MsSqlDriver.getAutoIncColumnValue(RelationalTable classTable, Connection con, Statement stat)
at Telerik.OpenAccess.Runtime.KeyGenerator.AutoIncRelationalKeyGenerator.generatePrimaryKeyPost(String className, RelationalTable classTable, Object[] data, Connection con, Statement stat)
at OpenAccessRuntime.Relational.RelationalStorageManager.generateInserts(NewObjectOID oid, Int32 index, ClassMetaData cmd, PersistGraph graph, Int32[] fieldNos, CharBuf s, Object[] oidData, IntArray toUpdateIndexes)
at OpenAccessRuntime.Relational.RelationalStorageManager.persistPass1(PersistGraph graph)
at OpenAccessRuntime.Relational.RelationalStorageManager.doUpdates(StatesToStore toStore, StateContainer container, Boolean retainValues)
at OpenAccessRuntime.Relational.RelationalStorageManager.store(StatesToStore toStore, DeletePacket toDelete, Boolean returnFieldsUpdatedBySM, Int32 storeOption, Boolean evictClasses)
My question is: what is causing this exception, and is there a better way to use the ORM to import large amounts of data into a SQL Server database?
In the past, when doing essentially the same kind of large data import, I've used SqlBulkCopy. But SqlBulkCopy is a bit of a pain, and I was hoping the ORM would simplify my work.
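For comparison, the SqlBulkCopy approach I've used before looks roughly like this. The connection string, the column list, and the target table name "JIT_Dump" are placeholders for this example; the SqlBulkCopy API itself is the standard one from System.Data.SqlClient:

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch: stage the parsed records in a DataTable, then bulk-load it.
// Column names/types must match the destination table; only two of the
// 45 columns are shown here.
var table = new DataTable();
table.Columns.Add("AddChangeDeleteFlag", typeof(string));
table.Columns.Add("CarrierCode", typeof(string));
// ... one column per imported field ...

foreach (JITDump_Model item in jitRecords)
{
    table.Rows.Add(item.AddChangeDeleteFlag, item.CarrierCode /*, ... */);
}

using (var bulk = new SqlBulkCopy(connectionString))
{
    bulk.DestinationTableName = "JIT_Dump";
    bulk.BatchSize = 1000;          // rows sent to the server per batch
    bulk.WriteToServer(table);
}
```

The pain point for me is maintaining the column mapping by hand for 45 columns, which is exactly what I was hoping the ORM would take care of.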
Any help you can provide would be greatly appreciated.