Hi,
I was working with the DataSource and Grid, doing some mass updates on the data (see my earlier post about what I'm trying to achieve). With smaller datasets everything went fine.
Then I tried it with real data, a couple of thousand rows. That's quite a lot, but I'd like to avoid server-side paging at this point, since I want to use Kendo's filtering and sorting on the data.
It turns out that setting data values has roughly quadratic complexity (doubling the row count roughly quadruples the time), and for my use case it takes far too long to be usable. My timing test results are as follows:
100 rows of data, ~1200 ms
200 rows of data, ~4600 ms
400 rows of data, ~16000 ms
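For context, here's the sanity check behind my "quadratic" claim: for O(n^2), each doubling of the row count should roughly quadruple the time, and that's close to what I measured.

```javascript
// Sanity check on the numbers above: for O(n^2), doubling the row count
// should roughly quadruple the time.
var timings = [[100, 1200], [200, 4600], [400, 16000]]; // [rows, ms]
var ratios = [];
for (var i = 1; i < timings.length; i++) {
  ratios.push(timings[i][1] / timings[i - 1][1]);
}
console.log(ratios.map(function (r) { return "x" + r.toFixed(1); }).join(", "));
// prints "x3.8, x3.5" – close to the x4 that quadratic growth predicts
```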
My question is, am I doing something wrong? What is the correct and performance-aware way of updating Datasource contents? Here's the code that causes the problem:
$("#testi").click(function () {
    var data = DS.data();
    for (var i = 0; i < data.length; i++) {
        var row = DS.at(i);
        if (row.fi == row.en) {
            row.set("ok", true);
        } else {
            row.set("ok", false);
        }
    }
});
It seems that row.set is the slow part. I tried reading the values in the if condition both as "row.fi" and as "data[i].fi", and it turns out both are very fast.
I also tried replacing var row = DS.at(i); with var row = data[i]; but they seem to do virtually the same thing, with no performance difference.
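One thing I haven't tried yet, in case it helps frame the question: skipping set() entirely, assigning the plain fields, and firing a single refresh at the end. Below is a self-contained sketch of that pattern using a stub object (no Kendo involved; the stub just models my assumption that set() fires a change notification per call). Is something along these lines a supported approach with a real DataSource, e.g. via grid.refresh()?

```javascript
// Stub modeling my assumption about row.set(): it writes the field and
// fires one change notification per call. This is NOT real Kendo code.
function DataSourceStub(rows) {
  this.rows = rows;
  this.notifications = 0;
}
DataSourceStub.prototype.setField = function (row, field, value) {
  row[field] = value;
  this.notifications++; // one notification per set() call
};
DataSourceStub.prototype.refresh = function () {
  this.notifications++; // one notification for the whole batch
};

var ds = new DataSourceStub([
  { fi: 1, en: 1 }, { fi: 1, en: 2 }, { fi: 3, en: 3 }
]);

// Pattern A: what my click handler does now – set() per row.
for (var i = 0; i < ds.rows.length; i++) {
  ds.setField(ds.rows[i], "ok", ds.rows[i].fi === ds.rows[i].en);
}
var perRowNotifications = ds.notifications; // 3: one per row

// Pattern B: plain assignment per row, then one refresh at the end.
ds.notifications = 0;
for (var j = 0; j < ds.rows.length; j++) {
  ds.rows[j].ok = ds.rows[j].fi === ds.rows[j].en;
}
ds.refresh();
var batchedNotifications = ds.notifications; // 1 for the whole batch
```

If plain assignment on the observable rows is unsafe (since it presumably bypasses change tracking), what's the blessed way to batch updates like this?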
For what it's worth, I'm on Google Chrome 27.