Posted 12 Oct 2012
Whenever I ask my developers to implement an Analytics metric, they return to me saying it will take them a couple of days to test, because the measurements are not visible right away in Analytics.
Do you have a description of the path data takes from our application to becoming visible in the Analytics UI, and of how long that takes?
We have been playing a bit with the "Live" option but that does not seem to help us much.
Are there any papers on best practices for testing your analytics integration?
Posted 12 Oct 2012
We have heard this a few times and have some general recommendations, but first a bit of background on how data moves from the API call in your application to our servers and onto the web, so the recommendations make more sense.
When you use an API call such as trackFeature(...), we push the data onto our internal data model and return control to the calling code. This internal model lives in memory and is persisted to disk once a minute (this is configurable), whenever we send data, and whenever stop() is called. The model is pushed to our servers automatically once enough data has been collected, on certain events (calls to start(), stop() and trackException), or when you call the forceSync API method. Data that has been persisted to disk but not yet delivered is delivered the next time the application starts.
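The lifecycle above can be sketched as a toy simulation. All of the class names and field names below are illustrative stand-ins, not the real monitor implementation; only the trackFeature/persist/forceSync flow mirrors the description.

```java
import java.util.ArrayList;
import java.util.List;

// Toy sketch of the client-side buffering described above.
public class BufferSketch {
    static class MonitorModel {
        final List<String> inMemory  = new ArrayList<>(); // live in-memory model
        final List<String> persisted = new ArrayList<>(); // on disk, not yet sent
        final List<String> delivered = new ArrayList<>(); // arrived at the servers

        // trackFeature(...): push onto the in-memory model, return immediately.
        void trackFeature(String name) { inMemory.add(name); }

        // Runs on the (configurable) storage-save interval, on sync, and on stop().
        void persistToDisk() { persisted.addAll(inMemory); inMemory.clear(); }

        // Triggered automatically when enough data has accumulated, on
        // start()/stop()/trackException, or explicitly via forceSync().
        void forceSync() { persistToDisk(); delivered.addAll(persisted); persisted.clear(); }
    }

    public static void main(String[] args) {
        MonitorModel m = new MonitorModel();
        m.trackFeature("export-pdf");
        m.trackFeature("print");
        // Killing the process at this point would lose both events:
        // nothing has been persisted or delivered yet.
        m.forceSync();
        System.out.println("delivered=" + m.delivered.size()); // prints delivered=2
    }
}
```

The key point for testing is that data sits purely in memory between save intervals, which is why an abrupt stop from the IDE can silently discard it.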
Once data arrives at our servers, we process it in bulk and insert it into our data store. This typically takes up to 5 minutes, and once the data is in the data store it should be available in "Live" mode. In non-Live mode we serve aggregated data that is calculated on a daily basis.
With this out of the way, on to the recommendations and what we typically see cause confusion when developing new features.
* We typically see that during development, the collected data is neither persisted nor sent to the servers because the application is stopped from the development environment (basically, from the IDE) without stop() ever being called, so the collected data vanishes. It is hard to give good recommendations here, since simply killing the application is a natural flow for developers. One thing you can do during development is set IAnalyticsMonitorSettings.setStorageSaveInterval quite low, to increase the chance that data has been persisted to disk at any given point.
* You could also (in development mode) run a loop that calls forceSync() periodically to push data to the servers, but an internal limitation ensures that we do not send data on too-frequent forceSync calls, so this is less reliable than setting a low storage save interval.
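A periodic forceSync loop could be set up with a scheduled executor, as sketched below. The `forceSync()` method here is a hypothetical stand-in for the real monitor call, so the scheduling pattern can run on its own; the period should stay well above the monitor's internal rate limit, since too-frequent calls are ignored.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Development-only sketch: push collected data to the servers on a timer.
public class DevSync {
    static AtomicInteger syncCalls = new AtomicInteger();

    // Stand-in for the real monitor's forceSync() call.
    static void forceSync() { syncCalls.incrementAndGet(); }

    public static void main(String[] args) throws InterruptedException {
        // In the real API you would also lower the save interval for dev builds:
        //   settings.setStorageSaveInterval(...);
        ScheduledExecutorService exec = Executors.newSingleThreadScheduledExecutor();
        exec.scheduleAtFixedRate(DevSync::forceSync, 0, 100, TimeUnit.MILLISECONDS);
        Thread.sleep(350); // let a few sync cycles run
        exec.shutdown();
        System.out.println("syncs=" + syncCalls.get());
    }
}
```

Guard this kind of loop behind a development flag so it never ships in a release build.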
* Just for general troubleshooting, we also recommend that during development you configure a logger via IAnalyticsMonitorSettings.setLoggingInterface(...) to collect the monitor's internal messages. We log a message whenever we deliver data to our servers, so if data is not showing up you can use these messages to determine whether the servers are being contacted at all.
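A logger for this purpose might look like the sketch below. The `ILog` interface and its `logMessage` method are assumptions about the callback shape, so check the actual parameter type of setLoggingInterface(...); the idea is simply to capture every internal message and scan for delivery-related ones.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: capture the monitor's internal log messages in a list.
public class LogCapture {
    // Hypothetical stand-in for the logging callback type
    // expected by setLoggingInterface(...).
    interface ILog { void logMessage(String message); }

    static class ListLog implements ILog {
        final List<String> messages = new ArrayList<>();
        @Override public void logMessage(String message) { messages.add(message); }
    }

    public static void main(String[] args) {
        ListLog log = new ListLog();
        // Real usage (hypothetical signature):
        //   settings.setLoggingInterface(log);
        // Here we simulate the monitor emitting a delivery message:
        log.logMessage("Data delivered to server");
        // Scan the captured messages to see whether the servers were contacted.
        boolean delivered = log.messages.stream().anyMatch(m -> m.contains("delivered"));
        System.out.println("delivered=" + delivered); // prints delivered=true
    }
}
```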
* We also recommend having either Fiddler or Charles running if you are uncertain whether data has been delivered to our servers. These tools let you sniff HTTP network traffic, and you can generally filter on the specific hosts being contacted.
* As part of development it is typically a good idea for each developer to look up the CookieID of their installation (using getStatus().getCookieID()) and use that ID when looking at data on the web. You can either apply a general cookieID filter (at the bottom of the page) or go to the Session report and filter on the cookie to access the raw list of sessions received from that developer machine. Depending on what you are developing and what data you are looking for, you can choose one or the other. If you set a general filter, you should be able to see data within 5 minutes (in Live mode), provided the data has been delivered from the developer machine.
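The cookie-based filtering in the last recommendation amounts to the sketch below. The `Session` type and the session list are illustrative; in the real API the ID would come from getStatus().getCookieID(), and the filtering happens in the web UI rather than in your code.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: isolate one developer machine's sessions by cookie ID.
public class CookieFilter {
    static class Session {
        final String cookieId;
        final String feature;
        Session(String cookieId, String feature) {
            this.cookieId = cookieId;
            this.feature = feature;
        }
    }

    // Keep only the sessions reported under the given cookie ID.
    static List<Session> sessionsFor(List<Session> all, String cookieId) {
        List<Session> mine = new ArrayList<>();
        for (Session s : all) {
            if (s.cookieId.equals(cookieId)) mine.add(s);
        }
        return mine;
    }

    public static void main(String[] args) {
        String myCookie = "dev-machine-42"; // real API: getStatus().getCookieID()
        List<Session> all = List.of(
            new Session("dev-machine-42", "export-pdf"),
            new Session("some-customer", "print"));
        System.out.println(sessionsFor(all, myCookie).size()); // prints 1
    }
}
```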
Hope this makes sense and that the recommendations are actionable for you. We are aware that it is very frustrating not to get immediate gratification when integrating new tracking points, so I am very interested in learning more about your issues and any feedback you may have on how to improve this situation.
Looking forward to a continued dialogue.