Every DataSeries description that you assign should be unique
Author: thodder
Creation Date: 11/2/2010 5:06 PM

thodder

#1
I ran into a problem while using Sum.Series to build my own indicator. The example code is as follows:

CODE:
Please log in to see this code.


When I sum the values like this...

QUOTE:
dsValue1 = Sum.Series(dsValue1, period);
dsValue2 = Sum.Series(dsValue2, period);

DataSeries dsBadSeries = dsValue1 / dsValue2;


...all dsBadSeries values come out as 1.0, because the numerator (dsValue1) and the denominator (dsValue2) end up referring to the same series.

When I sum the values like this...

QUOTE:
DataSeries dsGoodSeries = new DataSeries(Bars, "");

for(int bar = period; bar < Bars.Count; bar++)
{
    dsGoodSeries[bar] = Sum.Value(bar, dsValue1, period) / Sum.Value(bar, dsValue2, period);
}


...the series dsGoodSeries contains the values I would expect.

I've noticed in the Community.Indicators library that a Series method typically checks Bars.Cache, using the indicator's description, to see whether the series was already calculated. If the description matches, the cached DataSeries is returned instead of a newly calculated one. I assume that is what happened in this case. For example, this is what MACDEx does in the Community.Indicators library...

CODE:
Please log in to see this code.


If you refer to my example code at the top, dsValue1 and dsValue2 were given a blank description because I only wanted them as temporary values. Since Sum.Series uses that blank name to build its cache description, it produced the same description for both series. As a result, Sum.Series built the series for dsValue1, but then returned the cached dsValue1 result when the similar calculation was performed for dsValue2.
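The collision can be modeled in a few lines. This is a language-neutral sketch, not WealthLab's actual code: the `Bars`, `DataSeries`, and `sum_series` names here are hypothetical stand-ins for the WealthScript classes, reduced to a dictionary cache keyed by the description string.

```python
# Minimal model of a description-keyed series cache. The real Bars.Cache
# and Sum.Series are WealthScript APIs; the names and behavior here are
# a simplified reconstruction, not the actual implementation.

class Bars:
    """Stands in for the WealthLab Bars object: holds a shared cache."""
    def __init__(self):
        self.cache = {}

class DataSeries:
    def __init__(self, bars, description):
        self.bars = bars
        self.description = description
        self.values = []

def sum_series(source, period):
    """Rolling sum that caches its result under a key built from the
    source's description, mimicking how a .Series method uses Bars.Cache."""
    key = f"Sum({source.description},{period})"
    cache = source.bars.cache
    if key in cache:                      # cache hit: calculation is skipped
        return cache[key]
    result = DataSeries(source.bars, key)
    result.values = [sum(source.values[max(0, i - period + 1):i + 1])
                     for i in range(len(source.values))]
    cache[key] = result
    return result

bars = Bars()
ds1 = DataSeries(bars, "")                # both temporaries share the
ds2 = DataSeries(bars, "")                # same blank description...
ds1.values = [1.0, 2.0, 3.0]
ds2.values = [10.0, 20.0, 30.0]

sum1 = sum_series(ds1, 2)
sum2 = sum_series(ds2, 2)                 # ...so this returns ds1's cached sums
print(sum2 is sum1)                       # True: ds2's sums were never computed
```

Dividing `sum1` by `sum2` in this situation divides a series by itself, which is exactly why every ratio came out as 1.0.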

I just want to point out the danger of caching all indicator calculations in the Series method. Even inventing names that are unique to my indicator such as...

QUOTE:
DataSeries dsValue1 = new DataSeries(Bars, "value1");
DataSeries dsValue2 = new DataSeries(Bars, "value2");


...doesn't guarantee that other indicators won't assign the same temporary descriptions to their own DataSeries objects.

It took a while of head-scratching to figure out what caused this problem. I wanted to document it here, as I'm sure others will stumble onto it at some point.

I believe caching all DataSeries values this way is dangerous, because every developer has to realize that any description they assign to a DataSeries must be unique.


Eugene

#2
That's right. The .Value method calculates a single value on the fly (by the way, a suboptimal choice in this context) and, unlike the .Series method, doesn't use Bars.Cache, so it doesn't rely on the caching mechanism at all.

Since what you describe is caused by a misconception rather than a bug, I changed the catchy but misleading topic name "Sum.Series bug?!" to something more appropriate. Thanks for stressing the fact that, to be looked up in Bars.Cache properly, every data series must have a unique description.

thodder

#3
Thanks for changing the topic name; yours does sound more appropriate. I don't want to be alarmist.

I wanted to point out not only that DataSeries need unique descriptions when the Series method is used, but also that caching every DataSeries object has a downside. As I mentioned, I could create a DataSeries with the "unique" name "value1", but if another indicator used the same description within its constructor to calculate its values (unique within that indicator), it could end up pulling my calculation from the cache instead of recalculating the series.

Instead of using Sum.Series (which internally appears to cache its results), I finally decided a better approach was the following:

CODE:
Please log in to see this code.


The above code would replace:
CODE:
Please log in to see this code.


This is a safer way to use standard indicators inside a DataSeries constructor when temporary series objects are involved: it avoids accidentally populating the cache with values that are only meant to be temporary.
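The idea behind that replacement can be sketched as follows. This is a conceptual model only (the actual WealthScript code is behind the login): a plain per-bar loop computes both rolling sums inline and divides them, so nothing is ever looked up in, or added to, a shared description-keyed cache.

```python
# Conceptual model of the cache-free approach: compute the rolling sums
# per bar with an explicit loop instead of a cached .Series call.

def rolling_ratio(values1, values2, period):
    """For each bar >= period, divide the period-sum of values1 by the
    period-sum of values2. Earlier bars are left as None, mirroring the
    forum example's loop that starts at bar = period."""
    n = len(values1)
    out = [None] * n
    for bar in range(period, n):
        s1 = sum(values1[bar - period + 1:bar + 1])   # last `period` bars of series 1
        s2 = sum(values2[bar - period + 1:bar + 1])   # last `period` bars of series 2
        out[bar] = s1 / s2
    return out

v1 = [1.0, 2.0, 3.0, 4.0]
v2 = [2.0, 4.0, 6.0, 8.0]
print(rolling_ratio(v1, v2, 2))   # [None, None, 0.5, 0.5]
```

The per-bar sums are recomputed from scratch here, which is O(period) per bar; the trade-off for skipping the cache is a bit of redundant arithmetic.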


thodder

#4
Am I correct that calculations between DataSeries objects are NOT cached? For example, the following code seems to correctly produce two distinct results even though dsValue1 and dsValue2 have the same descriptions. Based on this test, I assume calculations do not cache their results.

CODE:
Please log in to see this code.

Cone

#5
Calculations are cached. Add this to your script -
CODE:
Please log in to see this code.

But you created two DataSeries with the same (empty) description "", so the internal description of the result series is the same for both calculations: "/". The second attempt to add that key to the cache probably raises an internal error that is "eaten"; that is why the second result does not actually get cached.
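Cone's explanation can be modeled like this. Again, a hypothetical sketch rather than WealthLab's actual code: the assumption is that the division operator joins the operand descriptions with "/" to form the cache key, and that a duplicate-key error is silently swallowed.

```python
# Model of operator-level caching: the result's description is built by
# joining the operand descriptions with the operator symbol, so dividing
# two blank-named series twice produces the key "/" both times. The
# duplicate add fails silently, so only the first result is cached,
# while the second result is still computed and returned.

cache = {}

def divide(desc1, vals1, desc2, vals2):
    key = desc1 + "/" + desc2             # "" / "" -> "/"
    result = [a / b for a, b in zip(vals1, vals2)]
    try:
        if key in cache:
            raise KeyError(f"duplicate cache key {key!r}")
        cache[key] = result
    except KeyError:
        pass                              # error is "eaten": second result not cached
    return result

r1 = divide("", [2.0, 4.0], "", [1.0, 2.0])
r2 = divide("", [9.0, 9.0], "", [3.0, 3.0])
print(list(cache))        # ['/']  - one key serves both calculations
print(cache["/"] == r1)   # True: only the first result was cached
print(r2)                 # [3.0, 3.0] - still computed, just never cached
```

This is consistent with thodder's observation in #4: the two divisions do yield distinct, correct results, because the calculation itself always runs; only the cache entry for the second one is lost.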

thodder

#6
Thanks Cone, that's good to know. I see that your example code will let me review all the descriptions in the cache in the debug window.

Is there a way to disable cache processing while building an indicator? I'm working through John Ehlers' books on digital signal processing (DSP), and I believe the cache occasionally corrupts my calculations by pulling a bad value out of it. The calculations above are a lot simpler than those in the book. I can give the variables dsValue1 and dsValue2 the descriptions "value1" and "value2"; however, if I do something similar in another indicator, I could run into calculation problems when both indicators are used in the same strategy. That could be a nightmare to debug.

At the moment I'm just trying to write the indicator's code so that it doesn't pull anything from, or push anything to, the cache.

Eugene

#7
The cache can't be disabled, but it can be cleared with Bars.Cache.Clear.

thodder

#8
Okay. I don't want to clear the cache from within an indicator constructor, so I'll just use Cone's example code to verify that nothing gets added to the cache during my indicator calculations.

Thanks.