I'm having some trouble with one of my indicators: I can't figure out why it always returns NaN. The indicator with the problem sits at the end of a chain of several other custom indicators.
At the root, I have an "averaging" indicator based on the IndicatorBase class.
Next in the chain is a "line" indicator, based on the SeriesCalculatorBaseWithValues class, which uses the "averaging" indicator as its input.
After the "line" indicator comes a "trend" detection indicator, again based on SeriesCalculatorBaseWithValues. The "trend" indicator uses the "line" indicator and the Close price series as inputs.
The problem I'm having is with the final indicator in the chain, a "range" price indicator, which uses the "trend" indicator and the High price series as inputs.
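For reference, here's a stripped-down toy of how the chain is wired. This is NOT the attached code: the Series class and all of the placeholder math below are made up, and the real indicators derive from the platform base classes instead. It only shows the data flow from the root averager down to the range indicator.

using System;
using System.Collections.Generic;
using System.Linq;

class Series
{
    public List<double> Values { get; } = new List<double>();
    public double Last => Values.Count > 0 ? Values[Values.Count - 1] : double.NaN;
}

class ChainSketch
{
    static void Main()
    {
        var close = new Series();
        var high = new Series();
        var averaging = new Series(); // root (IndicatorBase-derived in the real code)
        var line = new Series();      // input: averaging
        var trend = new Series();     // inputs: line + Close
        var range = new Series();     // inputs: trend + High

        double[] closes = { 10, 11, 12, 11, 13 };
        double[] highs = { 11, 12, 13, 12, 14 };

        for (int i = 0; i < closes.Length; i++)
        {
            close.Values.Add(closes[i]);
            high.Values.Add(highs[i]);

            // root: 3-bar average of Close (NaN until enough bars exist)
            averaging.Values.Add(i >= 2
                ? close.Values.Skip(i - 2).Take(3).Average()
                : double.NaN);

            // line: pass-through of the averaging value (placeholder logic)
            line.Values.Add(averaging.Last);

            // trend: +1/-1 depending on Close vs. the line; NaN propagates
            trend.Values.Add(double.IsNaN(line.Last)
                ? double.NaN
                : (close.Last > line.Last ? 1.0 : -1.0));

            // range: High scaled by trend direction; NaN propagates again
            range.Values.Add(double.IsNaN(trend.Last)
                ? double.NaN
                : high.Last * trend.Last);
        }
    }
}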
In the code I've attached, both the "RangeHighDebug" and "RangeHighChained" indicators use the same basic CalcNewValue() logic. The weird thing is that "RangeHighDebug" should never return NaN, because the logic that would normally handle that case has been replaced by the 'return 2.0' line. Yet when I apply these indicators to a chart, the debug version shows several NaN values and no values of 2.0. Also, the breakpoint in the chained version is never hit.
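The shape of that logic, paraphrased (again, not the attached code; the base class and the CalcNewValue() signature below are invented stand-ins, kept only to show where the 'return 2.0' line sits):

using System;

// Toy stand-in for SeriesCalculatorBaseWithValues; the real base class
// and CalcNewValue() signature differ.
abstract class ToyCalculator
{
    public abstract double CalcNewValue(double trendValue, double highValue);
}

class RangeHighDebugSketch : ToyCalculator
{
    public override double CalcNewValue(double trendValue, double highValue)
    {
        if (double.IsNaN(trendValue))
        {
            // In RangeHighChained this branch holds the real NaN handling
            // (and the breakpoint that never gets hit). In RangeHighDebug
            // the branch is just the constant below, so NaN output should
            // be impossible.
            return 2.0;
        }
        return Math.Max(highValue, trendValue); // placeholder for the real calc
    }
}

class Demo
{
    static void Main()
    {
        var dbg = new RangeHighDebugSketch();
        Console.WriteLine(dbg.CalcNewValue(double.NaN, 14.0)); // prints 2
        Console.WriteLine(dbg.CalcNewValue(1.0, 14.0));        // prints 14
    }
}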
Something weird is happening here. I suspect I'm just doing something dumb, but I can't seem to figure it out. Any thoughts on what's wrong?
PS: I've been able to work around the problem by pushing the trend calculations down into the range indicator.
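Roughly, the workaround looks like this (illustrative only; the names and math are not the real code):

using System;

static class RangeWorkaroundSketch
{
    // The trend value is computed inline from the raw inputs instead of
    // being consumed from a separate chained "trend" indicator.
    static double CalcRangeValue(double close, double high, double lineValue)
    {
        double trend = close > lineValue ? 1.0 : -1.0; // inlined trend calc
        return high * trend;                           // placeholder range calc
    }

    static void Main()
    {
        Console.WriteLine(CalcRangeValue(12.0, 14.0, 11.5)); // prints 14
    }
}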