Optimization suggestion for future versions
Author: gbullr
Creation Date: 4/18/2012 11:36 AM

gbullr

#1
When working with large DataSets, optimizations become time consuming even when using Monte Carlo and genetic algorithms. I would like to suggest a feature that allows you to run the backtest on a certain percentage of the DataSet, chosen at random. For example, you could run an optimization on 25% of the Russell 3000 without having to create a separate DataSet of 25% of the Russell 3000 components chosen at random.
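In rough terms, the per-symbol selection I have in mind is no more than something like this (a plain C# sketch, not Wealth-Lab code; the hashing trick is just one way to keep the same random 25% for a whole run):

CODE:
using System;

static class RandomSubsetSketch
{
    // Decide, per symbol, whether it belongs to the randomly chosen subset.
    // A fixed seed keeps the same subset for every run of one optimization.
    static bool InRandomSubset(string symbol, double fraction, int seed)
    {
        int hash = (symbol + seed).GetHashCode() & 0x7FFFFFFF;
        return (hash % 1000) / 1000.0 < fraction;   // e.g. fraction = 0.25
    }
}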

Thanks for your consideration.

Eugene

#2
The longer an optimization runs, the more memory it consumes and the more objects are created. Although the GC (garbage collector) works in the background, in .NET 2.0 (on which WL 6.3 is based) it might not be as effective as in .NET 4.0 (which future builds will target). This can affect optimizations.

Also, it may be a sign of some inefficiency in your Strategy code. Finally, slowness may even indicate some sleeping bug: for example, we had a case of a progressively slowing optimization, and it turned out that the culprit was the built-in StochD indicator!

For now, try using fewer bars. Use fewer parameters: the more you have, the worse (curve fitting). Ideal systems are parameterless (adaptive) and rarely require re-optimization. Another idea is to use a coarser parameter step at first and then refine it once the optimum range is found. Be creative.
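To illustrate the coarse-then-fine idea (a sketch only; evaluatePeriod stands in for whatever fitness metric your optimization reports, it is not a Wealth-Lab method):

CODE:
using System;

static class CoarseToFineSketch
{
    static int CoarseToFine(Func<int, double> evaluatePeriod)
    {
        int best = 10;
        double bestScore = double.MinValue;

        // Pass 1: scan the full range with a coarse step of 10.
        for (int p = 10; p <= 100; p += 10)
        {
            double score = evaluatePeriod(p);
            if (score > bestScore) { bestScore = score; best = p; }
        }

        // Pass 2: refine with a step of 1 in a narrow band around the coarse optimum.
        for (int p = Math.Max(2, best - 9); p <= best + 9; p++)
        {
            double score = evaluatePeriod(p);
            if (score > bestScore) { bestScore = score; best = p; }
        }
        return best;
    }
}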


gbullr

#3
Thank you.

I have done all of these things, which is why I suggested this additional step. Finally, a time-out could also be good... After x iterations without a new max profit... quit... Just suggestions...

Thanks.


Eugene

#4
QUOTE:
After x iterations without a new max profit... quit...

Essentially, this is what the Genetic Algorithm is about: optimizing until a satisfactory fitness level is found (where fitness can be any optimization metric).
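A sketch of that stopping rule in plain C# (runNextCandidate is a hypothetical stand-in for a single optimization run, not a Wealth-Lab method):

CODE:
using System;

static class PatienceSketch
{
    // Stop after "patience" consecutive runs without a new maximum profit.
    static double OptimizeWithPatience(Func<double> runNextCandidate, int patience)
    {
        double bestProfit = double.MinValue;
        int runsSinceImprovement = 0;

        while (runsSinceImprovement < patience)
        {
            double profit = runNextCandidate();
            if (profit > bestProfit)
            {
                bestProfit = profit;
                runsSinceImprovement = 0;   // reset the counter on a new maximum
            }
            else
            {
                runsSinceImprovement++;
            }
        }
        return bestProfit;
    }
}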
QUOTE:
I would like to suggest a feature that allows you to run the backtest on a certain percentage of the dataset chosen at random.

Honestly, I tried to understand the reasoning but failed. Why would one want the results of an incomplete optimization? Just because it would be faster than it is now? At any rate, creating a separate DataSet (or breaking up the optimization and running its subsets concurrently in N Wealth-Labs under N Windows accounts, installing a faster CPU, installing WL on a powerful server in a cloud instance...) is easier to accomplish than modifying Wealth-Lab.

That's not a solution but a workaround. The real solution would be to introduce multi-core CPU support in Strategies and/or Optimizations. Please call your Fidelity rep and tell him that a customer needs faster-performing optimizations through multi-core CPU and/or GPGPU support.
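Conceptually, multi-core optimization support would amount to something like this (a sketch using the .NET 4 Task Parallel Library; runBacktest is a hypothetical stand-in, not a Wealth-Lab method):

CODE:
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

static class ParallelOptimizationSketch
{
    // Run independent candidate backtests on all available cores.
    static ConcurrentDictionary<int, double> OptimizeInParallel(
        int[] parameterValues, Func<int, double> runBacktest)
    {
        var results = new ConcurrentDictionary<int, double>();
        Parallel.ForEach(parameterValues, p =>
        {
            results[p] = runBacktest(p);   // each candidate run is independent
        });
        return results;
    }
}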

Cone

#5
I don't think it was mentioned here, but one of the great things about the GA optimizer is the ability to find (or nearly find) the optimum set of parameters using only a small percentage of the runs required in an exhaustive optimization. Consequently, the GA allows you to narrow the set of parameters for a future exhaustive optimization, which for me is still invaluable for the 3D visualizations of the optimization space of multiple parameters.

Anyway, as Eugene said, we think the next release will be built on .NET 4.0, so it should help.

gbullr

#6
I am posting here rather than starting a new thread. I can't seem to find a function that was alluded to in a post that would allow you to create a smaller, random DataSet based on an existing one.


Thanks in advance for your help.


Eugene

#7
I don't think I ever saw that function, but this collection of links from the FAQ may help you build one on your own: Is it possible to make actions on DataSets programmatically? (at the end of the page)
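If it helps, the general shape of such a tool could be as simple as this (a sketch only; the file paths and the plain-text, one-symbol-per-line format are placeholders, since a real Wealth-Lab DataSet file has its own format):

CODE:
using System;
using System.IO;
using System.Linq;

class SubsetDataSetSketch
{
    static void Main()
    {
        var rng = new Random();
        // Read the full symbol list, keep a random 25%, write it out as a new list.
        string[] symbols = File.ReadAllLines(@"C:\Data\Russell3000.txt");
        int take = (int)Math.Round(symbols.Length * 0.25);
        string[] subset = symbols.OrderBy(s => rng.Next()).Take(take).ToArray();
        File.WriteAllLines(@"C:\Data\Russell3000_random25.txt", subset);
    }
}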

The practical implementation (the code side of things) of your idea would deserve a different, more appropriate forum thread though.