Continuing with baseline methods for comparison… Decision trees have been studied for decades, and innumerable enhancements have been proposed. Unfortunately, that abundance makes it harder to pick a single representative variant as a baseline.

# Category Archives: regression

# k-Nearest neighbors

k-NN is simple. I’m interested in trying it because it resembles the way some traders think: faced with an interesting situation, they recall other stocks that were in a similar situation in the past and bet on a similar outcome. The trick is knowing which aspects of a situation are actually useful for prediction.
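As a sketch of the idea (plain NumPy, not code from this project): find the k training points closest to a query point and average their targets. The feature matrix, targets, and distance metric here are illustrative assumptions.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Predict the target for x as the mean target of its k nearest training points."""
    dists = np.linalg.norm(X_train - x, axis=1)  # Euclidean distance to each training row
    nearest = np.argsort(dists)[:k]              # indices of the k closest rows
    return y_train[nearest].mean()

# Toy data: target is the sum of the two features.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([0.0, 1.0, 1.0, 2.0])
print(knn_predict(X, y, np.array([0.9, 0.9]), k=3))  # averages targets 2, 1, 1
```

Which features go into the distance computation is the whole game, per the point above.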

# Ordinary least squares linear regression

Multiple regression is the ubiquitous workhorse of most scientific fields. The fitted coefficients are interpretable, and the simple closed-form solution makes it very efficient to fit. For massive datasets, the fit can even be parallelized with map-reduce.
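The closed-form solution mentioned above comes from the normal equations, beta = (XᵀX)⁻¹Xᵀy. A minimal NumPy sketch (the toy data is my own, not from the post):

```python
import numpy as np

def ols_fit(X, y):
    """Closed-form OLS via the normal equations: solve (X^T X) beta = X^T y."""
    # X^T X and X^T y are sums over rows, which is why the fit map-reduces:
    # each partition computes its partial sums, and the driver adds them up.
    return np.linalg.solve(X.T @ X, X.T @ y)

# Exact linear data: y = 2 + 3*x, with an explicit intercept column.
x = np.array([0.0, 1.0, 2.0, 3.0])
X = np.column_stack([np.ones_like(x), x])
y = 2.0 + 3.0 * x
beta = ols_fit(X, y)
print(beta)  # recovers the intercept 2 and slope 3
```

Using `solve` on the normal equations rather than explicitly inverting XᵀX is the standard numerically safer choice.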

# Handcrafted model baseline

My current model for dataset-1 is handcrafted, but calibrated on the data. I won’t say too much about this method, except that it is very opinionated. In the Bayesian spirit, I bring domain knowledge into the mix and let the data update my priors. Every part of the model can be visualized and explained.
