- Quitting my job
- Wrestling with Kibot tick data
- Initial modeling success
- Paper trading the intraday reaction to filings
- Slowing it down
Quitting my job
I’ve been working half-time for the past few months, trying to wrap up a major project and pass the baton to someone else. Now I feel like I’ve reached those goals, so I talked to my boss about fully quitting. I worried about quitting, because I thought I might lose whatever year-end bonus I could have gotten. However, my boss was cool. We agreed that I will be placed on unpaid leave, but technically retained as an employee through the end of the year. This allows me to keep my health insurance. In exchange, I’ll continue to answer questions from other employees, but I will no longer be coding. And then early next year I will receive a bonus equal to last year’s, but prorated to reflect my partial year worked. This arrangement is a fair one, but at the same time I feel lucky. Bonuses are purely subjective, and I would have no recourse if this last bonus were withheld.
I’m excited to have more time to work on my research. Lately, life has been harried. My wife just began her first semester of her PhD program, and we have five young kids and no nanny or daycare. So the days are tightly scheduled, with us and our oldest two children each taking shifts to help with the youngest three, particularly the one-year-old. Next year, we will see about getting health insurance through her university.
Wrestling with Kibot tick data
I bought tick data from Kibot. The price is relatively low. They seem to have a lot of expired tickers, which helps reduce survivorship bias. They provide a record of each trade, along with the best bid and ask prices immediately before the trade. This sampling of historical bid-ask spreads is useful for estimating trading costs.
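The spread-based cost estimate can be sketched in a few lines. This is my own illustration, not Kibot's schema: the field names `price`, `bid`, and `ask` are placeholders for whatever the actual records contain.

```python
# Sketch: estimating trading cost from trade records that carry the
# prevailing best bid/ask just before each trade. Field names are
# illustrative, not Kibot's actual schema.

def effective_half_spread(price: float, bid: float, ask: float) -> float:
    """Distance from the trade price to the quote midpoint, as a fraction
    of the midpoint. Doubling it approximates a round-trip cost."""
    mid = (bid + ask) / 2
    return abs(price - mid) / mid

trades = [
    {"price": 10.01, "bid": 9.99, "ask": 10.01},  # buyer lifts the ask
    {"price": 9.99,  "bid": 9.99, "ask": 10.01},  # seller hits the bid
]
avg_cost = sum(effective_half_spread(**t) for t in trades) / len(trades)
```

Averaging this over a symbol's trade history gives a rough per-trade cost estimate to plug into a backtest.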
Kibot provides software for downloading in bulk and updating the data, including adjusting timeseries for new splits and dividends. The problem is that it only runs on Windows. I spent hours deep in the rabbit hole of trying to get it to run on Linux. I tried Wine, Wine32, Winetricks, PlayOnLinux, Mono, dotnet48, and .NET Core. Nothing worked. Finally, I gave up and bought an ultra-small-form-factor Windows machine from the classified ads for $80, which seemed simpler than running a VM and even came with a Windows 10 Pro license.
For a while, I was quite happy: the KibotAgent software makes it easy to schedule tasks to run automatically and update the data, and it even enables NTFS filesystem compression. However, it has been a real struggle to actually get the data. The first problem was that my hard drive was a very slow 1TB laptop drive. KibotAgent will let you run up to five download threads, but it throws all kinds of errors if your hard drive can’t respond quickly enough. After figuring that out, I reduced that setting all the way down to one thread. It was working slowly but surely, until Windows forced a reboot when I wasn’t looking. When I restarted the download, it threw new errors when it encountered some small stock share classes it had already downloaded but which had no recent ticks. And after enough of these errors, it just aborts the whole process. I seem to have it working again by ignoring any files already on disk, but I’m not sure I’ll be able to get the updates installed this way.
[Update] I gave up on the slow drive. Instead, I am now using samba to share a MooseFS folder from one box on my cluster to the Windows box. Now the KibotAgent saves data directly to the cluster. It works without errors, and the performance is way better. Now I can run the full five download threads.
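The samba piece amounts to one share definition on the Linux box that mounts MooseFS. A minimal sketch, with the share name, path, and user chosen by me as placeholders rather than reflecting my actual layout:

```ini
; /etc/samba/smb.conf fragment on the box that mounts MooseFS.
; Share name, path, and user are illustrative placeholders.
[kibot]
   path = /mnt/mfs/kibot
   read only = no
   browseable = yes
   valid users = kibot
```

After restarting smbd, the Windows box maps the share to a drive letter (e.g. `net use K: \\server\kibot`) and KibotAgent writes to that drive, landing the data directly on the cluster.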
Initial modeling success
My initial research is focused on low-capacity trades. I hope to make a small amount of money to extend my runway by looking in areas the big funds deem not worth the effort. Anything with high capacity is going to be much more competitive. The analogy in my head is that a lion can starve on the same stretch of African savanna where a mouse easily finds food. The natural trade, then, is to try to capture some of the immediate reaction to intraday filings.
Unfortunately, EDGAR timestamps are not nearly as useful as they could be. The “accepted” time is many seconds or even several minutes before the filing actually becomes available to the public on the SEC’s website. Furthermore, I have no idea whether clients paying for the public dissemination service are still getting filings before the rest of us. Therefore, I’ve decided to start by just modeling the reaction from before the filing until some time after. This research is going well; however, it does not indicate whether any money can actually be made by someone like me, who is likely not the fastest to trade. My model only estimates where the price will eventually settle.
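The target variable can be sketched as follows — this is my illustration of a before/after reaction measure, not the actual model; the function name and 300-second horizon are arbitrary choices of mine:

```python
import math
from bisect import bisect_left, bisect_right

def filing_reaction(times, prices, accepted, horizon=300.0):
    """Log return from the last trade at or before `accepted` (epoch
    seconds) to the first trade at or after `accepted + horizon`.
    `times` must be sorted ascending; returns None if either side
    of the window has no trade."""
    i = bisect_right(times, accepted) - 1        # last trade <= accepted
    j = bisect_left(times, accepted + horizon)   # first trade >= horizon
    if i < 0 or j >= len(times):
        return None
    return math.log(prices[j] / prices[i])

# Toy ticks: the price jumps from 10.00 to 10.50 after a filing at t=1000.
times  = [990.0, 999.0, 1001.0, 1310.0]
prices = [10.00, 10.00, 10.40, 10.50]
r = filing_reaction(times, prices, accepted=1000.0)
```

Using the pre-filing price as the baseline sidesteps the fuzziness of the accepted timestamp, at the cost of measuring a move that a slow trader may not be able to capture.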
Paper trading with IB
I decided to take a break from modeling to see if what I’ve already done can be used to make money. I set up a paper trading account with Interactive Brokers and downloaded the Python API. Many people complain about IB’s API, saying it isn’t modern. Perhaps they prefer simpler synchronous REST APIs, which I see IB is now offering as well. But I actually like the IB API. The example code was great, and I was able to get up and running quickly.
My system has multiple threads in the main process, plus a second process with a pool of many threads for checking the SEC’s website for new filings. I make extensive use of logging, so I can go back afterward and better understand what happened. I also built a nice overnight process that updates all the data and features. It generates a web page I can view from my phone, so I can see at a glance whether there were any overnight issues.
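The poller process can be sketched like this. The class and its details are my own simplification: the `fetch` callable stands in for the actual EDGAR request-and-parse code, which is injected so the sketch runs offline.

```python
import logging
import threading
from concurrent.futures import ThreadPoolExecutor

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("filings")

class FilingWatcher:
    """Sketch of a filing poller: a thread pool repeatedly calls `fetch`
    (any callable returning a list of accession numbers) and reports only
    filings not seen before. `fetch` is a stand-in for scraping EDGAR's
    latest-filings page."""

    def __init__(self, fetch, workers=4):
        self.fetch = fetch
        self.pool = ThreadPoolExecutor(max_workers=workers)
        self.seen = set()
        self.lock = threading.Lock()  # guard `seen` across pool threads

    def poll_once(self):
        batch = self.fetch()
        with self.lock:
            new = [a for a in batch if a not in self.seen]
            self.seen.update(new)
        for acc in new:
            log.info("new filing: %s", acc)
        return new

    def poll_async(self):
        return self.pool.submit(self.poll_once)

# Offline demo with a stubbed fetcher returning two successive batches.
batches = iter([["0001-24-000001"], ["0001-24-000001", "0001-24-000002"]])
w = FilingWatcher(lambda: next(batches))
first = w.poll_once()               # the first filing is new
second = w.poll_async().result()    # the repeat is deduped
```

Logging each new accession number with a timestamp is what makes the post-mortem analysis possible: the log shows exactly when each filing was first observed.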
Anyway, after running the trading system for a few days, I’ve decided to put it on hold. The market is very efficient. My orders were arriving after the market had already moved. It’s winner-take-all, and my fills were not good. There are lots of ways I could reduce my latency, but they would be a lot of work:
- Buy streaming data for all symbols so I could place limit orders with knowledge of the current price, instead of looking it up on the spot
- Buy access to the public dissemination data feed, for around $20k a year
- Colocate my trade server
- Use C++ instead of Python
- Use a faster broker than IB
- Optimize how I scrape the SEC’s website*
*This is a slippery slope. The rules allow you to hit the website at a maximum rate of 10 times a second. But the SEC has no way of knowing if one person is using multiple IP addresses to increase their rate. I don’t want to break the rules myself, but I’ll bet the current fastest trader of filing reactions is breaking the rules. So, I’m going to step back and look for a different way to compete.
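Staying inside the 10-requests-per-second ceiling is easy to enforce mechanically. A minimal sketch of a token-bucket limiter — my own illustration, with the clock injected so it can be tested without actually sleeping:

```python
import time

class TokenBucket:
    """Token-bucket limiter to stay under EDGAR's 10-requests-per-second
    ceiling. The clock is injectable so the logic can be tested with a
    fake clock instead of real waits."""

    def __init__(self, rate=10.0, capacity=10, clock=None):
        self.rate = rate              # tokens refilled per second
        self.capacity = capacity      # maximum burst size
        self.tokens = float(capacity)
        self.clock = clock or time.monotonic
        self.last = self.clock()

    def try_acquire(self):
        """Take one token if available; return False (don't request) otherwise."""
        now = self.clock()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

# Demo with a fake clock held in a mutable cell.
t = [0.0]
bucket = TokenBucket(rate=10.0, capacity=10, clock=lambda: t[0])
allowed_burst = sum(bucket.try_acquire() for _ in range(15))  # burst capped at 10
t[0] += 0.5                                                   # half a second passes
allowed_after = sum(bucket.try_acquire() for _ in range(10))  # 5 tokens refilled
```

Gating every outbound request through `try_acquire` keeps a single machine honest, whatever the fastest competitor may be doing with multiple IPs.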
Slowing it down
Next, I’m going to investigate trading in small-cap stocks first thing in the morning after a filing. So, no more intraday trading for the time being. It should be a level playing field with regard to speed, so all that matters is model accuracy and execution strategy. Additionally, this should lead to a higher-capacity trading strategy. Not only will I have more time to build up a position, but there are also just many more filings submitted after hours.