Dec 22, 2023 · 3 min read

Letter from our CTO

Modena, December 2023.

From the tech development side, 2023 has been a watershed year for our team. We close this year with a "mission accomplished" feeling about the tech improvements we set out to make in 2023, while looking forward to all the new and shiny stuff our 5-year tech roadmap holds for us.

As you dive into this article, let me say upfront that we're not here to boast about our accomplishments: tech successes are often ephemeral, and their value quickly evaporates as the context changes. This letter is about sharing what we've learned during this intense year and celebrating our team's successes. We often use a phrase internally that captures what it's all about:

 “It’s not a knowledge game; it’s a learning game.” 

So, for the sake of continuous improvement, let me take you through some of our key innovations and the impacts our teams have brought this year:


DataSmith release

We introduced a new component, codenamed "DataSmith", into our internal software toolkit: a game-changer for handling tabular and time-series data. DataSmith's ability to quickly build datasets has been a boon for our Axyon IRIS models, which rely on varied multi-source data streams. This enhancement means we're now equipped to train new AI models and bolster our live inference pipelines with unprecedented speed, scalability, and robustness.

While we gained numerous lessons during the development of DataSmith, a couple are particularly worth sharing. First, if you are serious about implementing a robust data pipeline for machine learning, the specific choice of feature store is not that important, as long as you pick a modern, scalable solution. Choosing an open-source solution can be a viable option as well. In fact, given the rapid evolution of software libraries in this space, keeping your software stack not too dependent on a specific solution (and actually making it a switchable module) might be a good idea. Second, a pipeline that extracts, loads, and transforms vast amounts of data with blazing speed but produces poor-quality data is entirely useless, and will actually cause a lot of headaches. Data quality is everything, and every hour spent on "Data Reliability Engineering" is a tremendous investment.
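The "switchable module" idea can be sketched as a thin abstraction over the feature store, with a simple data-quality gate in front of training. This is a minimal illustration under assumed names (`DictFeatureStore`, `momentum_21d`, etc.), not DataSmith's actual API:

```python
import math
from abc import ABC, abstractmethod

class FeatureStore(ABC):
    """Minimal interface: swapping backends (Feast, Hopsworks, a custom
    store, ...) means swapping one class, not rewriting the pipeline."""

    @abstractmethod
    def get_features(self, entity_id: str, names: list[str]) -> dict[str, float]:
        ...

class DictFeatureStore(FeatureStore):
    """Toy in-memory backend, used here purely for illustration."""

    def __init__(self, data: dict[str, dict[str, float]]):
        self._data = data

    def get_features(self, entity_id, names):
        row = self._data[entity_id]
        return {n: row[n] for n in names}

def validate(features: dict[str, float]) -> dict[str, float]:
    """A crude data-quality gate: reject missing/NaN values before
    they silently poison training or inference."""
    bad = [k for k, v in features.items() if v is None or math.isnan(v)]
    if bad:
        raise ValueError(f"bad feature values: {bad}")
    return features

# The pipeline only depends on the FeatureStore interface:
store: FeatureStore = DictFeatureStore(
    {"AAPL": {"momentum_21d": 0.03, "vol_63d": 0.18}}
)
feats = validate(store.get_features("AAPL", ["momentum_21d", "vol_63d"]))
```

A real data-reliability layer would of course check far more than NaNs (schema, ranges, freshness, cross-source consistency), but the shape is the same: validation sits between the store and the model.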


Reframing AI Model Development & Selection

To refine our AI-driven solutions further, we completely reimagined our approach to AI model development and selection.

The finance sector's challenging signal-to-noise ratio makes finding the 'perfect model' an elusive goal. Our strategy now focuses on automatically constructing a high-performing, robust ensemble of models. This pivot ensures that our solutions yield stable, predictable performance in a wildly unstable financial market.

The main learning here is that a large, heterogeneous set of simple, unremarkable ML models, combined in an optimal and domain-informed way, will beat a single "great" model any day, while being far more robust across different market conditions.
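As a toy illustration of the principle (not our actual ensembling machinery), averaging the outputs of several deliberately unremarkable models already smooths out their individual errors; the optional weights stand in for the domain-informed combination step:

```python
import statistics

def ensemble_predict(models, x, weights=None):
    """Combine predictions from many simple models.

    With no weights this is a plain average; in practice the weights
    would come from a domain-informed optimisation step rather than
    being hand-picked."""
    preds = [m(x) for m in models]
    if weights is None:
        return statistics.fmean(preds)
    return sum(w * p for w, p in zip(weights, preds)) / sum(weights)

# Three deliberately unremarkable "models":
models = [lambda x: 0.8 * x, lambda x: x + 0.5, lambda x: 1.1 * x - 0.2]
avg = ensemble_predict(models, 2.0)  # mean of 1.6, 2.5 and 2.0
```

Each individual model is biased in its own direction; the ensemble's error is smaller than the worst member's, which is the robustness property the paragraph above is about.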


Release of Cassandra

We also rolled out "Cassandra", a critical addition to our internal software stack focused on data drift detection and model monitoring. Given the complexity of machine learning pipelines, Cassandra serves as our vigilant guardian. It proactively detects drifts in data and shifts in model predictions or performance. This early-warning system empowers our Data Science team to maintain the highest standards of accuracy and reliability in our AI solutions: it essentially allows us to ascertain when (not) to trust our own models!

Among the main insights gained through this project is the realisation that when building observability into your pipeline, you don't have to get everything right from the start: monitoring the most critical stages will most likely ignite ideas on what to monitor next, and your observability stack will organically grow as a result.
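As a minimal sketch of the kind of check an early-warning system like this might run (the metric and threshold here are illustrative assumptions, not Cassandra's actual logic), a standardised mean-shift score flags a live feature distribution that has wandered away from its training-time reference:

```python
import statistics

def drift_score(reference: list[float], live: list[float]) -> float:
    """Standardised mean shift: |mean(live) - mean(ref)| / std(ref).

    A score above an alert threshold (e.g. 3.0) would trigger an early
    warning before the model is trusted on the drifted data."""
    mu = statistics.fmean(reference)
    sigma = statistics.pstdev(reference)
    if sigma == 0:
        raise ValueError("reference window has zero variance")
    return abs(statistics.fmean(live) - mu) / sigma

reference = [0.10, 0.20, 0.15, 0.12, 0.18]            # training-time values
stable = drift_score(reference, [0.14, 0.17, 0.13])   # well below threshold
drifted = drift_score(reference, [0.90, 1.10, 0.95])  # far above threshold
```

Production drift detection would compare full distributions (e.g. with a two-sample statistical test) rather than just means, but this captures the "start with the most critical check, grow from there" spirit of the insight above.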


What’s next?

These technological leaps are more than just milestones—they confirm our commitment to advancing AI in finance. As we move into 2024, we are excited to leverage these developments to enhance your investment strategies and outcomes further.

At Axyon AI, we're not just trying to build technology; we're disrupting quant research by scientifically leveraging AI and automation to build high-performing systematic investment strategies at scale. 

Thank you for joining us on this exciting journey. We look forward to continuing to explore new frontiers and achieving greater heights together.

Best regards,
Jacopo Credi
Chief Technology Officer