Our Technology.

Bringing breakthrough automated machine learning technology and AI to investing.

AI solutions for investing.

Axyon AI is, first and foremost, a technology company. Our team of engineers, scientists and quantitative researchers is on a mission to build a cutting-edge technology stack for AI-based time series modelling.



Axyon AI Platform. The link between Artificial Intelligence and Big Data is extremely powerful for forecasting financial time series.

We use Artificial Intelligence, and in particular Deep Learning, to make predictions on financial time series, exploiting the very large amount of data available on financial markets.
The computing power at our disposal is increasingly crucial to empowering this link between Artificial Intelligence and Big Data.

The technology we use to model financial time series is flexible and can model unknown distributions, overcoming a key limitation of traditional modelling techniques, which must assume normal probability distributions and linear relationships among variables.


Predictive models that are included in Axyon products.

Deep learning is a type of machine learning that uses particularly sophisticated neural networks, inspired by the functioning of the human brain, which express their full potential in the presence of enormous amounts of data. This technology underpins the predictive models included in Axyon AI products. Techniques such as deep learning delegate to algorithms the choice of the best functional form or probability distribution, yielding significantly better-performing results while relying on fewer a priori assumptions.
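To make the idea concrete, here is a minimal, self-contained sketch (not Axyon code) of how a small neural network trained by gradient descent learns a nonlinear relationship that a model with a fixed linear functional form cannot represent:

```python
import math
import random

random.seed(1)

# Toy data: a nonlinear relationship (y = x^2) that no linear model can fit.
xs = [i / 10 - 1 for i in range(21)]       # grid on [-1, 1]
ys = [x * x for x in xs]

H, lr = 8, 0.05                            # hidden units, learning rate
w1 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

def forward(x):
    hidden = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return hidden, sum(w2[j] * hidden[j] for j in range(H)) + b2

def run_epoch():
    """One pass of stochastic gradient descent; returns mean squared error."""
    global b2
    total = 0.0
    for x, y in zip(xs, ys):
        hidden, pred = forward(x)
        err = pred - y                     # d(loss)/d(pred), up to a factor of 2
        total += err * err
        for j in range(H):
            grad_hidden = err * w2[j] * (1 - hidden[j] ** 2)
            w2[j] -= lr * err * hidden[j]
            w1[j] -= lr * grad_hidden * x
            b1[j] -= lr * grad_hidden
        b2 -= lr * err
    return total / len(xs)

first_loss = run_epoch()
for _ in range(500):
    last_loss = run_epoch()
```

Nowhere does the code assume a particular functional form for the data: the network discovers the curvature on its own, which is the property the paragraph above describes.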

Artificial intelligence


Make inferences about the unknown, based on knowledge learned from observed data.



Identifying observations that exhibit anomalous patterns, violating the regularities followed by other observations.



Learning the distribution of the data-generating process from observations, making it possible to sample new realizations.



● One-stop solution for teams of data scientists and engineers to streamline the AI development process
● Highly automated process based on AutoML and Evolutionary Computation methods for hyperparameter meta-optimisation and feature selection
● Advanced functionalities for time series modelling
● Highly scalable and reliable thanks to an advanced workload management system supporting multiple HPC and cloud computing clusters
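As an illustration of the evolutionary side of that process, the sketch below runs a tiny, entirely hypothetical evolutionary search over two hyperparameters against a toy validation loss; the platform's actual AutoML pipeline is far more involved:

```python
import random

random.seed(0)

def validation_loss(lr, depth):
    """Toy stand-in for a model's validation loss (hypothetical)."""
    return (lr - 0.01) ** 2 * 1e4 + (depth - 4) ** 2

def mutate(ind):
    """Perturb a (learning-rate, depth) pair slightly."""
    lr, depth = ind
    return (max(1e-4, lr + random.gauss(0, 0.005)),
            max(1, depth + random.choice([-1, 0, 1])))

def evolve(pop_size=8, generations=20):
    pop = [(random.uniform(1e-4, 0.1), random.randint(1, 10))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: validation_loss(*ind))
        parents = pop[:pop_size // 2]       # elitism: keep the fittest half
        pop = parents + [mutate(random.choice(parents))
                         for _ in range(pop_size - len(parents))]
    return min(pop, key=lambda ind: validation_loss(*ind))

best_lr, best_depth = evolve()
```

Because the fittest individuals survive each generation unchanged, the best loss never worsens, and mutation performs a local search around promising configurations.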

Where technology meets ART.



Only an industrialised, fully automated AI process can reach and maintain a performance and stability advantage in this competitive sector, free from human-induced bias, information leaks and overfitting issues.



Our AI models are specifically designed to handle a wide range of input conditions. Additionally, we make use of unsupervised learning to detect anomalies in financial data for regulatory and risk-management purposes.
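A heavily simplified illustration of the anomaly-detection idea, assuming a rolling z-score detector (the window and threshold are hypothetical, not our production logic):

```python
import statistics

def zscore_anomalies(series, window=20, threshold=3.0):
    """Flag indices whose value deviates more than `threshold` standard
    deviations from the mean of the preceding `window` observations."""
    flagged = []
    for i in range(window, len(series)):
        history = series[i - window:i]
        mu = statistics.fmean(history)
        sigma = statistics.pstdev(history)
        if sigma > 0 and abs(series[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged

# A quiet return series with one injected shock at index 30.
returns = [0.001 if i % 2 else -0.001 for i in range(40)]
returns[30] = 0.05
anomalies = zscore_anomalies(returns)      # flags only index 30
```

The detector is unsupervised in the sense of the paragraph above: it learns what "normal" looks like from the data itself rather than from labelled examples.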



Increasing AI adoption calls for innovations and benchmarks that make its use responsible, ethical, transparent and accountable. We employ state-of-the-art AI explainability techniques to shed light on our models’ inner workings.


Research timeline


Axyon AI Platform - v0.1 Release

First release of the Axyon Platform, a proprietary AI model training environment for data scientists and ML engineers, optimised for time series data.


Support for single-vendor cloud compute clusters

The Axyon Platform is able to leverage cloud compute nodes (from a single leading cloud provider), enabling the simultaneous execution of multiple training jobs in a scalable way.


MSc Thesis: Deep Q-Learning Techniques for Forex Trading

In this work, we applied Reinforcement Learning techniques (and in particular Deep Q-Learning) to the challenging problem of finding profitable trading strategies in the Forex market by trial-and-error in a simulated market. This was Axyon’s first of many MSc thesis projects in collaboration with the AImageLab research group at the University of Modena and Reggio Emilia.


Support for multi-cloud and on-premise compute nodes

The Axyon Platform is able to leverage cloud compute nodes from multiple vendors at the same time, as well as on-premise computational nodes that can be dynamically turned on at any time.


MSc Thesis: Deep Learning for Portfolio Allocation

In this work, we combined AI with an existing quantitative portfolio allocation model. In particular, we used the predictions of a Deep Neural Network as “investor views” in the Black-Litterman allocation model. The resulting MSc thesis won the SIAT Technical Analyst Award 2019.
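For intuition, here is a one-asset, scalar sketch of how a model prediction can enter Black-Litterman as an investor view: the posterior expected return is a precision-weighted blend of the market-implied prior and the view. All numbers are hypothetical:

```python
def bl_posterior(prior_mu, prior_var, view_mu, view_var):
    """Precision-weighted average of prior and view (scalar Black-Litterman)."""
    w_prior = 1 / prior_var    # confidence in the market equilibrium
    w_view = 1 / view_var      # confidence in the model's view
    return (w_prior * prior_mu + w_view * view_mu) / (w_prior + w_view)

# Equally confident prior and view: the posterior lands halfway between them.
posterior = bl_posterior(prior_mu=0.04, prior_var=0.02,
                         view_mu=0.08, view_var=0.02)
```

The more confident the network's view (the smaller `view_var`), the further the posterior tilts away from the market equilibrium toward the prediction.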


MSc Thesis: Deep Generative Neural Networks for Financial Time Series Modelling and Forecasting

In this work, we applied the Generative Adversarial Network (GAN) framework to the challenging task of financial time series generation. We showed how this model can be used to simulate future market scenarios by introducing a conditioning in the generator, using a recurrent neural network. To the best of our knowledge, this was the first application of GANs to financial time series at the time of this work.


MSc Thesis: Extension and Industrialization of Generative Neural Networks for Financial Time Series Modelling and Forecasting

In this work, we built on our previous work in generative modelling, extending our GAN model designed for the conditional generation of financial time series. In particular, the contribution of this research activity was twofold: (i) we modified the generator to obtain a recurrent sequence-to-sequence architecture, and (ii) we added a self-attention mechanism, bringing improved performance and interpretability.


SHAPE Project Axyon AI: a scalable HPC Platform for AI Algorithms in Finance

The goal of this work was to maximize the efficiency of accessing different types of remote computational resources potentially available to our proprietary Machine Learning platform, without losing the flexibility provided by in-house compute power. This is a mandatory requirement for a FinTech company oftentimes working with proprietary data that cannot be uploaded to cloud systems. We achieved this by designing and implementing a scalable and flexible DB-centric Master-Slave system architecture able to exploit any connected internal or external computational resource (including an HPC cluster) in a flawless and secure way. This was the first project that marked a fruitful and ongoing collaboration with CINECA, the largest Italian computing centre.


MSc Thesis: Reinforcement Learning for Asset Allocation

Reinforcement Learning (RL) has drawn a lot of attention thanks to its successful applications in many fields, most notably to playing games. In this work, we designed and implemented an RL framework for the task of tactical asset allocation, given a portfolio of equity and fixed income assets. Our Policy Gradient approach used a particular reward function accounting not only for P&L but also for diversification and stability.
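A minimal sketch of such a reward shape; the functional form and coefficients here are illustrative, not the thesis's actual reward:

```python
def reward(pnl, weights, prev_weights, lam_div=0.1, lam_stab=0.1):
    """P&L, minus penalties for concentration (Herfindahl index of the
    portfolio weights) and turnover (change versus the previous step)."""
    concentration = sum(w * w for w in weights)   # 1/N when equal-weighted
    turnover = sum(abs(w - p) for w, p in zip(weights, prev_weights))
    return pnl - lam_div * concentration - lam_stab * turnover

# With identical P&L, a diversified, stable allocation scores higher
# than a concentrated one that required a full rebalance.
diversified = reward(0.01, [0.5, 0.5], [0.5, 0.5])
concentrated = reward(0.01, [1.0, 0.0], [0.5, 0.5])
```

Shaping the reward this way steers the learned policy toward allocations that are profitable, diversified, and cheap to maintain, rather than P&L alone.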


Master Project: Alternative Data for ML-based Asset Performance Forecasting

Alternative data is structured or unstructured data that is not typically used by traditional investment companies and that can provide insights into the future performance of a financial asset. This study examined the possibility of including alternative data sources into Axyon IRIS ML-based predictive models, by comparing the performance before and after the addition of data series extracted from Google Trends.


ESAX: Enhancing the Scalability of the Axyon Platform

In this work, carried out jointly with HPC consultants from CINECA, we brought the computational scalability of the Axyon Platform to a new level, almost quadrupling the previous peak of concurrently executed jobs. Moreover, we added support for distributed training on multi-GPU/multi-node HPC clusters, and stress-tested our Platform using Marconi100, the 11th largest supercomputer in the world at the time of the project.


MSc Thesis: VaR Estimation with conditional GANs and GCNs

The Value-at-Risk (VaR) is a common risk measure, often required by financial regulators, typically estimated based on simple closed-form distributions. In this work, we aimed to overcome the need for parametric assumptions through the use of deep generative models, namely a conditional generative adversarial network (cGAN). We further extended the model to the multivariate case, enabling the interaction of multiple stocks through graph convolutions in the generator. Work presented at SIMAI 2021.
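Once a generative model can produce return scenarios, VaR estimation reduces to taking an empirical quantile of the simulated distribution. The sketch below substitutes a simple Gaussian sampler for the cGAN, purely for illustration:

```python
import random

random.seed(42)

def empirical_var(samples, alpha=0.05):
    """Value-at-Risk at level alpha: the loss that is exceeded with
    probability alpha, read off the empirical return distribution."""
    ordered = sorted(samples)
    return -ordered[int(alpha * len(ordered))]

# Stand-in scenario generator: 10,000 Gaussian daily returns (sd = 2%).
scenarios = [random.gauss(0.0, 0.02) for _ in range(10_000)]
var_95 = empirical_var(scenarios)
```

The appeal of the generative approach is that `scenarios` need not come from any closed-form distribution: whatever the model has learned to sample, the same quantile computation applies.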


MSc Thesis: Continual Learning Techniques for Financial Time Series

The problem of Continual Learning has drawn much interest in recent years, as training AI models to learn new tasks or move to new domains poses the risk of forgetting earlier knowledge. In this study, we applied several CL methods to train time series forecasting models in the financial domain, using Bayesian changepoint detection methods to segment series into different regimes and thus framing the problem as one of Domain-Incremental Learning. Work presented at Ital-IA 2022.
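As a much simpler stand-in for Bayesian changepoint detection, the sketch below segments a series into two regimes with a single least-squares mean-shift changepoint, which is enough to illustrate the regime-segmentation step:

```python
def sse(xs):
    """Sum of squared deviations from the segment mean."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs)

def best_changepoint(series):
    """Split point t minimising the within-segment squared error of
    series[:t] and series[t:] (both segments must be non-empty)."""
    return min(range(1, len(series)),
               key=lambda t: sse(series[:t]) + sse(series[t:]))

# Two constant regimes with a shift at index 25.
series = [0.0] * 25 + [1.0] * 15
cp = best_changepoint(series)
```

Each segment found this way can then be treated as a separate domain when training the forecasting model incrementally.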



Confidential partner


FF4 EuroHPC Project Axyon - Leveraging HPC for AI and DL-powered Solutions for Asset Management

This is a 15-month research project under the FF4 EuroHPC framework, where Axyon leads a consortium of partners including CINECA and AImageLab. The project has the overall goal of improving the service offered by Axyon to its clients through several technological advancements. In particular, three main areas of improvement have been identified: computational scalability, risk management and adaptiveness of AI models.


MSc Thesis: Multivariate Autoregressive Denoising Diffusion Model for Value-at-Risk Evaluation

The Value-at-Risk (VaR) is a common risk measure, often required by financial regulators, typically estimated based on simple closed-form distributions. In this work, we built on our existing GAN-based model for VaR estimation, comparing it to newer deep learning approaches, namely an Autoregressive Denoising Diffusion Model based on the Timegrad architecture and a model based on Low-Rank Gaussian Copula Processes.


Contact us