Interview: Marko Djukic, Founder & CEO, Hentsū

What are you most looking forward to at the AI & Data Science in Trading conference?

Picking up on industry trends, networking, and having the opportunity to further validate our product development approach around data pipelines and turnkey data solutions, which use data from many of the providers represented at this event.

What do you think are the biggest challenges facing data scientists/AI experts/quantitative investors in 2018/2019? Why are they important?

Understanding the data, finding relevance in it, and making use of it in a timely fashion. This is especially true given the alpha decay of alternative data, which is significantly underestimated. Part of our mission is to enable funds to use data, including alternative data, as quickly and efficiently as possible.

Now that data is here to stay, there is a lot of alternative data out there, and exploratory analysis on it is essential. But before you can do that, the data needs to be cleaned and appropriately loaded, and this has never been more critical. At the heart of any good data workflow is ETL, which lets your quant or data scientist make well-informed decisions on useful data, especially now that finding alpha is all about good data analysis.
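Since ETL sits at the heart of this workflow, here is a minimal sketch of the three stages in Python. The feed format, field names and cleaning rules are illustrative assumptions, not Hentsū's actual pipeline; a real system would pull from vendor feeds and load into a warehouse rather than an in-memory dict.

```python
import csv
import io

# Hypothetical raw vendor feed: inconsistent casing, blank rows, unparseable prices.
RAW_FEED = """ticker,price,volume
aapl,189.95,1200
,,
msft,na,800
GOOG,141.20,950
"""

def extract(raw):
    """Read the raw CSV feed into dicts (the E in ETL)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Clean the rows: drop blanks and bad prices, normalise tickers (the T)."""
    clean = []
    for row in rows:
        if not row["ticker"]:
            continue  # drop blank rows
        try:
            price = float(row["price"])
        except ValueError:
            continue  # drop rows with unparseable prices, e.g. "na"
        clean.append({"ticker": row["ticker"].upper(),
                      "price": price,
                      "volume": int(row["volume"])})
    return clean

def load(rows, store):
    """Load cleaned rows into a keyed store (the L)."""
    for row in rows:
        store[row["ticker"]] = row
    return store

store = load(transform(extract(RAW_FEED)), {})
print(sorted(store))  # ['AAPL', 'GOOG']
```

The point of keeping the stages separate is that a quant only ever sees the output of `load`: the messy vendor-specific cleaning lives entirely inside `transform` and can be swapped per data source.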

What is going to be the biggest area of investment for your organisation/data/machine learning over the next 12 months?

We aim to simplify and reduce barriers to entry for data in fundamental and quantitative hedge funds. Our trading-data-as-a-service offering delivers complex data management as a defined outcome, and this is where we are investing the bulk of our product development.

Can you share an example of how your system has been used by a new customer? Feel free to include any feedback or practical examples

We have taken data from traditional and alternative sources and put it through data pipelines built on the public cloud, simplifying delivery and reducing barriers to entry for fundamental and quantitative hedge funds. We have worked with datasets of over 100TB per month.

A specific example involved migrating an ETL solution from AWS to Azure. This was mainly driven by the cost savings associated with Azure SQL Data Warehouse and its decoupled compute and storage configuration options. The solution allowed the client to easily add additional data sources and, most importantly, to focus on data science.

More generally, we’re helping clients understand when and how to move to big data storage and analysis solutions, and how to leverage the power of the public cloud for clustering and parallel processing.
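As a rough illustration of the clustering and parallel-processing pattern: independent data shards can be mapped over concurrently, locally with a process pool, or fanned out across a cloud cluster with the same per-shard logic. The tickers, return series and helper names below are hypothetical.

```python
from concurrent.futures import ProcessPoolExecutor

# Hypothetical daily return series, partitioned by instrument, standing in
# for data shards that a cloud cluster would process in parallel.
PARTITIONS = {
    "AAPL": [0.01, -0.02, 0.015, 0.003],
    "MSFT": [0.005, 0.007, -0.001, 0.012],
    "GOOG": [-0.004, 0.009, 0.002, -0.006],
}

def mean_return(item):
    """Per-partition computation; each shard is independent of the others."""
    ticker, returns = item
    return ticker, sum(returns) / len(returns)

def run(partitions):
    # The same map-over-shards shape scales from local processes to a
    # distributed framework (e.g. Spark or Dask) without changing mean_return.
    with ProcessPoolExecutor() as pool:
        return dict(pool.map(mean_return, partitions.items()))

if __name__ == "__main__":
    print(run(PARTITIONS))
```

Because the per-shard function has no shared state, moving it from a laptop to cloud compute is mostly a question of where the shards live, which is exactly the decision the text describes helping clients make.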

Top tips: How can a quant strategist best engage and support a fundamental business to work together as a successful team?

End-to-end delivery of the solution: understanding the business models, the data needed to enable the business, and the delivery of the solution to the end user. A good quant strategist will understand how the current business models work and will support the team by intimately understanding how to make those models more profitable, looking at risk metrics and alpha analysis, and by introducing more data-driven strategies that complement the current offering.

Cloud computing has been widely adopted in most sectors except financial services.  Is this now changing, and if so how will funds decide how and where to include external providers?

There has been a notable shift to cloud adoption since 2016, not to the legacy private cloud providers but to public cloud on the AWS, Azure and Google platforms.

If you look 3 years into the future, what do you consider will be the biggest changes to the way hedge funds will use technology to generate alpha?

Big data has started to make an impact in the hedge fund space, and this will only grow. Data for trading is becoming ever more prevalent and ever increasing in breadth and volume. Fundamental and discretionary traders are searching for additional signals and insights to augment the humans. There is seemingly easier access to sources of alpha, but it is also getting more difficult to discern the good from the bad. People will become more creative in how they aggregate their data to generate alpha in ways we could never have imagined, especially as ever-growing and more powerful deep learning models find alternative signals for funds to capitalize on.

The methods hedge funds use to ingest new data sets are long, complex and expensive.  Is this sustainable, or will outsourcing play a role?

Hentsū has recognised that this is not sustainable. Many of the funds we have spoken to experience some level of pain with data ingestion and management; it’s a headache they have been forced to solve because of their need for data. This is the premise of our Trading Data as a Service, and we know it is a problem we can solve for them, allowing them to focus on generating alpha.


Is there a growing appetite for end-to-end automated investment systems?

Absolutely! The more funds we talk to, the clearer the message: they want easy access to clean, actionable data so that their researchers can focus on alpha generation. On the other side, the quants and data scientists do not want to deal with execution, and are looking for easier access to execution and trading to take advantage of the alpha. What takes time is getting the data, managing it and deploying it to the systems, all of which can easily cause alpha decay.

