March 31 - April 1, 2026 | New York Marriott, Brooklyn Bridge

Decode the Market. 
Build the Future.
Capture the Alpha.

Adaptive Systematic Macro: Fragmented Market Outlook

Speaker Q&A with Daniel DeWoskin, Managing Director of Quantitative Research at Graham Capital Management.

How is the opportunity set in systematic trading evolving in today’s market environment and where do you believe investors are currently under-allocated?

We are operating in a regime defined by policy volatility, persistent inflation uncertainty, widening divergence in central bank policy, geopolitical fragmentation, and more frequent market shocks. These forces are driving increased cross-asset dispersion across rates, currencies, commodities, and equity indices, an environment that has historically suited systematic macro strategies. In this context, diversified and adaptive cross-asset macro strategies are well positioned to identify and capture these opportunities while providing liquidity and diversification benefits that many traditional portfolios lack. We believe investors remain structurally under-allocated to truly diversified systematic macro approaches that can dynamically allocate across global markets. Strategies that integrate adaptive modelling with active, dynamic risk management offer scalable, liquid, and differentiated sources of return that remain underrepresented in conventional asset allocations.

What distinguishes a sustainable alpha signal from one that is likely to decay, particularly in increasingly competitive and data-rich markets?

In my experience, there is no easy answer to this question. I have seen surprisingly simple models succeed for years, and I have seen complex, data-intensive models perform well for a time before their effectiveness diminished as market conditions evolved. One general pattern I have observed is that models can exhibit different rates of alpha decay depending on whether they are betting on divergence or convergence. The former tends to be more resilient to decay as long as the underlying structural reasons for price divergence persist, while with the latter you are in direct competition with other investors. When evaluating any signal, there should be a clear economic rationale for why it should work and in which market environments it is expected to perform. Equally important is ongoing validation: systematically monitoring whether the signal is behaving as expected, whether its return drivers remain intact, and whether changes in market structure are eroding its edge.
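
The ongoing validation described above can be made concrete with a short sketch. The Python below is a minimal illustration with assumed inputs, names, and thresholds, not a description of Graham Capital’s actual process: it tracks a rolling correlation between a signal and subsequent returns as a simple information-coefficient proxy, and flags decay when the recent level falls well below the long-run average.

```python
import pandas as pd

def rolling_ic(signal: pd.Series, fwd_returns: pd.Series,
               window: int = 252) -> pd.Series:
    """Rolling Pearson correlation between a signal and next-period
    returns, a simple proxy for the information coefficient (IC)."""
    aligned = pd.concat({"sig": signal, "ret": fwd_returns}, axis=1).dropna()
    return aligned["sig"].rolling(window).corr(aligned["ret"])

def decay_alert(ic: pd.Series, long_win: int = 756,
                short_win: int = 126) -> bool:
    """Flag decay when the recent IC drops below half its long-run level.
    The 50% threshold is purely illustrative."""
    recent = ic.tail(short_win).mean()
    long_run = ic.tail(long_win).mean()
    return bool(long_run > 0 and recent < 0.5 * long_run)
```

In the spirit of the divergence/convergence distinction above, a convergence-style signal that trips this kind of check is often a candidate for retirement rather than re-tuning.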

How should portfolios be structured to harness volatility while maintaining convexity and protection against tail-risk events in today’s market environment?

Active diversification and risk management at the portfolio construction stage are essential. This diversification can come in many forms, including allocating across different trading styles, types of input data, trading timescales, and asset classes. Today’s market environment offers plenty of opportunities, but market shocks have also become faster. To protect against them, it’s important to keep in mind the convexity profiles of the strategies in the portfolio and their conditional performance during crisis events. This is where tactically allocating to structural protection in the form of options or volatility, rather than correlation-based protection, can be helpful, as correlation shocks can amplify losses during tail events. On a day-to-day basis, managing risk at the portfolio level requires dynamically adjusting allocations and portfolio risk targets to respond to changes in market volatility and correlation.
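
As an illustration of that day-to-day adjustment, the sketch below uses hypothetical inputs (a daily returns DataFrame and a base weight vector) to rescale exposure from a trailing covariance matrix, so that a rise in either volatility or cross-asset correlation shrinks risk toward the target.

```python
import numpy as np
import pandas as pd

def vol_target_weights(returns: pd.DataFrame, base_weights: pd.Series,
                       target_vol: float = 0.10, lookback: int = 63,
                       max_leverage: float = 1.5) -> pd.Series:
    """Scale a base allocation so trailing portfolio volatility matches
    an annualized target. Inputs: daily returns (columns = assets) and
    base weights indexed by the same asset labels."""
    cov = returns.tail(lookback).cov() * 252          # annualized covariance
    port_vol = float(np.sqrt(base_weights @ cov @ base_weights))
    scale = min(target_vol / max(port_vol, 1e-8), max_leverage)  # cap leverage
    return base_weights * scale
```

Because the scaling works off the full covariance matrix rather than per-asset volatilities alone, a pure correlation shock is enough to trigger de-risking, which is precisely the failure mode that correlation-based protection tends to miss.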

What role do market data and market risk assessments play in improving model robustness, and how do you guard against overfitting as datasets grow more complex?

Accurate and timely market data is essential for any model, since models can only be as reliable as the data they consume. On the other hand, even the best models will encounter market environments in which they struggle. This is especially true when conditions fall outside the range represented in the model’s training data. One natural approach to improving robustness is to have the model modulate the strength of its signal based on how similar current market characteristics are to regimes observed in the training data. This provides a measure of confidence in the signal and helps reduce the risk of outsized exposures in unstable market conditions.
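
One simple way to implement this kind of modulation, sketched below under assumed inputs (a matrix of training-period market features and today’s feature vector), is to measure how far current conditions sit from the training distribution, for example with a Mahalanobis distance, and map that distance to a confidence multiplier.

```python
import numpy as np

def regime_confidence(train_features: np.ndarray, current: np.ndarray) -> float:
    """Down-weight a signal when current conditions look unlike anything
    in the training data. Computes the squared Mahalanobis distance of
    today's feature vector from the training distribution and maps it to
    a (0, 1] multiplier. The decay rate is an illustrative choice."""
    mu = train_features.mean(axis=0)
    cov = np.cov(train_features, rowvar=False)
    inv = np.linalg.pinv(cov)                      # robust to near-singular cov
    d2 = float((current - mu) @ inv @ (current - mu))
    k = train_features.shape[1]                    # typical in-sample d2 is ~k
    return float(np.exp(-max(d2 - k, 0.0) / (2.0 * k)))

# Usage: scaled_signal = raw_signal * regime_confidence(X_train, x_today)
```

The multiplier stays near 1 when today looks like a regime the model has seen, and decays smoothly as conditions become more novel, throttling exposure rather than switching the model off.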

Guarding against overfitting requires consistent adherence to a sound research process. This starts with a clear hypothesis and choosing the right data and modelling methodology for the problem. When exploring an idea, I tend to start with simple models and add complexity only as necessary. Good fundamentals, such as thoughtful selection of in- and out-of-sample testing periods and properly set-up cross-validation, are a necessity. Beyond that, the introduction of new data, parameters, or features to a model should be well motivated. This is especially important when there is little training data available or only a small number of relevant historical market events. Regularization can help, but it too requires careful consideration and should not be applied to every model indiscriminately. Finally, if a model fails out-of-sample testing, you must have the discipline to go back to the drawing board and start over, resisting the urge to make small tweaks in an attempt to salvage the results.
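
For time-ordered financial data, properly setting up cross-validation typically means walk-forward splits with a gap between the training and test windows, so that overlapping labels such as multi-day forward returns do not leak across the boundary. A minimal sketch follows; the fold count and gap length are illustrative.

```python
import numpy as np

def walk_forward_splits(n_samples: int, n_folds: int = 5, gap: int = 21):
    """Expanding-window walk-forward splits for time-ordered data.
    Skipping 'gap' observations between train and test reduces leakage
    from overlapping labels (e.g., 21-day forward returns)."""
    fold = n_samples // (n_folds + 1)
    for i in range(1, n_folds + 1):
        train_end = i * fold
        test_start = train_end + gap
        test_end = min(test_start + fold, n_samples)
        if test_start >= test_end:
            break
        yield np.arange(train_end), np.arange(test_start, test_end)

# Usage: for train_idx, test_idx in walk_forward_splits(len(data)): ...
```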

How is the integration of data science with fundamental research reshaping quantitative investing, and what does this mean for the future skill sets required in investment teams?

Data has always been at the core of quantitative investing. The difference now is that there has been an explosion in the quantity and variety of available data sources. Unstructured data such as text and images have become common in finance because they contain a wealth of information, but they pose obvious challenges and require more complex techniques for processing and analysis. As a result, experience in natural language processing or image processing can be very valuable, and the professionals best suited to this work often come from outside traditional finance. We have already found great success hiring machine learning researchers from other scientific fields who can apply their experience with complex real-world data to analyzing new data sets and building systematic strategies. I expect this trend to continue, especially given the power of custom language models for ingesting and processing text.

Join Daniel on Day 1 of Future Alpha 2026 at 11:40 AM for an insightful panel discussion: Navigating the Quant Landscape: Alpha, Risk, and Convexity in a Volatile World.

Secure your pass today!