QuantAgent:
Price-Driven Multi-Agent LLMs for High-Frequency Trading

Abstract
Recent advances in Large Language Models (LLMs) have demonstrated impressive capabilities in financial reasoning and market understanding. However, existing LLM frameworks for High-Frequency Trading (HFT) face significant limitations in terms of latency, interpretability, and the ability to handle real-time market dynamics. We introduce QuantAgent, the first multi-agent LLM framework specifically designed for HFT, which decomposes trading into four specialized agents: Indicator, Pattern, Trend, and Risk.
Leveraging structured financial priors and language-native reasoning, QuantAgent delivers superior predictive accuracy and cumulative return over 4-hour trading intervals in zero-shot evaluations across ten financial instruments, including Bitcoin and Nasdaq futures. These results suggest that combining structured financial priors with language-native reasoning unlocks new potential for traceable, real-time decision systems in high-frequency financial markets.
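
To make the decomposition concrete, the following Python sketch shows one way the four agents could be wired together. The class and function names, prompts, and aggregation scheme are illustrative assumptions rather than the paper's implementation, and the LLM call is stubbed out so the example runs on its own:

# Illustrative wiring of the four-agent decomposition (Indicator, Pattern,
# Trend, Risk). Names and the aggregation scheme are assumptions for
# exposition; the LLM call is a stub.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class AgentView:
    agent: str       # which agent produced the view
    signal: int      # +1 long, -1 short, 0 neutral
    rationale: str   # language-native explanation kept for traceability


def llm_analyze(role: str, prompt: str) -> AgentView:
    # Stub: replace with a real LLM client call. Returning a neutral view
    # keeps the example self-contained and runnable.
    return AgentView(agent=role, signal=0, rationale="stubbed response")


def quant_agent_step(ohlc_window: List[Dict[str, float]]) -> AgentView:
    """One decision step over a window of recent OHLC bars."""
    bars = "\n".join(str(bar) for bar in ohlc_window)
    views = [
        llm_analyze("Indicator", f"Analyze RSI/MACD/Bollinger for:\n{bars}"),
        llm_analyze("Pattern", f"Identify candlestick/chart patterns in:\n{bars}"),
        llm_analyze("Trend", f"Assess the prevailing trend in:\n{bars}"),
    ]
    summary = "\n".join(f"{v.agent}: {v.signal} ({v.rationale})" for v in views)
    # The Risk agent sees the other views and issues the final decision.
    return llm_analyze("Risk", f"Combine these views into one trade decision:\n{summary}")


decision = quant_agent_step([{"open": 100.0, "high": 101.0, "low": 99.5, "close": 100.5}])
print(decision)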
Multi-Agent Interaction
Indicator Agent: Performs technical analysis over recent price action using indicators such as RSI, MACD, and Bollinger Bands, and condenses them into a trading signal (a sketch of these computations follows).
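
As a reference for the quantities this agent reasons over, here is a minimal pandas sketch of the three indicators. The window lengths (14, 12/26/9, and 20) are the conventional defaults and are assumed, not quoted from the paper:

# Common-default indicator computations (periods are assumptions).
import pandas as pd


def rsi(close: pd.Series, period: int = 14) -> pd.Series:
    delta = close.diff()
    gain = delta.clip(lower=0).rolling(period).mean()
    loss = (-delta.clip(upper=0)).rolling(period).mean()
    return 100 - 100 / (1 + gain / loss)


def macd(close: pd.Series, fast: int = 12, slow: int = 26, signal: int = 9):
    line = close.ewm(span=fast, adjust=False).mean() - close.ewm(span=slow, adjust=False).mean()
    return line, line.ewm(span=signal, adjust=False).mean()


def bollinger(close: pd.Series, period: int = 20, k: float = 2.0):
    mid = close.rolling(period).mean()
    std = close.rolling(period).std()
    return mid - k * std, mid, mid + k * std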

Evaluation Metrics

Overview of benchmark assets, including market type, data collection window, and a fixed sample of 5,000 four-hour OHLC bars per asset for consistent comparison.
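
For readers reproducing the data setup, a minimal sketch of building fixed-length four-hour OHLC samples with pandas follows. The column names and the resampling rule are assumptions; only the 4-hour bar size and the 5,000-bar cut come from the description above:

# Resampling finer-grained bars into fixed-length four-hour OHLC samples.
import pandas as pd


def to_4h_sample(df: pd.DataFrame, n_bars: int = 5000) -> pd.DataFrame:
    """df: DatetimeIndex with open/high/low/close columns at a finer frequency."""
    bars = df.resample("4h").agg(
        {"open": "first", "high": "max", "low": "min", "close": "last"}
    )
    return bars.dropna().tail(n_bars)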
Main Results

Performance comparison across assets between a random baseline and our method (highlighted in green). Bolded values and upward arrows indicate improvements on metrics where higher is better.
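
As a reference for the reported metrics, the sketch below computes directional accuracy and cumulative return for a signal series and evaluates a random baseline on synthetic data. These are the standard formulations; the paper's exact definitions (e.g. treatment of transaction costs) may differ:

# Standard formulations of the two headline metrics, evaluated on a
# random baseline over synthetic returns.
import numpy as np


def directional_accuracy(signals: np.ndarray, returns: np.ndarray) -> float:
    """Fraction of bars where the +1/-1 signal matches the sign of the bar's return."""
    return float(np.mean(np.sign(signals) == np.sign(returns)))


def cumulative_return(signals: np.ndarray, returns: np.ndarray) -> float:
    """Compounded return from holding the signaled position each bar."""
    return float(np.prod(1 + signals * returns) - 1)


rng = np.random.default_rng(0)
rets = rng.normal(0.0, 0.01, size=5000)          # synthetic 4-hour returns
random_signals = rng.choice([-1, 1], size=5000)  # random baseline
print(directional_accuracy(random_signals, rets), cumulative_return(random_signals, rets))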
Detailed Analysis
Rolling Performance: Sample-based directional accuracy on SPX. Of the 10 LLM-generated signals in the highlighted period, the 8 correct predictions are shown in green/red and the 2 errors in grey, yielding 80% directional accuracy.
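
The rolling view above can be reproduced as a trailing-window hit rate. In the sketch below, the 10-signal window is taken from the figure; the +1/-1 signal convention is an assumption:

# Trailing-window hit rate matching the rolling accuracy view.
import numpy as np
import pandas as pd


def rolling_accuracy(signals, returns, window: int = 10) -> pd.Series:
    hits = pd.Series(np.sign(signals) == np.sign(returns), dtype=float)
    return hits.rolling(window).mean()  # e.g. 8 of 10 correct -> 0.80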

BibTeX
@article{xiong2025quantagent,
  title={QuantAgent: Price-Driven Multi-Agent LLMs for High-Frequency Trading},
  author={Xiong, Fei and Zhang, Xiang and Sun, Siqi and You, Chenyu},
  journal={arXiv preprint arXiv:2XXX.XXXXX},
  year={2025}
}