I've recently been diving deep into the underlying logic of transformers.
My core takeaway is that what AI treats as "correct" is really the output of an extremely powerful function approximator, a kind of ultimate statistical modeling.
Based on this, I genuinely believe that people using the current capabilities of LLMs to build quant strategies, and even running them in live trading, are doing nothing different from gambling. It's completely the wrong approach.
That's because the one task LLMs are truly proficient at is predicting the next word. They are autoregressive models that can only output what is statistically "correct."
It's like asking them "Will Bitcoin go up?"
The LLM will only generate responses based on the text distribution in its training data, spelling out the most common response patterns in human history one word at a time
And the entire response hinges on which first token comes out.
For example, candidate responses might be:
Will
Won't
The market is uncertain
First rise then fall
…
The LLM then generates every subsequent token conditioned on the first one it produced, ultimately assembling a lengthy report that looks very professional but that the model itself has no way of knowing is right or wrong. The whole thing depends on whatever matching context turns up in search engines.
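The first-token branching described above can be sketched with a toy autoregressive decoder. Everything here is hypothetical, a bigram table with made-up probabilities, not a real LLM, but it shows how each token is chosen purely from statistics and how the first pick locks in the rest of the answer:

```python
# Toy illustration (NOT a real LLM): a tiny bigram "model" with made-up
# next-token probabilities. The first token chosen determines which branch
# the whole continuation follows.
next_token = {
    "<q>":       {"Will": 0.4, "Won't": 0.3, "Uncertain": 0.3},
    "Will":      {"rise": 1.0},
    "Won't":     {"rise": 1.0},
    "Uncertain": {"market": 1.0},
    "rise":      {"<end>": 1.0},
    "market":    {"<end>": 1.0},
}

def greedy_decode(start: str) -> list[str]:
    """Always pick the most probable next token; each pick conditions the next."""
    out, tok = [], start
    while tok != "<end>":
        tok = max(next_token[tok], key=next_token[tok].get)
        if tok != "<end>":
            out.append(tok)
    return out

# The answer is fixed by the token statistics, not by the market:
print(greedy_decode("<q>"))  # → ['Will', 'rise']
```

Swap the 0.4 and 0.3 in the first row and the "analysis" flips from bullish to bearish, with no market data involved at any point.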
In reality, quantitative models require order book flow data, various mathematical modeling, multi-factor analysis, and so on
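To make the contrast concrete, here is a minimal hypothetical sketch of what a quant-style signal actually looks like: numeric factors computed from market data and combined into a score. The weights, window, and order-book numbers are all made up for illustration:

```python
import statistics

def zscore(prices: list[float]) -> float:
    """Momentum factor: z-score of the latest price against the window."""
    mu = statistics.fmean(prices)
    sd = statistics.pstdev(prices) or 1.0
    return (prices[-1] - mu) / sd

def book_imbalance(bid_size: float, ask_size: float) -> float:
    """Order-book flow factor: +1 means all bids, -1 means all asks."""
    return (bid_size - ask_size) / (bid_size + ask_size)

def combined_signal(prices: list[float], bid: float, ask: float) -> float:
    # Multi-factor combination with hypothetical weights.
    return 0.6 * zscore(prices) + 0.4 * book_imbalance(bid, ask)

sig = combined_signal([100, 101, 103, 102, 105], bid=800.0, ask=500.0)
print(round(sig, 3))  # a number derived from data, not a paragraph of prose
```

The point isn't that this toy signal is any good; it's that the output is a reproducible number computed from order flow and prices, not the most statistically common sentence.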
Quantitative models and large language models are completely different things; a typical quant system isn't built on an LLM-style transformer at all.
Using off-the-shelf LLM tools for fully automated trading is pure gambling: you're betting on what context the built-in search can piece together and on which first token the model happens to generate. That goes for prediction markets, contract trading, and other markets like US stocks alike.
Retail traders should stop believing stories about fully automated AI crypto trading. Lately I've seen far too many cases of people hooking a skill up to openclaw and letting it trade fully automatically.
It’s not that AI can’t achieve fully automated crypto trading
Take the AI auto-trading system Aster built earlier: under the hood it doesn't rely on the LLM's own capabilities. They just use the LLM to call quant models, essentially wrapping a shell around them. The LLM's only role is to make decisions based on real data. But even that isn't very reliable.
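That "shell" architecture can be sketched roughly like this. All names here are hypothetical stand-ins: the LLM layer is stubbed as a plain routing function, and the quant model is a trivial momentum rule, just to show where the actual decision signal comes from:

```python
# Hypothetical sketch of the "LLM as a shell over quant models" setup:
# the numeric signal comes from the quant model; the LLM layer (stubbed
# here as a plain function) only routes the call and phrases the decision.

def quant_model_signal(prices: list[float]) -> float:
    """Stand-in quant model: simple momentum over the price window."""
    return (prices[-1] - prices[0]) / prices[0]

def llm_shell_decide(prices: list[float]) -> str:
    # In a real system this layer would be an LLM tool call; the decision
    # itself is still driven by the numeric signal, not token statistics.
    signal = quant_model_signal(prices)
    if signal > 0.01:
        return "long"
    if signal < -0.01:
        return "short"
    return "flat"

print(llm_shell_decide([100, 101, 103, 102, 105]))  # → long
```

Note that if you delete the LLM layer entirely, the trading logic is unchanged, which is exactly the author's point: the model language skills add a shell, not an edge.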