georgemiller
Publish Date: Tue, 28 Jan 2025, 08:06 AM

Key takeaways
- DeepSeek, a Chinese AI lab, caught global markets and US stocks off-guard after it released its R1 large language model, explaining in a detailed paper how to build an LLM (Large Language Model) on a tight budget. The market reaction reflects both the model's supposed outperformance versus other AI models and the claims of much lower development costs.
- We believe it is too early to know how much this will impact the AI capex trend. While chip stocks could remain volatile for now and are the most exposed, the development of much better performing models could help other parts of the AI ecosystem, in particular the users, as it may raise productivity in areas we like such as next-generation medicines, automation and cybersecurity.
- We continue to broaden our equity exposure to names outside of the Mag-7 as we think earnings growth is broadening. Mag-7 stocks have already been losing leadership as investors position more in cyclical sectors for the new US administration, and we think this will continue. We maintain our overweight on global and US equities and continue to adopt an active and multi-asset approach to exploit volatility.
What happened?
The very rapid rise of DeepSeek’s AI assistant to the top of Apple’s app download chart has led to a sharp fall in AI-related stocks.
Founded in 2023, the company claims it used just 2,048 Nvidia H800s and USD5.6m to train a model with 671bn parameters, a fraction of what OpenAI and other firms have spent to train models of comparable size, according to the Financial Times. This has triggered a debate about whether US tech companies can defend their technical edge and whether the recent capex spend on AI initiatives is truly warranted when more efficient outcomes are possible.
However, it seems that the very low cost has been achieved through “distillation” or because the model is a derivative of existing LLMs, with a focus on improving efficiency. The quoted amount also seems to reflect only the training itself, so total development costs are likely understated. It remains an open question how far DeepSeek could directly threaten US LLMs, given potential regulatory measures and constraints, and the need for a track record on its reliability.
That said, the debate about what all of this means will probably create pricing pressure for the industry. It could also lead to reduced capex, particularly as many investors already had nagging doubts about the return on these investments, which contributed to the pronounced market reaction. It will therefore be very important to watch announcements on this point during the earnings season, which may lead to more short-term two-way volatility.
More efficiency and lower prices will certainly be good for users. They could also accelerate usage and help create new use cases, which in turn should support demand for chips in the medium-to-long term.
Investment implications
We think volatility may persist in the short term, for chip and AI-model-related stocks in particular. We also think many analysts will wait for more clarity from discussions during the earnings season before changing any forecasts or target prices.
We reiterate our view that it is important to continue diversifying exposure, as we have been doing. For the AI theme, this means diversifying into the adopters and beneficiaries of AI (both LLM and broader AI), including areas such as automation, healthcare innovation, cybersecurity and services firms, which will benefit from increased efficiencies.
It also means diversification beyond the technology sector, as earnings growth is picking up outside of the Mag-7, while it is likely to slow (albeit from high levels) for the Mag-7.
https://www.hsbc.com.my/wealth/insights/market-outlook/special-coverage/deepseek-news-and-its-impact-on-the-ai-ecosystem/