January Update

The early January 2025 market dip was primarily driven by trade policy announcements and volatility in the AI sector. The subsequent recovery was fueled by the delay of tariffs, positive economic indicators, and renewed investor optimism.

Trade Policy Announcements

In early January, President Donald Trump announced steep tariffs on imports from key trading partners, including Mexico, Canada, and China. The initial reaction was a heavy sell-off in global equities, with the S&P 500 dropping nearly 2%. However, markets rebounded after Trump agreed to delay the tariffs on Mexico and Canada by one month following discussions with their leaders, a move that alleviated immediate trade tensions and restored investor confidence. The reprieve was short-lived, though: Trump went on to formally impose tariffs in early February.

Positive Economic Indicators

Despite the early volatility, the economic fundamentals remained strong. U.S. economic data continued to impress, with GDP growth showing resilience and unemployment remaining at historic lows. Consumer confidence remained robust, suggesting that spending and demand were still healthy. Additionally, corporate earnings forecasts for the first quarter of 2025 indicated that many sectors would post solid results, even amid the early uncertainty. Analysts revised down the likelihood of an impending recession, citing the country’s strong financial position and low inflation, which allowed for greater monetary flexibility from the Federal Reserve.

Investor sentiment turned more positive as data suggested that the economy could continue to weather both trade tensions and AI-related volatility. This optimism drove stocks higher as market participants realized that economic growth and corporate performance would likely remain on track despite the challenges.

Analysts like Tom Lee of Fundstrat advised investors to "buy the dip" following the tariff announcement, citing factors such as the flexibility of the new tariff strategy and the tendency of markets to overreact to events like the AI-related sell-off.

DeepSeek: The “Sputnik Moment” of AI

The release of DeepSeek’s AI model in early 2025 has been compared to the “Sputnik moment” of the Cold War, when the Soviet Union’s successful launch of the Sputnik satellite in 1957 spurred the U.S. to accelerate its own space exploration efforts.

Similarly, DeepSeek's unexpected leap in AI capabilities served as a wake-up call, pushing the U.S. to fast-track its investments in artificial intelligence to maintain its technological leadership and outpace emerging global competitors, just as it did in the race to the moon.

DeepSeek’s Founder: A Quant Trader Turned AI Pioneer

DeepSeek was founded by Liang Wenfeng, who studied electrical engineering before teaming up with former classmates to launch a quant trading firm in his 30s. Using AI/ML-driven strategies, the firm reportedly grew to manage around ¥100 billion (~$14B USD). Along the way, Liang purchased thousands of NVIDIA GPUs, which he later repurposed to build DeepSeek as a side project before scaling it into a full-fledged AI company.

DeepSeek’s Disruptive Claims

In January 2025, Chinese startup DeepSeek unveiled a revolutionary AI reasoning model that it claimed could drastically reduce the cost of running large language models (LLMs) while delivering performance on par with leading models like OpenAI’s GPT-4 Turbo. The company positioned its technology as a major disruptor, promising efficiency at a fraction of the cost of traditional AI systems.

However, skepticism quickly followed. Critics questioned how DeepSeek could achieve such cost reductions without sacrificing performance and speculated whether it had leveraged proprietary data—potentially from OpenAI’s ChatGPT—to train its model.

DeepSeek’s Compute Claims vs. Industry Benchmarks

DeepSeek claimed to have trained its AI model with just 3,000 NVIDIA A100 GPUs, achieving GPT-4 Turbo-level performance at 10% of the cost. If accurate, this would represent a huge efficiency leap, but experts have questioned the feasibility of such claims.

For context, here’s how DeepSeek’s compute resources compare to industry leaders:

Breaking Down the Costs

  • Cloud rental cost for an A100 GPU: ~$1.50/hour.

  • Training duration assumption: ~3 months (~2,160 hours).

  • At those rates (~$3,240 per GPU for the full run): 10,000 GPUs → ~$32M | 20,000 GPUs → ~$65M | 3,000 GPUs (DeepSeek) → ~$10M (see the quick sketch after this list).

  • On-premise purchase: ~$10,000–$15,000 per A100 GPU, meaning 3,000 GPUs would cost ~$30M–$45M upfront (excluding infrastructure and energy costs).
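
To make this arithmetic explicit, here's a minimal Python sketch of the cloud-rental math. The hourly rate, duration, and GPU counts are the rough, illustrative figures from the breakdown above, not confirmed pricing or disclosed training details.

```python
# Rough GPU training-cost arithmetic using the article's assumed figures.
# All inputs are illustrative estimates, not confirmed vendor pricing.

HOURLY_RATE_USD = 1.50   # assumed A100 cloud rental rate per GPU-hour
TRAINING_HOURS = 2_160   # ~3 months of continuous training

def training_cost(num_gpus: int) -> float:
    """Estimated cloud-rental cost of one training run for a given GPU count."""
    return num_gpus * HOURLY_RATE_USD * TRAINING_HOURS

for label, gpus in [("Typical frontier run (low end)", 10_000),
                    ("Typical frontier run (high end)", 20_000),
                    ("DeepSeek (claimed)", 3_000)]:
    print(f"{label}: {gpus:,} GPUs -> ~${training_cost(gpus) / 1e6:.1f}M")

# Approximate output:
#   Typical frontier run (low end): 10,000 GPUs -> ~$32.4M
#   Typical frontier run (high end): 20,000 GPUs -> ~$64.8M
#   DeepSeek (claimed): 3,000 GPUs -> ~$9.7M
```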

If DeepSeek truly achieved GPT-4-level performance with only 3,000 GPUs, it would suggest a 3x–8x efficiency improvement. This raises two major possibilities:

  1. They achieved a significant breakthrough in AI training efficiency.

  2. They used pre-trained proprietary data (e.g., OpenAI’s), sidestepping costly reinforcement learning and synthetic data generation processes.

Imagine you're buying a car. You want one that is both fast (high performance) and fuel-efficient (low cost). DeepSeek-V3 is pitched as exactly that kind of car: fast and fuel-efficient at the same time, which is what makes the claims so attractive - if they hold up.

The Nvidia Shock: DeepSeek’s Impact on AI Compute Demand

DeepSeek’s efficiency claims had major implications for Nvidia and the broader AI ecosystem. If their model’s cost reductions were real, it could significantly reduce demand for high-performance GPUs like Nvidia’s, sparking fears of declining AI hardware sales.

This uncertainty led to Nvidia's stock shedding nearly $600 billion in market value. Energy companies that had been benefiting from rising AI data center demand also came under pressure, as investors questioned whether future energy consumption would grow as quickly as expected.

DeepSeek’s Potential Data Controversy

Recent reports have raised concerns about DeepSeek potentially accessing OpenAI’s proprietary data without authorization. Microsoft and OpenAI are currently investigating whether individuals linked to DeepSeek extracted vast amounts of training data via OpenAI’s API. If proven, it could undermine DeepSeek’s efficiency claims and trigger legal consequences.

As with TikTok, DeepSeek's ties to China have raised concerns about potential user data tracking and surveillance. This is why Texas became the first U.S. state to ban DeepSeek on government-issued devices over security fears, citing potential data exposure to the Chinese Communist Party (CCP).

Why AI Compute Demand Won’t Decline Anytime Soon

Despite DeepSeek’s efficiency claims, AI’s long-term compute demand is still accelerating. The reason? While training models is expensive, the real surge in demand comes from inference (running AI models at scale).

This is a real-world example of Jevons Paradox: as AI becomes more efficient and widely available, its use explodes, leading to an unprecedented need for computing resources, data storage, and power.
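
As a purely illustrative sketch of that dynamic (the numbers below are hypothetical, not forecasts): if efficiency gains cut the compute needed per query by 10x but cheaper, more capable AI drives usage up 30x, total compute demand still triples.

```python
# Hypothetical Jevons Paradox illustration: per-query efficiency improves,
# yet total compute demand still rises. All numbers are made up for illustration.

baseline_queries = 1_000_000   # AI queries served today (hypothetical)
compute_per_query = 1.0        # arbitrary compute units per query today

efficiency_gain = 10           # model becomes 10x cheaper to run per query
usage_growth = 30              # cheaper AI drives 30x more usage

old_total = baseline_queries * compute_per_query
new_total = (baseline_queries * usage_growth) * (compute_per_query / efficiency_gain)

print(f"Old total compute: {old_total:,.0f} units")
print(f"New total compute: {new_total:,.0f} units")  # 3x the old total
```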

Q4 Earnings Kick Off

Digital Payments

Both Visa and Mastercard reported strong fundamentals in their Q1 2025 earnings calls, supported by a healthy consumer spending environment and ongoing shift to digital payments. Mastercard expects high single-digit to low teens revenue growth for 2025, driven by global opportunities in value-added services, while Visa highlighted its continued strong performance in the payments sector. Both companies are well-positioned for growth as digital payment adoption accelerates globally, with a focus on expanding their service offerings and capitalizing on emerging market opportunities.

Advanced Computing

ASML and Microsoft are both key players in the advanced compute thematic, but they serve different yet complementary roles in the growing AI and cloud ecosystems.

ASML, as the leading provider of cutting-edge EUV lithography equipment, plays a critical role in enabling the production of advanced semiconductors. These semiconductors are essential for powering AI, cloud computing, and machine learning technologies. ASML’s strong position in the semiconductor supply chain, particularly its monopoly in EUV technology, allows it to capitalize on the increasing demand for more powerful chips. As AI applications and the need for advanced computing power grow, ASML’s technology remains indispensable in manufacturing the next generation of chips that fuel these innovations.

On the other hand, Microsoft operates as the infrastructure provider behind AI through its expansive Azure cloud platform. Microsoft’s AI-driven products, like those powered by its partnership with OpenAI, have shown significant growth, with AI revenue surging by 175% year-over-year. However, Microsoft faces some short-term hurdles, particularly infrastructure constraints that limit its ability to meet the rapidly growing demand for AI services. With its massive capital investments in data center capacity, Microsoft is addressing these challenges and expects to have sufficient infrastructure to support demand by late 2025. The company’s role as a cloud and AI infrastructure provider, however, continues to be essential as enterprises and developers look to leverage the compute power necessary for AI applications.

While ASML is the enabler of the hardware required for advanced AI processing, Microsoft provides the cloud infrastructure that allows AI workloads to scale. Together, they represent two sides of the same coin, with ASML driving the development of the tools (semiconductors) and Microsoft powering the infrastructure behind AI advancements.

WealthTech

LPL Financial (LPLA): LPL Financial expects a 6-8% growth in core G&A for 2025, with a focus on driving efficiencies and expanding high-net-worth services. The company plans to continue its strong organic growth, supported by ongoing investments and efficient onboarding for large institutions.

Both Netwealth (NWL) and HUB24 (HUB) showed strong growth in FY24, with NWL reporting funds under administration (FUA) of $60.3 billion, up 9% YoY, and HUB24 reaching $60.9 billion in FUA, up 29%. Both saw significant net inflows ($3.1 billion for Netwealth and $3.3 billion for HUB24), driven by increased demand for their platform offerings, particularly as advisors shift business away from traditional institutions. Netwealth holds a 7.7% market share, while HUB24 saw a 6.2% increase in account numbers. Both companies benefit from the secular tailwind of banks exiting the wealth management space. Moving forward, both are well-positioned to sustain growth through platform enhancements and expanding advisor relationships, although HUB24 expects some moderation in inflows.

Next-Gen Industry

Tesla's recent quarterly results showed mixed performance. While earnings for the current quarter are expected to rise by 51.1%, vehicle deliveries declined year-over-year in 2024, the first annual drop in Tesla's history - 1.78 million units delivered, about 19,000 fewer than the previous year. Despite this, the company plans to launch more affordable models in 2025, aiming to increase annual production by 60% to three million cars. While Tesla's core business remains vehicle production, the company's vision extends far beyond being just a car OEM. One concern we do see relates to Elon's involvement in politics, which could trickle down to Tesla's brand and demand - perhaps we are already seeing this.

Humanoid Robots (Optimus)

Tesla's humanoid robot, Optimus, is another exciting segment. Although the training requirements for Optimus are vastly higher than for the cars (roughly 10x), Tesla is developing it gradually to avoid huge upfront costs. According to Elon, Optimus could eventually generate over $10 trillion in revenue, though its rollout will take time - and we all know Elon likes to make bold predictions about these kinds of things. His long-term vision is that Optimus becomes the most valuable part of Tesla, with the robot eventually able to perform tasks like playing the piano and threading a needle. The internal plan calls for roughly 10,000 Optimus robots to be built this year; Tesla probably won't hit that, but several thousand seems plausible. From there, the aspiration is to ramp production by an order of magnitude each year, so growth in deliveries would become exponential and the use cases would keep expanding - much like the early iPhone era, when every new iPhone and iOS release brought a meaningful new feature.
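
To put the "order of magnitude each year" aspiration in concrete terms, here is a short, purely hypothetical compounding sketch. The starting count and growth factor come from the aspirational commentary above; this illustrates what exponential growth would look like, and is not a production forecast.

```python
# Hypothetical illustration of an order-of-magnitude annual production ramp.
# Starting point and growth factor are aspirational figures from the earnings
# call commentary, not a production forecast.

units = 10_000          # aspirational internal target for year one
growth_factor = 10      # "order of magnitude" ramp each year

for year in range(1, 5):
    print(f"Year {year}: ~{units:,} Optimus units")
    units *= growth_factor

# Year 1: ~10,000 | Year 2: ~100,000 | Year 3: ~1,000,000 | Year 4: ~10,000,000
```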

Full Self-Driving (FSD)

Tesla is seeing rapid progress with Full Self-Driving (FSD), which promises to unlock exponential utility. Currently, passenger cars are in use for only about 10 hours a week, but autonomous vehicles could operate for 50-55 hours weekly - spanning both cargo and passenger delivery and increasing utility roughly fivefold. Realistically, though, not every owner will want their Tesla used for ride-hailing or courier services. FSD is already working well in the U.S., and Tesla plans to launch unsupervised FSD as a paid service in Austin by June, further boosting the potential for autonomous vehicle adoption. Elon also discussed the challenges Tesla faces in China, particularly road rules that restrict certain car lanes to buses at specific times of day. These rules can result in instant fines for drivers who accidentally violate them, adding complexity to Tesla's operations in the region. To address this, Tesla is enhancing its FSD system to better recognize and navigate these local traffic rules, including developing advanced mapping and data processing capabilities to ensure compliance with local regulations.

Energy Storage

Tesla is also focusing on energy storage, both for grid and home-based solutions. While battery pack constraints remain a challenge this year, Tesla is addressing them with the construction of a second factory in Shanghai and plans for a third. Energy storage for the grid is a game-changer, enabling much greater effective energy output. The vast majority of the grid has no energy storage capability, so power plants have to be designed to cope with very high demand peaks on the assumption that no storage is available. This shift could drive demand for stationary battery packs, particularly at grid scale, significantly improving the efficiency and sustainability of the energy grid.
