- Phison's SSD technique slashes AI training costs from $3 million to $100,000
- aiDAPTIV+ software shifts AI workloads from GPUs to SSDs efficiently
- SSDs could replace expensive GPUs in large AI model training
The development of AI models has become increasingly expensive as their size and complexity grow, demanding massive computational resources, with GPUs playing a central role in handling the workload.
Phison, a key player in portable SSDs, has unveiled a new solution that aims to drastically reduce the cost of training a 1-trillion-parameter model by shifting some of the processing load from GPUs to SSDs, bringing the estimated $3 million operational expense down to just $100,000.
Phison's approach involves integrating its aiDAPTIV+ software with high-performance SSDs to handle some of the AI processing tasks traditionally managed by GPUs, while also incorporating NVIDIA's GH200 Superchip to boost performance and keep costs manageable.
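Phison has not published the internals of aiDAPTIV+, so the sketch below is only a conceptual illustration of the general idea behind SSD offload: the full set of model weights lives in a memory-mapped file on flash, and only the slice currently being processed is paged into working memory. The file name, layer shapes, and layer math are invented for illustration and are not Phison's API.

```python
# Conceptual sketch only -- NOT Phison's aiDAPTIV+ implementation.
# Idea: keep all weights on SSD-backed storage and page in one layer at a time,
# so the in-memory (or GPU-memory) footprint stays small.
import numpy as np

N_LAYERS = 8                 # hypothetical layer count
LAYER_SHAPE = (1024, 1024)   # hypothetical per-layer weight shape

# Persist all layer weights to an SSD-backed, memory-mapped file.
weights = np.memmap("weights.bin", dtype=np.float16, mode="w+",
                    shape=(N_LAYERS, *LAYER_SHAPE))
weights[:] = 0.01            # stand-in for real parameters
weights.flush()

def process_layer(idx: int, activations: np.ndarray) -> np.ndarray:
    """Page one layer's weights in from flash, compute, then let them be evicted."""
    w = np.asarray(weights[idx])   # read only this layer from the memory map
    return activations @ w         # stand-in for the real layer computation

x = np.random.rand(4, 1024).astype(np.float16)
for i in range(N_LAYERS):
    x = process_layer(i, x)
print(x.shape)  # (4, 1024)
```

In a real system the paging would target GPU memory and overlap with compute; the point of the sketch is simply that flash can hold state that would otherwise require additional GPUs.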
AI model growth and the trillion-parameter milestone
Phison expects the AI industry to reach the 1-trillion-parameter milestone before 2026.
According to the company, model sizes have expanded rapidly, moving from 69 billion parameters in Llama 2 (2023) to 405 billion with Llama 3.1 (2024), followed by DeepSeek R1's 671 billion parameters (2025).
If this pattern continues, a trillion-parameter model could be unveiled before the end of 2025, marking a significant leap in AI capabilities.
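As a back-of-the-envelope illustration (not Phison's methodology), even reusing the slower, most recent year-over-year growth factor from the figures above pushes the next step past the trillion-parameter mark:

```python
# Illustrative extrapolation based only on the three data points cited above.
llama2_2023   = 69e9    # Llama 2 (2023)
llama31_2024  = 405e9   # Llama 3.1 (2024)
deepseek_2025 = 671e9   # DeepSeek R1 (2025)

growth_23_24 = llama31_2024 / llama2_2023    # ~5.9x year over year
growth_24_25 = deepseek_2025 / llama31_2024  # ~1.7x year over year

# Applying the slower, most recent factor once more already exceeds 1T parameters.
projected_next = deepseek_2025 * growth_24_25
print(f"{projected_next / 1e12:.2f} trillion parameters")  # ~1.11 trillion
```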
In addition, the company believes its solution can significantly reduce the number of GPUs needed to run large-scale AI models by shifting some of the processing tasks away from GPUs to large-capacity SSDs, an approach that could bring training costs down to just 3% of current projections (a 97% saving), or less than 1/25 of the usual operating expenses.
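A quick sanity check of the quoted figures, using the $3 million and $100,000 estimates cited above (both are Phison's estimates, not measured costs):

```python
# Sanity check of the cost figures quoted in the article.
baseline_cost = 3_000_000  # projected cost to train a 1-trillion-parameter model, USD
phison_cost   = 100_000    # Phison's estimated cost with SSD offload, USD

fraction = phison_cost / baseline_cost
print(f"{fraction:.1%} of the baseline cost")   # 3.3% of the baseline cost
print(f"{1 - fraction:.1%} savings")            # 96.7% savings
print(f"roughly 1/{baseline_cost // phison_cost} of the usual expense")  # roughly 1/30
```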
Phison has already collaborated with Maingear to launch AI workstations powered by Intel Xeon W7-3455 CPUs, signaling its commitment to reshaping AI hardware.
As companies seek cost-effective ways to train large AI models, innovations in SSD technology could play a crucial role in driving efficiency gains, while external HDD options remain relevant for long-term data storage.
The push for cheaper AI training solutions gained momentum after DeepSeek made headlines earlier this year, when its DeepSeek R1 model demonstrated that cutting-edge AI could be developed at a fraction of the usual cost, with 95% fewer chips and reportedly only $6 million required for training.
Via Tweaktown