AI is forecast to continue its rapid growth in 2025, and the ever-evolving technology presents both unprecedented opportunities and complex challenges for organizations worldwide. To help today's organizations and professionals secure the most value from AI in 2025, I've shared my thoughts on the AI trends I anticipate this year.
Organizations Must Strategically Plan for the Cost of AI
The world continues to be excited about the potential of AI. However, the cost of AI innovation is something organizations must plan for. AI workloads need GPUs, yet many cloud service providers (CSPs) still run large deployments of N-1, N-2 or older GPUs that weren't built exclusively for AI. Cloud GPUs can also be cost-prohibitive at scale: they are easy for developers to switch on as projects grow, and the expense mounts accordingly. Buying GPUs for on-premises use (if they can be procured at all, given current scarcity) is an equally expensive proposition, with individual chips costing well into the tens of thousands of dollars. As a result, server systems built for demanding AI workloads are becoming cost-prohibitive or out of reach for many teams with capped departmental operating expense (OpEx) budgets. In 2025, enterprise customers must level-set their AI costs and realign their AI development budgets. With so many siloed departments now taking the initiative and building their own AI tools, companies can inadvertently spend thousands per month on small, siloed cloud-based GPU and AI compute instances, costs that all mount up, especially if users leave those instances running.
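To see how quickly those costs mount, consider a simple back-of-envelope model. The hourly rate, team size, and usage patterns below are purely illustrative assumptions, not quoted CSP prices, but they show why an instance left running around the clock dwarfs the cost of attended use:

```python
# Illustrative cloud GPU cost model. The $4/hour rate, team of 5,
# and usage patterns are assumptions for the example, not real prices.

def monthly_gpu_cost(hourly_rate: float, hours_per_day: float, days: int = 30) -> float:
    """Estimate the monthly cost of a single cloud GPU instance."""
    return hourly_rate * hours_per_day * days

# Five developers, each with one instance, used 8 hours/day on ~22 workdays.
attended = 5 * monthly_gpu_cost(4.0, 8.0, days=22)

# The same five instances accidentally left running 24/7.
always_on = 5 * monthly_gpu_cost(4.0, 24.0, days=30)

print(f"Attended use: ${attended:,.0f}/month")   # thousands per month
print(f"Left running: ${always_on:,.0f}/month")  # roughly 4x more
```

Even at these modest assumed rates, the gap between disciplined and forgotten usage is a factor of four, which is exactly the kind of silent spend that departmental OpEx budgets fail to anticipate.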
Open-Source Models Will Promote the Democratization of Several AI Use Cases
In 2025, there will be immense pressure on organizations to prove ROI from AI projects and their associated budgets. Alongside leveraging the low-code or no-code tools offered by popular ISVs to build AI apps, companies will continue to seek out open-source models, which can be fine-tuned far more cheaply than training and building a model from scratch. Fine-tuning open-source models uses available AI resources (people, budget and/or compute power) more efficiently, which helps explain why there are currently over 900K (and growing) models available for download on Hugging Face alone. However, as enterprises adopt open-source models, it will be critical to secure and police the use of open-source software, frameworks, libraries and tools throughout their organizations. Lenovo's recent agreement with Anaconda is a great example of this support, where the Intel-powered Lenovo Workstation portfolio and Anaconda Navigator help streamline data science workflows.
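The resource efficiency of fine-tuning can be made concrete with a rough parameter count. The sketch below compares full fine-tuning against a LoRA-style low-rank adapter; the model shape (a generic 7B-parameter model with 32 layers and 4096-wide attention projections) and the rank are illustrative assumptions, not the specs of any particular published model:

```python
# Back-of-envelope comparison of trainable parameters:
# full fine-tuning vs. a LoRA-style low-rank adapter.
# All model dimensions here are illustrative assumptions.

def lora_params(d_in: int, d_out: int, rank: int) -> int:
    """Trainable parameters in one low-rank adapter pair:
    A is (d_in x rank), B is (rank x d_out)."""
    return d_in * rank + rank * d_out

# Assume a 7B-parameter model, 32 layers, 4 attention
# projections per layer of size 4096 x 4096 being adapted.
full = 7_000_000_000
adapters = 32 * 4 * lora_params(4096, 4096, rank=8)

print(f"Full fine-tune: {full:,} trainable parameters")
print(f"LoRA (rank 8):  {adapters:,} trainable parameters")
print(f"Fraction:       {adapters / full:.4%}")
```

Under these assumptions the adapter trains roughly a tenth of a percent of the model's weights, which is why fine-tuning fits departmental budgets and hardware that full training never could.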
AI Compliance Becomes Standard Practice
Shifts in AI policy will move AI computing closer to the source of company data, and more on-premises (especially during the AI development phases of a project or workflow). As AI moves closer to the core of many businesses, it will shift from a separate, parallel workstream to one aligned with core business functions. Making AI compliant and responsible is a real objective today, and as we head into 2025 it will become standard practice and one of the fundamental building blocks of enterprise AI projects. At Lenovo, we have a Responsible AI Committee, comprised of a diverse group of employees who ensure solutions and products meet security, ethical, privacy, and transparency standards. This group reviews AI usage and implementation based on risk, applying security policies consistently to align with the company's risk stance and regulatory compliance. The committee's inclusive approach addresses all dimensions of AI, ensuring comprehensive compliance and overall risk reduction.
Workstations Emerge as Efficient AI Tools In and Out of the Office
The use of workstations as more powerful edge and departmental AI appliances is already on the rise. For example, Lenovo's Workstation portfolio, powered by AMD, helps media and entertainment professionals bridge the gap between expectations and the resources needed to deliver the highest-fidelity visual content. Thanks to their small form factor and footprint, low acoustic output, standard power requirements, and use of client operating systems, workstations can easily be deployed as AI inference solutions where more traditional servers may not fit. Another use case is within standard industry workflows, where AI-enhanced data analytics delivers real business value and is highly visible to C-suite executives trying to make a difference. Still other use cases involve the smaller, domain-specific AI tools individuals create for their own use. These efficiency-saving tools can become AI superpowers, spanning everything from Microsoft Copilot and private chatbots to personal AI assistants.
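A practical question when deploying a workstation as an inference appliance is whether a given model fits in a single GPU's memory. The sizing sketch below is a rough heuristic under stated assumptions (a 24 GB workstation-class card, generic parameter counts, and a simple 20% overhead factor for KV cache and activations), not a specification of any particular product:

```python
# Rough check: will a model fit in a single workstation GPU?
# The 24 GB card, parameter counts, precisions, and overhead
# factor are illustrative assumptions, not product specs.

def model_memory_gb(n_params: float, bytes_per_param: float,
                    overhead: float = 1.2) -> float:
    """Approximate inference memory: weights plus a fudge
    factor for KV cache and activations."""
    return n_params * bytes_per_param * overhead / 1e9

GPU_MEMORY_GB = 24  # assumed workstation-class card

for name, params, bpp in [
    ("7B @ FP16",  7e9,  2.0),
    ("7B @ INT4",  7e9,  0.5),
    ("13B @ INT4", 13e9, 0.5),
    ("70B @ FP16", 70e9, 2.0),
]:
    need = model_memory_gb(params, bpp)
    verdict = "fits" if need <= GPU_MEMORY_GB else "does not fit"
    print(f"{name}: ~{need:.1f} GB -> {verdict} in {GPU_MEMORY_GB} GB")
```

Under these assumptions, quantized 7B- and 13B-class models fit comfortably on a single card, which is why departmental inference on a workstation is viable while the largest models still demand server-class infrastructure.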
Maximize AI’s Potential in 2025
AI is one of the fastest-growing technological evolutions of our era, breaking into every industry as a transformative technology that will enhance efficiency for all – enabling faster and more valuable business outcomes.
AI, including machine learning, deep learning, and generative AI with LLMs, requires immense compute power to build and maintain the intelligence needed for seamless customer AI experiences. As a result, organizations should ensure they leverage high-performing, secure desktop and mobile computing solutions to revolutionize and enhance the workflows of AI professionals and data scientists.