Cerebras Unveils Six New AI Datacenters in North America, Europe

Cerebras Systems Unveils New AI Data Centers to Enhance Global Inference Capacity

In a significant development for artificial intelligence, Cerebras Systems has announced the launch of six new AI inference data centers. Powered by Cerebras’ Wafer-Scale Engines, the facilities will dramatically expand the company’s capacity for large-scale AI workloads, serving over 40 million tokens per second and positioning Cerebras as a leader in high-speed AI inference. The move is designed to meet the growing demands of enterprises, governments, and developers around the world.

Announced on March 11, 2025, the new data centers are a cornerstone of Cerebras’ 2025 AI inference scaling strategy. By expanding its capacity twentyfold, Cerebras aims to meet the burgeoning demand for high-speed AI processing. The facilities are strategically placed across the globe, with the majority located in the United States, underscoring Cerebras’ commitment to advancing the nation’s AI infrastructure and leadership.

Locations and Strategic Partnerships

Cerebras’ data center footprint spans several key locations: Santa Clara, California; Stockton, California; Dallas, Texas; Minneapolis, Minnesota; Oklahoma City, Oklahoma; Montreal, Canada; Atlanta, Georgia; and France. Each new site is either already online or set to be operational by the end of 2025. The Oklahoma City and Montreal centers are exclusively owned and operated by Cerebras, while the remaining sites are operated in partnership with G42, Cerebras’ strategic partner.

With 85% of that capacity concentrated in the United States, the expansion highlights the company’s role in fortifying the nation’s AI capabilities. This strategic positioning not only enhances domestic AI infrastructure but also ensures that Cerebras can provide high-speed AI services to a global clientele.

Meeting Growing Demand

Since announcing its high-speed AI inference offering in August 2024, Cerebras has seen overwhelming demand from leading AI companies and other enterprises. French AI startup Mistral uses Cerebras technology to power its Le Chat AI assistant, while Perplexity, a major AI search engine, relies on it for rapid search results. More recently, Hugging Face and AlphaSense have adopted Cerebras’ solutions for their AI applications, further cementing the company’s reputation in the industry.

Dhiraj Mallick, the Chief Operating Officer of Cerebras Systems, expressed confidence in the company’s ability to meet growing demand. “Cerebras is turbocharging the future of U.S. AI leadership with unmatched performance, scale, and efficiency. These new global data centers will serve as the backbone for the next wave of AI innovation,” he stated. The addition of six new facilities is a testament to Cerebras’ commitment to providing world-class AI infrastructure that fuels critical research and business transformation.

Advanced Infrastructure and Technology

The Oklahoma City data center, scheduled to go live in June 2025, will house over 300 Cerebras CS-3 systems. It is a state-of-the-art facility designed to withstand natural disasters like tornadoes and earthquakes, thanks to its robust construction and triple redundant power stations. The center’s custom water-cooling solutions make it ideal for large-scale deployments of Cerebras Wafer Scale systems, ensuring efficient and reliable performance.

Trevor Francis, CEO of Scale Datacenter, expressed enthusiasm about the partnership with Cerebras. “We are excited to partner with Cerebras to bring world-class AI infrastructure to Oklahoma City. Our collaboration underscores our commitment to empowering innovation in AI, and we look forward to supporting the next generation of AI-driven applications,” he said.

In July 2025, the Enovum Montreal facility, operated by a division of Bit Digital, will become fully operational. This marks Cerebras’ entry into the Canadian tech ecosystem, offering AI inference speeds ten times faster than the latest GPUs. Billy Krassakopoulos, CEO of Enovum Datacenter, highlighted the significance of this collaboration. “Enovum is thrilled to partner with Cerebras, a company at the forefront of AI innovation, and to further expand and propel Canada’s world-class tech ecosystem,” he commented, emphasizing the potential for high-performance AI colocation solutions tailored to next-generation workloads.

Accelerating AI Inference

Cerebras’ expansion responds to the evolving needs of AI technology, particularly reasoning models such as DeepSeek R1 and OpenAI o3, which work through problems step by step and therefore generate far more tokens per answer. On conventional hardware these models can take minutes to produce a result, but Cerebras’ technology accelerates the process roughly tenfold, enabling near-instant responses. With hyperscale capacity coming online starting in the third quarter of 2025, Cerebras is well-positioned to lead the market in real-time AI inference.
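To make the speed difference concrete, the short calculation below uses assumed, illustrative numbers; the token count and baseline throughput are not from the announcement, and only the roughly tenfold speedup factor comes from the article.

```python
# Illustrative arithmetic only: the token count and baseline throughput
# are assumptions chosen to show the effect of a ~10x inference speedup.
reasoning_tokens = 30_000   # tokens a reasoning model might emit for one answer (assumed)
baseline_tps = 100          # assumed GPU-class throughput, in tokens per second
speedup = 10                # the roughly tenfold factor cited in the article

baseline_seconds = reasoning_tokens / baseline_tps
accelerated_seconds = baseline_seconds / speedup

print(f"Baseline: {baseline_seconds / 60:.0f} min; accelerated: {accelerated_seconds:.0f} s")
```

Under these assumed figures, an answer that would take about five minutes to generate arrives in roughly thirty seconds, which is the difference the company is marketing as "near-instant."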

Despite the ambitious plans, the final locations of some data centers may be subject to change, reflecting the dynamic nature of global tech infrastructure.

About Cerebras Systems

Cerebras Systems is a collective of computer architects, scientists, and engineers dedicated to accelerating generative AI. Their flagship product, the CS-3 system, incorporates the world’s largest and fastest commercially available AI processor, the Wafer-Scale Engine-3. This system can be easily clustered to create some of the largest AI supercomputers globally, simplifying the deployment of AI models without the complexity of traditional distributed computing. Cerebras Inference delivers breakthrough speeds, enabling customers to develop cutting-edge AI applications. Their solutions are utilized by leading corporations, research institutions, and government entities for developing proprietary and open-source models. More information about their offerings can be found on their website.
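For developers curious about what using the service looks like in practice, the snippet below is a minimal sketch, assuming the Cerebras Cloud Python SDK and its OpenAI-style chat-completions interface; the model identifier, prompt, and environment variable name are illustrative and should be checked against Cerebras’ own documentation.

```python
# Minimal sketch of a Cerebras Inference request via the Cerebras Cloud SDK
# (pip install cerebras_cloud_sdk). The model name and env var are illustrative.
import os

from cerebras.cloud.sdk import Cerebras

# Create a client using an API key stored in the environment (assumed variable name).
client = Cerebras(api_key=os.environ.get("CEREBRAS_API_KEY"))

# Send a single chat request; replace the model with one listed in Cerebras' docs.
response = client.chat.completions.create(
    model="llama3.1-8b",
    messages=[
        {"role": "user", "content": "Explain wafer-scale AI inference in one sentence."},
    ],
)

print(response.choices[0].message.content)
```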

Looking Ahead

This expansion by Cerebras Systems reflects the broader trend in the tech industry towards building robust, high-capacity infrastructure to meet the demands of increasingly complex AI applications. As AI continues to permeate various sectors, the need for rapid and efficient processing power becomes more critical. Cerebras’ strategic investments in new data centers underscore their commitment to staying at the forefront of this evolving landscape.

For further insights and updates on Cerebras Systems, interested readers can visit their official website or follow them on social media platforms like LinkedIn and X.

In summary, Cerebras Systems’ new data centers represent a significant leap forward in AI infrastructure, promising to deliver unprecedented speeds and capabilities to meet the demands of today’s AI-driven world.


Neil S
Neil is a highly qualified Technical Writer with an M.Sc. (IT) degree and an impressive range of IT and support certifications, including MCSE, CCNA, ACA (Adobe Certified Associate), and PG Dip (IT). With over 10 years of hands-on experience as an IT support engineer across Windows, Mac, iOS, and Linux Server platforms, Neil has the expertise to create comprehensive, user-friendly documentation that simplifies complex technical concepts for a wide audience.
