Liftr Insights, a pioneer with over 5 years of market intelligence experience, began offering power insights for data center analysts.
This comes amid news reports that the latest chips, such as the NVIDIA H100, require more power than an average household, and warnings about the growing demand to support AI computing power requirements. Each of the latest GPUs driving AI training and AI inference requires large amounts of power to produce high workload output, and hundreds of thousands more GPUs are on their way into data centers.
Liftr® is using its domain expertise for sizing and scoping data centers. Knowing what the public cloud providers (and niche AI providers) are offering is valuable information for consultants attempting to maximize the output for private data centers.
“Our services portfolio is a key part of analyst considerations,” says Tab Schadt, CEO of Liftr Insights. “Companies and consultants can leverage the millions of dollars in research that the public cloud providers have invested in where they spend their AI funds.”
Since OpenAI helped inspire the age of AI with the release of ChatGPT, demand for AI has only continued to grow. With that growth will come more chips capable of meeting those demands. While performance per chip has improved and will continue to improve, understanding power needs is a critical component of any cost and design analysis.
For example, Liftr can show a customer that a footprint of H100s might require 828 kW for 100 servers, even before factoring in other relevant costs. However, several variables can alter those numbers as Liftr works with consultants on their specific configurations and constraints.
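A figure like the 828 kW estimate above can be reproduced with a simple back-of-envelope calculation. The sketch below is illustrative only: the per-GPU draw, GPUs-per-server count, and per-server overhead are assumptions chosen to match the quoted total, not Liftr's actual model or NVIDIA's exact specifications.

```python
# Hypothetical back-of-envelope power estimate for an H100 fleet.
# All figures are illustrative assumptions, not Liftr's model.

GPU_TDP_KW = 0.70          # assumed ~700 W per H100 SXM GPU
GPUS_PER_SERVER = 8        # assumed HGX-style 8-GPU configuration
SERVER_OVERHEAD_KW = 2.68  # assumed CPUs, memory, NICs, fans per server

def fleet_power_kw(servers: int) -> float:
    """Estimate total IT power (kW) for a fleet of GPU servers."""
    per_server = GPU_TDP_KW * GPUS_PER_SERVER + SERVER_OVERHEAD_KW
    return servers * per_server

print(round(fleet_power_kw(100), 1))  # 828.0 kW for 100 servers under these assumptions
```

Real analyses would also factor in cooling (PUE), power-delivery losses, and workload-dependent utilization, which is where configuration-specific consulting comes in.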
“By extending our information and domain knowledge into the power and scoping side of the equation, we’ve opened new doors for our clients,” says Schadt. “As with AI, this is only the beginning. More powerful insights still to come!”