As AI technology works its way into applications across many sectors, it will appear in many different configurations. Companies and institutions responsible for infrastructure, safety, and public trust have particular needs for AI that is controlled and secure. Veronica Thums of Dell Technologies writes about the Electric Power Research Institute's (EPRI) configuration for safe, secure "local" AI built on a compact Dell supercomputer and an NVIDIA superchip.
EPRI is an energy and electricity research institute that is deeply involved in
integrating AI and data center power loads with power grids. The author notes
that:
“Data sensitivity, regulatory requirements, and the need
for deep domain reasoning make cloud‑only approaches difficult to scale
responsibly.”
“With the Dell Pro Max with GB10 equipped with the
NVIDIA Grace Blackwell Superchip, EPRI can develop, test, and run custom AI
workflows locally — unlocking new levels of insight while keeping sensitive
information firmly on premises.”
Beyond tighter control and security, there are other reasons to deploy local AI instead of cloud-based AI, including the risk of exposing sensitive data to external networks. Some advantages of local AI are summarized below.
As noted, the Dell Pro Max with GB10, which costs between $5,500 and $6,300 depending on the amount of storage, can provide high-end AI at a reasonable price for local, on-premises applications.
“Powered by NVIDIA DGX OS, the GB10 system delivers a
stable and fully integrated NVIDIA AI software stack, enabling researchers to
develop AI tools locally using the same environment they would rely on in
larger enterprise infrastructures. With the Grace Blackwell Superchip,
acceleration becomes a catalyst for real productivity. Delivering rapid
inference, fluid model execution, and smooth interaction with large, reasoning‑intensive datasets.”
“In early 2026, EPRI developed Power Chat, a prototype
demonstrating how a document-grounded AI assistant can run entirely on a compact
supercomputer—the Dell Pro Max with GB10. Power Chat enables deep,
conversational exploration of technical documents through a streamlined
interface and a fully local workflow designed for controlled environments.”
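EPRI has not published Power Chat's implementation, but "document-grounded" typically means a retrieval step selects relevant passages from local documents and the model is instructed to answer from that context only. The sketch below is a minimal, hypothetical illustration of that pattern using naive keyword-overlap retrieval; the local model call is left as a placeholder, and all names are my own assumptions, not EPRI's code.

```python
# Minimal sketch of a document-grounded (RAG-style) local assistant.
# Hypothetical illustration only -- not EPRI's Power Chat implementation.
# Retrieval here is naive keyword overlap; a real system would use
# embeddings and a locally hosted model.

def score(query: str, passage: str) -> int:
    """Count query words that appear in the passage (naive relevance)."""
    return len(set(query.lower().split()) & set(passage.lower().split()))

def retrieve(query: str, passages: list[str], k: int = 2) -> list[str]:
    """Return the k passages with the highest keyword overlap."""
    return sorted(passages, key=lambda p: score(query, p), reverse=True)[:k]

def build_prompt(query: str, context: list[str]) -> str:
    """Ground the model's answer in the retrieved local passages."""
    joined = "\n---\n".join(context)
    return f"Answer using only this context:\n{joined}\n\nQuestion: {query}"

# A call to an on-premises inference runtime would go here; it is
# omitted because it depends entirely on the deployment.
docs = [
    "Transformer inspection intervals depend on load history and oil tests.",
    "Grid-interactive data centers can shed load during peak demand events.",
    "Relay settings must be revalidated after substation upgrades.",
]
prompt = build_prompt(
    "How do data centers interact with grid peak demand?",
    retrieve("data centers peak demand load", docs),
)
print(prompt)
```

Because both the documents and the prompt never leave the machine, this pattern keeps sensitive material on premises, which is the property the article emphasizes.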
“The goal is a faster path to insight from large, dense
documents, improving access to critical information and supporting
decision-making.”
The compact GB10 setup can
replace larger and more expensive GPU servers. The system takes about 15
minutes to boot up, load the models, and encode documents into the KV cache.
Below is a sample of an EPRI Power Chat query.
NVIDIA publishes several AI playbooks with instructions for running AI workloads on its GB10 Grace Blackwell Superchip.
Given the success and advantages of this system for applications that require data security and control, I expect local AI to be the wave of the future for organizations with such requirements, thanks to lower costs and better control.
References:
EPRI: Local AI for Energy Research. Veronica Thums. Dell Technologies. April 6, 2026.