Nvidia-Backed Energy Startup Raises $28 Million To Improve The Grid
High voltage power lines run along the electrical power grid on May 16, 2024, in Pembroke Pines, Florida. (Photo by Joe Raedle/Getty Images)
Source: Forbes
Written By: Esha Chhabra
ThinkLabs AI, a spin-off from GE Vernova that has raised $28 million, including from Nvidia, is using deep learning to automate the electric grid. With the race on for more energy, and ideally more clean energy, founder Josh Wong is hopeful that his second venture will open up the bottlenecks in the modern electrical grid.
He says the US energy grid is an aging and increasingly complex infrastructure struggling to keep pace with a world demanding more of it than ever before: more renewable energy, more electric vehicles, more data centers to power AI, and, in short, a heavier load.
Though Wong says he was an early adopter of EVs, he notes that electric vehicles have had a bumpy ride, even in a state such as California, because the infrastructure and energy capacity are lacking. Now add AI to the mix. The resulting bottlenecks could become one of the central constraints on the next phase of economic growth.
Ironically, Wong also thinks artificial intelligence is the way out of this mess.
ThinkLabs AI is a Canadian-led, New York-based startup that has built what it describes as the first AI co-pilot for the electric grid. The company, which officially spun out of GE Vernova (where Wong worked) as its first startup spin-off, is developing deep learning models capable of running real-time analysis of entire utility networks, a task that has defeated traditional engineering software for decades.
That promise has now attracted major backing. In late March, ThinkLabs announced a $28 million Series A led by Energy Impact Partners, with participation from NVIDIA's venture arm NVentures, Edison International, the parent company of Southern California Edison, and returning investors including GE Vernova, Blackhorn Ventures, Powerhouse Ventures, Active Impact Investments, and Amplify Capital.
"The energy transition is far from complete," Wong said in an interview. "We have just begun to step on the pedal. This is still very much so early days."
A Career Built on the Grid
Wong's path to ThinkLabs spans more than two decades in clean technology. He began his career working on hydrogen fuel cells, moved into battery storage, and spent years trying to improve the modern electric grid. In the early 2010s, he founded Opus One Solutions, which focused on modeling power systems and building distributed energy resource management tools. GE acquired the company in late 2021.
Josh Wong of ThinkLabs AI
Rather than step back after the exit, Wong turned his attention to what he saw as the central, unsolved problem of his career: utilities move too slowly. Not through negligence, he is quick to say, but because the grid is a high-stakes system where errors carry serious consequences.
"Do it wrong and you shut off the lights for thousands, if not millions, of people," he said. "So how do you enable utilities to move faster, but in a trustworthy manner?"
His answer was autonomy, and to achieve autonomy at scale, he concluded, you need AI. He began incubating the idea within GE, launching what became the company's autonomous grids unit. When the time came, ThinkLabs was spun out as an independent entity, with full investor backing and a core team drawn from power systems engineering, artificial intelligence, and cloud computing.
The new capital gives that thesis more urgency as utilities confront a collision of trends: electrification, renewable integration, interconnection backlogs, and new data center demand tied to generative AI. VentureBeat reported that ThinkLabs is positioning itself squarely inside that crunch, arguing that the grid can no longer be planned or operated with workflows that take weeks to model what operators increasingly need to understand in real time.
Teaching AI to Understand Power Flow
At the core of ThinkLabs' technology is a departure from how grid software has traditionally been built. Conventional digital twins, the software models engineers use to simulate the real world, are constructed from first-principles physics. They are precise, but they are also slow, computationally expensive, and brittle: small errors in input data can produce wildly incorrect outputs.
ThinkLabs takes a different approach. Rather than programming the formulas that govern electricity flow, the company trains AI directly on power systems engineering in the same way, Wong explains, that AI is trained to pass a medical licensing exam or understand legal reasoning. The resulting models are faster, more data-robust, and capable of operating in real time.
"We're not on the large language model side," Wong said, distinguishing the company's work from general-purpose AI tools like ChatGPT. "We are on the deep learning side, which is building deep understanding of how the grid works, and building up an intuition about it. We also don't face the same issues with hallucinations that the LLMs do, and we recognize that would be disastrous here."
One early proof point came from a long-standing challenge in grid engineering known as distribution state estimation, or the problem of calculating electrical conditions across an entire network when direct measurements are only available at a fraction of its nodes. Traditional physics-based approaches have struggled with this for years. ThinkLabs solved it in two months using AI, with results that continue to improve, he says.
The company says its broader platform can evaluate millions of scenarios and reduce studies that once took 30 to 35 days to less than 90 seconds, with accuracy above 99.7 percent in some applications. Those numbers help explain why investors ranging from climate-tech funds to strategic energy players are paying attention.
"The engineering formulas are too sensitive," Wong said. "They don't converge well. With AI, we solved it — and it keeps getting better."
The product also targets one of the clean energy sector's most persistent bottlenecks: the interconnection queue. Connecting a new solar project, wind farm, or electric vehicle charging facility to the grid currently takes anywhere from one to three years, requires extensive paperwork, and can cost developers significant sums, only for the process to restart whenever another project enters or leaves the queue.
"Imagine a developer comes in and, instead of submitting a stack of paperwork and waiting six months for a response, they get an immediate AI response," Wong said. "Here's where you have capacity. Here's where you don't. Here's a flexible interconnection contract you might consider. Here's a battery storage option for your project. That can become a real-time interaction rather than paper shuffling."
That same capability is increasingly relevant for utilities trying to assess whether circuits can absorb the load from new AI data centers without triggering costly delays or reliability risks. Southern California Edison has already worked with ThinkLabs on projects aimed at identifying power-flow risks faster, underscoring the practical appeal of software that can help utilities see around corners.
Clients, Partners, and a Reluctant Industry
ThinkLabs' primary clients today are electric utilities, both transmission and distribution operators, along with renewable energy developers seeking faster grid access. The company operates across Canada and the United States, and is in active discussions with utilities in regions Wong describes as among the most forward-leaning: California, the northeast United States, and Ontario in Canada.
The utility sector has long been known for its caution with new technology. Colleagues and friends even warned Wong not to go into the energy sector again for his second startup because of how slow it can be.
"In the grid space, nobody wants to be first, rather everybody wants to be second," he said. "What we've noticed recently is that people are actually stepping up to be first, more than anything I've seen in the past 20 years."
On the partnership side, the company is working with large incumbents: GE Vernova, whose software manages roughly 40 percent of the world's grids, as well as systems integrators such as major consultancies and the major cloud providers. Wong is candid about the reasoning: "Startups can't do it alone. The strength of incumbency is very strong in this space."
The investor list reinforces that point. ThinkLabs is drawing support not only from venture capital, but from companies embedded in the future of electricity demand and compute infrastructure. NVIDIA's participation links the startup to the same AI boom driving new power demand, while Edison International's involvement signals that utilities see value in tools that can compress analysis timelines and make grid planning more proactive.
A Bigger Ambition
This model, he argues, could also help democratize the energy transition globally. Utilities in progressive markets like California or Ontario are accumulating hard-won experience managing large-scale renewable integration. ThinkLabs believes that training AI on that experience, without transferring sensitive proprietary data, can help utilities in the developing world leapfrog decades of infrastructure development.
"Can we put the energy transition into cruise control?" Wong said. "Can we take any sector and say: give me a pathway to 100 percent decarbonization and have it generate those solutions? I can see a pathway to that. Right now. It is absolutely within reach."
That ambition now sits inside a larger market narrative: the race to build more AI is becoming inseparable from the race to modernize the grid that powers it. ThinkLabs' pitch is that software can become a force multiplier for both. And while AI does consume natural resources, such as water, Wong sees that as a short-term challenge for a longer-term solution. By allowing more renewable and clean energy to flow through the grid, and by upgrading the very systems that are broken, AI could help data centers draw on cleaner sources and reduce their footprint, he said.