We are making Quantum Leaps. I am not referring to the '80s/'90s TV show; I am referring to quantum computing and the race for better and faster artificial intelligence.
Quantum computing is a field that excites me to my core. I’ve always been driven by pursuing what’s next, whether mastering tactics in the Marines or navigating complex policy challenges in government.
Quantum computing feels like the ultimate “what’s next.” Its potential to solve problems in seconds that would take today’s supercomputers millennia is a quantum leap. I talked a bit about quantum computing in one of my recent newsletters.
However, potential doesn’t turn into reality without investment of time, money, and strategic resources. My experience has shown me that technological superiority is a strategic advantage, and right now, nations and companies worldwide are racing to claim the quantum crown.
We risk falling behind if we don’t pour resources into research, development, and deployment. This is more than an opportunity; it’s a call to action. We must invest heavily and deliberately to ensure quantum computing becomes a cornerstone of our competitive edge.

Bill Gates recently suggested that energy sector jobs are among the few that will survive an AI takeover. According to him, the energy sector’s immense complexity and ever-evolving regulatory frameworks mean that human expertise will always be necessary.
AI alone cannot navigate the intricate challenges of mapping regulatory landscapes or developing sustainable solutions to meet our world's diverse energy needs.
“Within 10 years, AI will replace many doctors and teachers; humans won’t be needed for most things.” – Bill Gates
Bill Gates’s prediction that only a few fields will survive the next 10 years implies massive shifts, and rather quick ones. Welcome to the quantum computing era.
As we trend toward a new era of AI accelerated by quantum computing, it becomes increasingly evident that our approach to both data centers and the broader energy grid needs to change.
Beyond Bill Gates’s statement, AI is seeping into every industry and every corner of life. I believe this is causing, and will continue to cause, an unprecedented surge in demand for advanced computing power, one that will not only drive the need for smarter, more efficient data center strategies but will also require a fundamental advancement of our energy grid infrastructure.
To start, we must focus on data centers and our energy grid, and to do that it's important to illustrate, at a high level, how we get push-button AI at our fingertips.
See Figure 1 below
- The underlying process that supports modern AI begins with power generation. Energy is produced at large-scale facilities, from nuclear reactors (both current 3rd-generation and emerging 4th-generation designs) to coal-fired plants, wind farms, and solar arrays, which convert various energy sources into electricity. That electricity then travels through an extensive transmission and distribution network of power lines and substations before reaching data centers. That is the energy grid infrastructure.
- Inside these data centers, thousands of powerful servers process and store vast amounts of data, running complex AI algorithms and machine learning models. These facilities employ advanced cooling systems and high-speed networking infrastructure to ensure optimal performance, allowing rapid global data transmission.
- When a user interacts with an AI application, whether a virtual assistant or a personalized recommendation engine, the input is processed at these centers through model inference, and the output is swiftly delivered back via the internet to consumer devices.
In layman’s terms, our modern AI experience relies on the vital integration of robust energy grids and sophisticated data centers, flawlessly powering the technologies at our fingertips. This interconnected infrastructure is necessary for delivering the immediate, push-button AI capabilities that are so commonplace now that we tend to take them for granted.
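To make that chain a little more concrete, here is a minimal back-of-envelope sketch in Python. Every figure in it (server count, per-server draw, PUE, grid losses) is an illustrative assumption I picked for the example, not data from any specific facility.

```python
# Rough, illustrative estimate of the power chain behind "push-button AI".
# Every number below is an assumption chosen for illustration only.

servers = 50_000              # assumed server count for a large data center
watts_per_server = 800        # assumed average draw per server (W), accelerators included
pue = 1.4                     # assumed Power Usage Effectiveness (cooling/networking overhead)
grid_loss = 0.05              # assumed transmission and distribution losses (~5%)

it_load_mw = servers * watts_per_server / 1e6    # IT load in megawatts
facility_mw = it_load_mw * pue                   # total draw at the data center meter
generation_mw = facility_mw / (1 - grid_loss)    # power that must be generated upstream

print(f"IT load:         {it_load_mw:.1f} MW")
print(f"Facility demand: {facility_mw:.1f} MW")
print(f"Needed at plant: {generation_mw:.1f} MW")
```

Even with these modest assumptions, a single large facility lands in the tens of megawatts at the generation end, which is why the grid side of the equation matters just as much as the servers themselves.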
Advancing the energy grid: Nuclear or solar, which is better for the emergence of quantum computing?
Solar and nuclear power are set to emerge as the two dominant sources in our future energy mix. Solar energy is a game-changer due to its virtually limitless potential, rapidly declining costs, and the strides we've made in efficiency and storage technology.
As our digital demands continue to surge, especially with the rise of advanced computing and quantum technologies, solar's scalability makes it a feasible choice for powering everything from sprawling data centers to localized grids. At the same time, nuclear energy is indispensable because it provides reliable, around-the-clock baseload power with minimal carbon emissions.
With next-generation advances like small modular reactors addressing traditional safety and waste concerns, nuclear power is well-positioned to deliver the steady energy output necessary to support our ever-growing, high-demand digital infrastructure.
Together, nuclear and solar balance flexibility with stability, making them my top picks for the future.
As Figure 2 below shows, solar currently contributes less than wind and hydro, but it's set to catch up and eventually overtake them. Solar's lower current share is primarily due to its relatively recent adoption and earlier cost barriers, whereas wind and hydro have been established for decades.
However, I’m excited about the rapid cost declines in solar panel technology, energy efficiency improvements, and storage system advancements that address intermittency.
Unlike hydro, which is limited by geography, or wind, which requires consistent breezes, solar panels can be deployed almost anywhere, with some caveats (see my Washington State example later in this piece). As energy demands grow, especially with emerging technologies like quantum computing, I expect solar to scale quickly and pair with nuclear in a complete hybrid energy strategy.
Nuclear: As quantum computing matures and its demands for energy reliability, efficiency, and stability increase, the debate over the ideal energy source intensifies.
Nuclear power has very low CO₂ emissions during operation and exceptionally high energy density, offering a compact but potent solution. Its ability to generate massive power from a relatively small footprint makes it an attractive option for powering data centers that support quantum computing.
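To put "compact but potent" into rough numbers, here is a hedged sketch comparing average delivered power per unit of land. The site areas and capacity factors are illustrative assumptions I chose for the comparison, not measurements of any particular plant or farm.

```python
# Back-of-envelope comparison of average delivered power per square kilometre of site.
# All figures below are illustrative assumptions, not data for any specific facility.

nuclear_capacity_mw = 1000   # assumed nameplate of a large reactor (MW)
nuclear_site_km2 = 3.0       # assumed total site area (km^2)
nuclear_cf = 0.90            # assumed capacity factor

solar_capacity_mw = 290      # nameplate similar to a large utility-scale PV farm (MW)
solar_site_km2 = 10.0        # assumed land area for a farm of that size (km^2)
solar_cf = 0.28              # assumed capacity factor for a sunny desert site

def delivered_mw_per_km2(capacity_mw: float, cf: float, area_km2: float) -> float:
    """Average delivered power (MW) per square kilometre of site."""
    return capacity_mw * cf / area_km2

print(f"Nuclear: {delivered_mw_per_km2(nuclear_capacity_mw, nuclear_cf, nuclear_site_km2):.0f} MW/km^2")
print(f"Solar:   {delivered_mw_per_km2(solar_capacity_mw, solar_cf, solar_site_km2):.1f} MW/km^2")
```

Under these assumptions, the nuclear site delivers on the order of tens of times more average power per square kilometre, which is exactly the footprint argument above.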
However, nuclear reactors come with a lot of challenges. The production of radioactive waste, which requires long-term, secure management, and the inherent risks associated with potential accidents, remain top concerns. Additionally, the regulatory landscape for nuclear power is intricate, necessitating continual human oversight and specialized expertise to ensure safe operations.
With that said, let me reference Bill Gates’s perspective below: one area where human jobs will continue to hold out against AI is energy expertise.
“2. Energy experts: The guardians of power
The energy sector is too vast and intricate for AI to manage alone. Whether dealing with oil, nuclear power, or renewables, industry experts are required to navigate regulatory landscapes, strategize sustainable solutions, and handle the unpredictable nature of global energy demands.”
Solar: On the other side of the debate, solar power is a renewable and environmentally friendly alternative. With minimal operational emissions and the potential for scalability, solar installations can leverage the sun's abundant energy. This scalability is particularly appealing for decentralized energy production and for applications where geographical distribution is advantageous.
Whenever I return to California from Ireland, my flight path often passes over Arizona at 40,000 feet, where a colossal solar farm sprawls across the desert: the Agua Caliente Solar Project.
Even when I'm cruising from San Diego to Phoenix, its vast expanse is unmistakable, visible at 70 mph on the highway or from high above. The Agua Caliente Solar Project is a 290-megawatt (MW AC) photovoltaic power station built in Yuma County, Arizona, using 5.2 million cadmium telluride modules made by the U.S. thin-film manufacturer First Solar. It was the largest solar facility in the world when it was commissioned in April 2014.
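As a rough illustration of what a 290 MW (AC) rating means in practice, here is a quick sketch converting nameplate capacity into annual energy. The capacity factor and per-home consumption are my assumptions for the example, not figures reported by the project.

```python
# Convert a solar farm's nameplate rating into a rough annual energy estimate.
# The capacity factor and household consumption are illustrative assumptions.

nameplate_mw_ac = 290        # Agua Caliente's AC nameplate rating
capacity_factor = 0.28       # assumed for a sunny, arid site
hours_per_year = 8760
kwh_per_home_year = 10_800   # assumed average U.S. household consumption

annual_gwh = nameplate_mw_ac * capacity_factor * hours_per_year / 1000
homes_powered = annual_gwh * 1_000_000 / kwh_per_home_year

print(f"Estimated output: {annual_gwh:.0f} GWh per year")
print(f"Roughly {homes_powered:,.0f} average U.S. homes' annual use")
```

Under those assumptions the farm would produce on the order of 700 GWh a year, enough for tens of thousands of homes, which is what makes desert-scale solar so compelling where the land and sunlight cooperate.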
While the striking installation thrives in the sunny, arid conditions of Arizona, replicating it in regions like Washington State could be more challenging due to:
- The need for vast, open, flat land, conditions less common in much of Washington, though some areas offer suitable terrain.
- The abundant, consistent sunlight essential for large-scale solar farms, which is less prevalent in Washington than in Arizona.
Here is an aerial photo (Figure 3) of the farm from a height of about 10,000 feet.
Yet, solar power systems face their own set of hurdles. Large-scale solar installations demand very large land areas, which can be a limiting factor in densely populated or resource-constrained regions.
Additionally, the environmental impact of manufacturing solar panels, including resource extraction and end-of-life waste, cannot be overlooked. As my Washington State example shows, variability in solar energy production due to weather conditions further complicates its reliability as a sole energy source for critical applications like quantum computing.
When evaluating these trade-offs, it's important to consider the specific energy needs of quantum computing. Quantum computing centers require not only massive, uninterrupted power but also an energy infrastructure that can scale with the rapid growth of data and processing demands.
Nuclear reactors, particularly next-generation designs, could provide the consistent, high-output energy necessary to run these power-hungry centers efficiently. In contrast, while solar power offers a cleaner, more renewable option, its dependency on external factors like sunlight means it might best serve as a supplementary source rather than the primary backbone of energy supply for such high-stakes applications.
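As a hedged sketch of that baseload-versus-intermittent trade-off, the snippet below estimates how much nameplate capacity each source would need just to average a constant 100 MW data center load. The load and capacity factors are assumptions for illustration, and the sketch deliberately ignores storage and real dispatch modeling.

```python
# How much nameplate capacity does it take to average a constant load?
# Capacity factors are illustrative assumptions; storage and dispatch are ignored.

constant_load_mw = 100       # assumed steady draw of a quantum/AI data center campus
capacity_factors = {
    "nuclear": 0.90,         # assumed availability of a modern reactor
    "solar":   0.25,         # assumed annual average for utility-scale PV
}

for source, cf in capacity_factors.items():
    nameplate_needed = constant_load_mw / cf
    print(f"{source:>7}: ~{nameplate_needed:.0f} MW of nameplate capacity "
          f"to average {constant_load_mw} MW")
```

Even before accounting for nighttime storage, the nameplate gap (roughly 110 MW of nuclear versus 400 MW of solar under these assumptions) illustrates why I see solar as a powerful complement and nuclear as the steady backbone for high-stakes loads.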
Out with the old, in with the new: Leaping from 3rd-generation nuclear to 4th generation
The need for such sophisticated infrastructure becomes even more pressing as the demand for AI and quantum computing applications continues to grow.
Although current 3rd-generation nuclear reactors can power today's data centers and support quantum computing, there is a convincing argument for expediting the shift to 4th-generation reactors.
These advanced reactors promise enhanced safety features, improved fuel efficiency, and reduced radioactive waste. The U.S., for example, is actively pursuing these 4th-generation reactors through initiatives like the Department of Energy’s Advanced Reactor Demonstration Program, with demonstration projects expected in the early 2030s and broader deployment possibly by the mid-to-late 2030s.
Meanwhile, countries such as China and Russia are already experimenting with advanced reactor designs like China’s HTR-PM and Russia’s BN-800, though no nation has yet deployed a large fleet of fully commercial 4th-generation reactors.
The integration of AI and quantum computing is driving a transformative rethinking of both energy generation and data center ecosystems. Advanced power generation, from the next wave of nuclear reactors to innovative renewable energy sources, will be essential to meeting the escalating energy demands of these emerging technologies.
As our reliance on AI and quantum computing grows, so does the need for human expertise to navigate the complex regulatory and technical challenges inherent in this evolution.
Whether nuclear reactors or solar installations ultimately prove superior in specific cases may depend on regional needs, technological breakthroughs, and the long-term balance between efficiency, safety, and sustainability.
So it's highly unlikely that the grid and the economy will settle on one or the other as we enter the era of quantum computing; rest assured, both will be absolutely necessary.
4th-generation nuclear reactors are an increasing necessity for quantum computing because they provide the ultra-stable, high-density energy needed for sensitive quantum systems.
Unlike 3rd-generation reactors, these advanced designs offer enhanced safety features, more consistent power output, and improved fuel efficiency, all while reducing radioactive waste. These improvements are critical for powering the data centers that drive AI and quantum computing, ensuring a resilient, sustainable energy grid for future technological advancements.
Figure 4 below is a comparative chart summarizing the main pros and cons of nuclear power versus solar power. Each energy source has distinct advantages and challenges that must be weighed in light of factors such as environmental impact, reliability, cost, safety, and waste management.
Rethinking data centers
In my time as a Marine, I learned the value of strategic positioning, never putting all your resources in one vulnerable spot. That lesson resonates with me now as I look at the digital landscape.
My military background taught me to anticipate risks and plan for redundancy, and that’s exactly what decentralized data centers offer. They’re not just infrastructure; they’re a strategic asset, and I believe investing in them is non-negotiable if we want to stay ahead in the digital race.
If Bill Gates’s statement, which I referenced earlier in this piece, is to become reality, I believe the final shift will be a commoditized approach to data centers, similar to the gas station theory I mention below.
In my view,
“We are on the brink of a transformation that I would call the ‘real estate data center market,’ where data centers become as ubiquitous as gas stations.
This vision is driven by the fact that our growing population and escalating energy demands necessitate a robust, reliable, and scalable power supply.
With decentralized data centers as common as gas stations, less strain will be placed on the environment, and AI will be more productive.
Imagine if a town of 500,000 people had only one gas station. It would not be productive, and the strain on supply would be untenable. Now, if you have 500 gas stations, that is one per 1,000 people, and the situation becomes much more manageable.”
Data centers are not typically seen as attractive landmarks, and they are rarely used for anything other than housing servers. However, with remote work reshaping society and logistics and distribution changing for shopping malls, movie theaters, and skyscrapers, there is a lot of empty building space sitting around that can be repurposed into mixed-use complexes that include data centers.
There are many landmarks today that have been turned into data centers through what is called adaptive reuse; otherwise, they would have been decommissioned and demolished.
For example, historic structures like Milwaukee’s Wells Building and Portland’s Pittock Block have been repurposed into state-of-the-art data centers, preserving their architectural legacy while meeting modern technological demands.
Many of these buildings have historical value and meaning, and are neither a “community eyesore” nor “dystopian” looking. For example, the Western Union Building at 60 Hudson Street has historically served as an office and telecommunications hub.
Today, however, it is primarily used as a data center and colocation facility, making it a prime example of adaptive reuse. While its core function is data center operations, elements of its traditional usage, such as office space and support services, still remain, reflecting its multifaceted role in New York City's evolving infrastructure.
These buildings were once at risk of demolition or decommissioning as their original uses became obsolete. Instead of letting them be destroyed, innovative adaptive reuse projects have transformed these historic landmarks into modern data centers, preserving architectural heritage while meeting today's growing technological needs.
In Figure 5, the data center on the left, though not unattractive, feels plain and lacks character, intrigue, and meaning. In contrast, the images of 60 Hudson Street on the right showcase a building rich in substance, personality, and historical charm.
From a purely architectural perspective, many contemporary data centers follow a utilitarian, box-like design that prioritizes efficiency, cooling, and security over aesthetics. They often feature minimal ornamentation, subdued façades, and large footprints for equipment. While this design approach is practical, it can lack the visual appeal and historic character seen in older structures.
By contrast, 60 Hudson Street exemplifies an era of architecture in which buildings were designed to showcase craftsmanship and artistry. Built in an Art Deco style, its brick façade, ornate lobby detailing, and dramatic setbacks reflect the period’s emphasis on ornamentation and grandeur.
Even after being repurposed as a data center and telecommunications hub, it retains much of its original design intent and ambiance, giving it a sense of place and history that many newer facilities don’t replicate.
In short, the difference lies in the guiding priorities behind each building’s construction. Modern data centers focus on function first, large-scale power capacity, robust cooling, and physical security, whereas older buildings like 60 Hudson Street were shaped by an architectural tradition that valued aesthetic richness and craftsmanship as essential parts of a structure’s identity.
This article proposes that data centers be integrated as a component of modern residential developments. The idea is to design mixed-use projects where data centers and residential units coexist in the same vicinity or even within the same building complex, creating synergies such as shared infrastructure, improved connectivity, and more efficient land use, rather than literally housing residents within a data center facility.
The other telltale sign of a commoditized approach to data centers comes from OpenAI co-founder Sam Altman, who said in a recent interview:
“We’re going to see 10-person companies with billion-dollar valuations pretty soon…in my little group chat with my tech CEO friends, there’s this betting pool for the first year there is a one-person billion-dollar company, which would’ve been unimaginable without AI. And now [it] will happen.”
If both statements become true, imagine the data center requirements. I repeat my gas station analogy.
If we accept Bill Gates' perspective on the survival of certain energy sector jobs and Sam Altman’s prediction about the rise of hyper-efficient, lean companies, then the infrastructure supporting these trends, data centers, will need to evolve dramatically.
The idea is that data centers could become as ubiquitous and commoditized as gas stations. Just as gas stations are scattered throughout our landscapes to provide quick, localized fuel access, future data centers might be deployed in a decentralized manner to meet the explosive demand for computational power.
This transformation would be driven by the exponential growth of AI-driven applications, the need for ultra-low latency processing, and the energy requirements of quantum computing.
With advances in modular design, improved cooling systems, and energy efficiency, smaller data centers could be rapidly deployed in nearly every urban and rural corner, supporting the next wave of technological innovation.
While challenges like regulatory hurdles, cybersecurity, and capital expenditure remain, the convergence of these trends suggests that a commoditized, widely distributed data center model is feasible and likely necessary to sustain the future digital economy.
Building on my gas station analogy, let’s look at power substations. Imagine Chicago’s power grid: the city relies on around 1,300 substations to distribute electricity efficiently across neighborhoods.
These substations act as critical hubs that step down high-voltage electricity from power plants to levels safe and usable by homes and businesses. Now, consider the digital equivalent of these substations, data centers.
As our reliance on digital technologies grows, especially with the advent of AI and quantum computing, we need a similarly robust network to process, store, and distribute data. Just as substations are strategically positioned throughout Chicago to ensure reliable power delivery, data centers need to be widely distributed to meet increasing digital demands.

This analogy suggests that as our energy infrastructure scales up to support a city like Chicago, our digital infrastructure must also expand proportionately, necessitating more localized data centers to ensure low-latency, high-performance computing across the urban landscape.
As we rethink our digital infrastructure, I believe it is ever more important to evolve data centers into a decentralized network as ubiquitous as gas stations. In today's rapidly expanding digital landscape, driven by the exponential growth of AI and quantum computing, the demand for computational power is skyrocketing.
Just as in my example above, where a town with only one gas station for 500,000 people would struggle with supply, relying on a few centralized data centers creates bottlenecks and latency issues.
A commoditized model, where data centers are as common as power substations in a city like Chicago, would distribute computing resources evenly, ensuring ultra-low latency and high-performance processing across both urban and rural areas.
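To make the density analogy concrete, here is a rough sketch. The substation count echoes the ballpark figure above, while the population and per-site capacity numbers are assumptions I picked purely for illustration.

```python
# Back-of-envelope density comparison: substations today vs. hypothetical edge data centers.
# Population and per-site capacity figures are rough assumptions for illustration only.

chicago_population = 2_700_000   # approximate city population
substations = 1_300              # ballpark substation count cited above

people_per_substation = chicago_population / substations
print(f"~{people_per_substation:,.0f} residents served per substation")

# If small edge data centers were deployed at a (hypothetical) similar density:
edge_sites = substations
racks_per_site = 20              # assumed modular footprint per site
kw_per_rack = 10                 # assumed average power per rack

total_capacity_mw = edge_sites * racks_per_site * kw_per_rack / 1000
print(f"Hypothetical edge capacity: {total_capacity_mw:.0f} MW across {edge_sites} sites")
```

The point is not the exact numbers but the shape of the model: thousands of small, evenly spread nodes rather than a handful of giant, distant ones.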
My vision aligns with Bill Gates' perspective on transforming energy sectors and Sam Altman’s prediction of hyper-efficient, lean companies emerging in our digital future.
With modular designs, improved cooling, and energy efficiency, widespread, localized data centers are feasible. They are becoming the lifeblood for sustaining our digital economy, reducing environmental strain, and supporting the next wave of technological innovation.
Leaving you with my perspective
As an AI venture capitalist, I am shaped by my years as a U.S. Marine and my extensive experience in government. These roles have given me a front-row seat to the indispensable role that infrastructure and policy play in safeguarding national security and driving economic resilience.
Today, I stand at the intersection of technology and investment, and from where I see it, the future hinges on bold, strategic moves in three critical areas: next-generation data centers, quantum computing, and advanced energy solutions. These aren’t just trends or buzzwords; they are the pillars of a secure, competitive, and prosperous tomorrow.
Technology alone doesn’t win the day; policy and leadership do. My years in the public sector drilled this into me. I’ve been in the rooms where decisions are made, and I’ve seen how effective collaboration between government and industry can turn ideas into action.
Right now, we need thinking and doing of substance rather than more of the superficial developments we have seen with AI. We need regulatory frameworks that don’t stifle innovation but propel it forward while keeping security and sustainability front and center.
This isn’t about bureaucracy for its own sake. It’s about creating an environment where bold investments can flourish responsibly into technologies of substance, not superficial trends and hype.
Policymakers must work hand in hand with industry leaders to craft guidelines that protect our national interests (think cybersecurity, data privacy, and environmental impact) without slowing the pace of progress. My experience tells me this is possible. When the government and private sector align, the results are transformative. We need that alignment now more than ever.
As an AI venture capitalist, I’m observing these shifts and urging us to act on them. I call on my fellow investors, government officials, and industry pioneers to champion these strategic investments with me.
By rethinking our approach to data centers and advancing the energy grid infrastructure, we are creating the next wave of digital innovation and building a nation that’s secure, competitive, and ready for what’s ahead. I’ve seen what’s possible when we commit to a vision, whether on the battlefield, in the halls of government, or in the boardroom.
Let’s not wait for the world to change around us. Let’s be the ones to drive that change.
You can connect with Paul on LinkedIn and through his CV site.