The History of Great Sky

How NIST incubated the team that will transform AI computing

Jeff Shainline, September 2025


TL;DR

At the launch of Great Sky, we reflect upon the path that brought us here. NIST provided a sustained foundation to conceive of and de-risk this ambitious technology. In an era marked by intense competition between AI companies and nation states, it’s important to remember that exploratory and interdisciplinary research brings long-term benefits. Now it’s time for us to focus on scaling and building great products that solve real problems, and the startup world is the place to do that.

Context

This is a peculiar time in society and in the US in particular. We face immense global challenges and international competition, but our approaches to building big, hard things are not what they were during the glory days following WWII. AI has emerged as a potentially transformative technology, possibly shifting many aspects of how we live and work, but today’s approaches demand massive energy consumption and tremendous expense, and they struggle to understand reality through world models the way humans do. At this crucial moment, when fundamentally different approaches to AI hardware and algorithms could address these impediments, US culture and policy have drifted toward a climate better suited to incremental change than to big, disruptive innovations. This is not to say the AI boom since GPT-3 has been incremental—far from it—but many more disruptions and advances as impactful as transformers and GPUs are yet to emerge.

Great Sky is taking a radically different approach to AI. Our chips are built from circuits that physically behave like the components of neural networks rather than digitally computing the functions that model these components. We use superconducting circuits to implement neurons and synapses with extreme speed and low energy. These neurons are interconnected by optical waveguides that deliver tiny pulses of light for communication to thousands of destinations with single-photon signal levels. Conceiving and de-risking this technology took more than a decade of support from NIST. In this post I’ll describe Great Sky’s origin from fundamental research, how the unique environment at NIST provided a context in which this new approach could originate, and why Great Sky is an example of the multi-tiered innovation engine that keeps America strong.
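To caricature the idea in a few lines of code (this is an illustrative toy model, not Great Sky’s actual circuit design, and every parameter here is hypothetical): each arriving photon deposits a small quantum of signal into an integrating loop, and when the accumulated signal crosses a threshold, the neuron fires a light pulse to its downstream synapses.

```python
# Toy model of an optoelectronic spiking neuron: photons detected per
# timestep add signal to a leaky integration loop; crossing a threshold
# triggers an outgoing light pulse. All values are illustrative only.

def simulate_neuron(photon_counts, quantum=1.0, threshold=10.0, leak=0.9):
    """Integrate per-timestep photon counts; return timesteps that spike."""
    signal = 0.0
    spikes = []
    for t, n_photons in enumerate(photon_counts):
        signal = signal * leak + n_photons * quantum  # leaky integration
        if signal >= threshold:
            spikes.append(t)   # emit a light pulse downstream
            signal = 0.0       # reset the integration loop
    return spikes

# Example: a burst of photon arrivals drives the neuron over threshold.
arrivals = [0, 3, 4, 5, 0, 0, 2, 9, 1, 0]
print(simulate_neuron(arrivals))  # → [3, 7]
```

The point of the sketch is only the qualitative behavior: communication is event-driven and counted in single photons, while computation is local analog integration rather than digital arithmetic.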

A new paradigm

Much has been written about the scaling challenges of current approaches to AI based on the transformer architecture running on GPUs. The symptoms are evident: the latest data centers require gigawatts of power, Nvidia is investing in nuclear energy companies, Microsoft plans to reopen Three Mile Island, and all this still isn’t enough to realize an AI system that can watch a video as fast as a human, let alone understand the content in all its context and complexity. There are reasons for both optimism and caution about the near-term future of AI, yet it’s clear that the benefits of new approaches will be welcome. But that’s not the point of this post. The point is that to do big, hard things, many elements must align to produce the conditions in which tremendous, innovative change can occur. Great Sky was born from those conditions at NIST between 2013 and 2024, and the outcome is an ambitious startup with the potential for a massive impact on the US economy and a crucial advantage in the global AI competition. This is our origin story.

Barriers to building big

The US appetite and patience for taking on massive projects with a mindset of shared objectives has waned. I’m certainly not alone in seeing the need to prioritize building boldly into the future. We need new infrastructure projects for energy, electric cars, transportation, housing, and more. Amazing progress is occurring in AI software, but new chips are not the primary driver of conceptual advances in AI. Nvidia makes one main kind of chip, and for algorithms to be fast, they must stay within the constraints of that chip. Realizing chips that overcome these constraints requires a significant departure from conventional architecture, which is quite a lift due to switching costs for chip designers, foundries, and end users. Nearly all AI research and startups address the software layer within the constraints of pre-existing chip architectures because progress can be made so much faster there. There’s still a ton of room for advancement in software, but dramatic paradigm shifts toward the complexities of brain-like intelligence require hardware innovations. To reach that stage, we need a tectonic shift, like moving from vacuum tubes to transistors.

If this is such a big opportunity, why isn’t everyone rushing to be the first? As they say, hardware is hard. In the domain of computing, addressing the opportunities requires not just knowledge of AI and the operations to be performed, but also knowledge of device physics to design circuits that optimally perform these operations. In my opinion, knowledge of neuroscience and the operations of the brain is also required to leverage principles beyond today’s AI. Hardware improvements also require fabrication, which necessitates a cleanroom. Making chips is expensive and slow, especially when you have to do it yourself because no foundry in the world runs the process to go beyond today’s practice. And making chips also requires complex characterization and measurement capabilities using expensive equipment and hard-earned expertise. Universities and academic researchers may have the will to pursue such endeavors, but they typically don’t have the financial resources or physical infrastructure to support these activities at the scale necessary to surpass current practice. To make matters worse, grad students and postdocs don’t have the time in their positions to master this diversity of subjects before they must pursue the next stage of their career. Nvidia and TSMC, Google and Broadcom, Apple and Meta have the resources for such a massive effort, but the competitive business landscape—especially in the heated AI race—disincentivizes open-ended, exploratory research. There’s little time to be distracted by completely different chips that could bring orders-of-magnitude improvement, but might not.

A home for radical innovation

Broader exploration and innovation require particular conditions, and NIST provided this lush environment. I joined NIST in 2013 as a postdoc funded by the NRC Postdoctoral Fellowship, a federally funded program that gives recent PhD graduates an opportunity to work in national labs. Receiving this fellowship was one of the most fortunate events of my life and gave me two years to earn a place at NIST. I joined the group of Rich Mirin and Sae Woo Nam, a world-class lab working on quantum optics. The goal of my project was to combine my knowledge of integrated photonics with the group’s expertise in single-photon sources and detectors to build the components of photonic quantum computers on a chip. This was before PsiQuantum or Xanadu existed; the group now works closely with both companies to improve and scale up the single-photon detectors used in photonic quantum computers. I spent a couple of years integrating these detectors with waveguides, working on microring-based entangled-pair sources, and thinking through the extreme performance requirements of components for photonic quantum computers. On nights and weekends my thoughts drifted to circuits using the same devices, but operating like neurons in the brain.

From photons to neurons

During my first few years at NIST, I was overcome by the rushing ideas at the confluence of so many technologies. Nobel-prize-winning physicists were building the world’s most accurate atomic clocks down one hallway and superconducting sensors to observe the glow of the early universe down the next. I’d spent my career up to that point harnessing light to move information across computer chips, and now I met folks using superconducting circuits for new kinds of digital systems.

Each day I walked the sunlit hallways and felt the potential for countless careers. All the while, my fascination remained on the subject of building technological minds reaching the physical limits of intelligence. One Monday morning I went into Sae Woo’s office and drew on his whiteboard a circuit based on superconducting single-photon detectors that I was trying to use as a neuron. This wasn’t my assignment. Sae Woo was more interested in quantum optics, but I was stuck on how to increase the output voltage, so I wanted his opinion. He could have said, “Stay on task.” Instead, he said, “You need an nTron right there.”

That was the kernel of our first patent together. Over the coming months, he grilled me hard, challenged every corner of my thinking, questioned every one of my assumptions. But ultimately he let me explore, and it grew into the first paper on the subject of superconducting optoelectronic networks.

The team comes together

Both Rich and Sae Woo continued to support this effort of building something completely new. Resources were scarce, but three young hot-shots got NRC fellowships themselves—Sonia Buckley, Adam McCaughan, and Jeff Chiles. Their contributions were enough to breathe life into the project and led to additional intellectual property.

Crucially, NIST had a cleanroom in which we could build sophisticated prototypes—not foundry-scale processing like TSMC, but a much better fabrication facility than nearly any university has. Around this time, an executive order directed NIST to contribute to American leadership in AI. This turn of events brought some funding to NIST specifically for AI, and while most was allocated to develop “tools in support of reliable, robust, and trustworthy systems that use AI technologies”, a small but meaningful allotment was given by Elham Tabassi to fuel the nascent AI hardware effort. This was enough to bring two more team members on board. Bryce Primavera was a new grad student at CU who reached out to me as a potential thesis advisor, and we had the resources to support his PhD. Rich and Sae Woo brought me into an IARPA program to which they were contributing, which allowed us to bring Saeed Khan onto the team.

The leap to Great Sky

For 12 years, NIST incubated this embryo. Jim Kushmerick worked our blurbs into high-level NIST vision documents. Our team produced nine patents and 23 journal publications. We laid the theoretical foundations and built myriad subsystem prototypes. Not in a sudden flash, but gradually, as I remember it, I became convinced we could build brains the size of the sky, with trillions of neurons and quadrillions of synapses—larger and faster than human minds by many orders of magnitude. These intelligent machines of immense consequence were physically possible, technically feasible, economically viable, and ultimately inevitable. Once you believe there’s a path to construct a mind a million times larger and faster than your own, it becomes difficult to sleep if you’re not doing everything you can to make it happen. NIST was our nest, but when we reached a certain stage, we had to try our wings to see if we could fly into the great sky.

Why public investment still matters

This post is not just a thinly veiled excuse to boast about the bona fides of the Great Sky team. It’s to point out that our existence—the inception of a wildly ambitious, potentially transformative, and indisputably radical company—would not have come to pass without the vibrant intellectual environment at NIST. Decades of investment filled those labyrinthine hallways with deep knowledge and competence. Our team grew with government support from much of the alphabet soup: NIST, DARPA, IARPA, NASA, and NRC.

Now I live in startup land, venture investors are my allies, and my sphere of experience has expanded. In this context, we’re driven to be more efficient and to build products that meet real-world needs. This is where we belong at this stage of our growth, but we couldn’t have started here. It’s curious to make this transition as the zeitgeist becomes more infused with government resentment and NIST faces its first layoffs in decades. These cuts come as part of efforts to reduce the federal workforce, but they strike me as an ineffective means of achieving cost savings, with little return to the taxpayer in exchange for the loss of important scientific functions. As I look back on our formation, it’s clear we wouldn’t be here without federal funding. Now it’s time to scale, and that requires the full engine of industry, but that engine is not where we could have begun.

I don’t point this out as an ideological or political position. It’s just an observation. Don’t read this as an assertion that government-funded research is functioning as a well-oiled machine or that continuing with current practice is adequate. I’m not addressing those issues here. My observation is that ambitious research with a long time horizon was essential to our inception, and NIST enabled that for us. Our technology combines semiconductors, superconductors, and photonics. There are few institutions in the world with comparable cross-fertilization and deep, institutional expertise. Leading chip makers in their heated competition don’t have the luxury of such broad exploration. As our country competes for AI supremacy, I hope we remain aware of the benefits of exploratory investments.

I was very encouraged by DARPA’s recent Spark Tank event. At a time when the government is perceived as stale and calcified—in some cases justifiably—the Director of DARPA’s Microsystems Technology Office, Whitney Mason, is shrewd and nimble. She morphed a conventional DARPA conference into a shark-tank-style competition, a rapid means of assessing and investing in multiple high-upside emerging technologies. In contrast to typical DARPA funding processes—with extensive paperwork and time-consuming reviews—this event got right to the point. At the end of the event, Whitney stood on stage, dressed playfully like the Monopoly Guy—top hat and cane—and handed oversized $400k bills to the competition’s winners. We’re grateful one of the awards went to Great Sky. We just closed a $14M seed round of venture funding, but $400k of non-dilutive capital from one of our nation’s research leaders sure helps us build this ambitious company.

Launching the future

Great Sky is born. NIST was our origin; it supported us from a minuscule grain of an idea through to licensing our IP. NIST remains our partner as we launch into a startup, venturing to build what we think has the potential to be a generational company, at a time and in a domain where deep innovation has historical implications. We thank NIST for all it provided, our new venture partners for fueling the next immense stage, and the ecosystem of discovery that depends on the mutual benefits of the interacting sectors of our society.

Much of the cultural conversation, and the latest language from the White House, emphasizes the significance of winning the AI race. I agree with the sentiment and appreciate the focus on the issue. But the image of a race implies there’s a finish line, a discrete threshold to cross, perhaps not far in the future, when a winner will be determined. Progress is certainly happening fast, and several models are already far better than me across multiple domains. Nevertheless, at Great Sky we’re envisioning systems that dramatically surpass anything on the horizon today – systems so fast and capable they can interact with a million users at once through rich sight and sound and provide guiding insight incorporating context from across society. We must leverage all our intellectual and social resources – earned across the centuries and extending well into the future – to conduct this great project. There is not a finish line.

In the past few months, we designed, taped out, and fabricated our first chip. It’s a modest network that classifies small images, but it works. And it can process sequences at tens of millions of frames per second. That speed won’t decrease as we scale to megapixel video frames processed by 100-trillion-parameter models.

We’re growing our team. Our first hire was Matt Cox, an algorithms engineer designing our networks. We just hired a photonic engineer and a superconducting measurement specialist; both will start September 15th. We’re seeking candidates for algorithms, software development, and a Chief of Staff. We’re buying equipment, building our lab, and generating a force to push this technology forward.

Disclaimer

These are my views, not those of NIST.



Jeff Shainline Boulder, CO, September 2025