December 5, 2022, will be remembered as the day fusion ignition was first verified in a laboratory. Achieved by researchers at the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory, the results were hailed around the world as an early, yet vital step toward harnessing fusion energy—the natural process that powers stars—to power human society.
In the most basic terms, NIF researchers delivered 2 megajoules of laser energy to a tiny capsule of hydrogen fuel and got about 3 megajoules of fusion energy out.
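That ratio of energy out to energy in is called the target gain; ignition corresponds to a gain above 1. Using the round numbers from the article, the arithmetic works out like this:

```python
def target_gain(energy_in_mj: float, energy_out_mj: float) -> float:
    """Ratio of fusion energy released to laser energy delivered to the target."""
    return energy_out_mj / energy_in_mj

# The December 5, 2022 shot, in the round numbers quoted in the article:
q = target_gain(2.0, 3.0)
print(q)          # 1.5 -- above 1.0, the threshold for energy gain
```

The August 2021 shot, by contrast, produced significant fusion burn but a gain still below 1, which is why it wasn't announced the same way.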
What's less well known is that in August 2021, a prior experiment resulted in significant burn of the fusion fuel for the first time. It wasn't announced like the December news because the increase in output didn’t yet exceed the laser energy delivered to the target. But the NIF team—who had been working the problem for nearly a decade—knew they were on the threshold of something big. From that moment on, “our experimental and operations teams kicked it into high gear,” says Phil Adams, chief technology officer at NIF, whose team manages the infrastructure that runs the laser control system and captures the resulting data for analysis.
It's a big job for Adams' tech team, one where the measures of success focus more on reliability and availability than on rolling out dazzling new features. The team deals in precious data: each experiment holds some lesson to be learned. Just imagine if on December 5 scientists had run their fusion experiment, but the databases failed to capture the result.
The NIF is a science facility an hour inland from San Francisco, California, that focuses 192 of the world’s most powerful lasers on a single point to recreate the inner workings of stars and other nuclear reactions. The NIF team must manage controls so that all 192 beams arrive within 10 trillionths of a second, aligned to within 50 microns (half the diameter of a human hair), with the power level and other parameters just right.
Seconds after the shot, data is accessible. But the revealing analysis takes about 20 minutes, says Adams, an 18-year veteran of NIF who’s risen from managing databases to become the facility’s lead IT architect. “That twenty minutes can be unbearable for a scientist who’s been waiting for these results from the time they conceived the experiment months or even years ago.”
The NIF is huge: part cavern, the size of three football fields, which laser beams traverse while gaining energy and focus; and part Star Trek, with a gleaming 10-meter orb where the beams slam into a BB-size target in a finely choreographed implosion.
If you want to push human understanding deeper into the fundamental workings of our universe, the NIF is one of a handful of tools you can use; NASA’s James Webb Space Telescope and the Large Hadron Collider at CERN are other examples. Amid a worldwide push for fusion energy, led by government programs and private investment that Bloomberg News says topped $2.1 billion in 2021, it’s perhaps not surprising that a facility of NIF’s scale was the first to step over this threshold.
For a year and a half following the promising result in 2021, the NIF team continued to hone their approach. A target shot involves more than 60,000 control points, such as motors, sensors, and switches, powered by one of the world’s most sophisticated computer control systems. Operating systems, databases, and millions of lines of code running on over 2,000 servers make it possible to efficiently and reliably fire the NIF laser several times a day. “Our engineers and developers have brought everything together to make a reliable, available, and manageable platform for conducting these fusion experiments,” Adams says.
Each shot, which can produce pressure and heat up to six times that of our sun, tests its own mix of power levels, exotic hydrogen isotopes (such as deuterium and tritium), and diagnostics—10,000 different adjustable parameters—designed to reveal something new about how matter and energy work. Experiments have led to new understanding of brown dwarf stars and black holes and gamma-ray bursts, as well as more academic lessons on the changing opacity of radioactive materials over time as part of research into aging nuclear weapons.
For each experiment, the NIF team starts with parameters from researchers and builds toward the desired outcome. “It’s like a view of the Mona Lisa that’s all pixelated at first,” says Adams. “As we tune the knobs for all the 60,000 control points to match what that scientist wants to do, the picture gets clearer and clearer.”
After the promising shot in August 2021, more fine-tuning ensued. Scientists asked, “What if we turn the gain on the laser up just a hair? What if we alter the design of the target capsules or change the pulse shaping?” Adams explains. “On the IT side, we figured out how to give them a bit more horsepower where they needed it. We paid a lot of attention to the systems that capture, store, and analyze data around target capsule fabrication.” Adams reels off familiar equipment by name, almost like they’re colleagues: “Dante” is an x-ray spectrometer that senses minute temperature changes inside the hohlraum; “nToF” (sounds like “entoff”) measures the neutron yield, ion temperature, and event duration; and “FODI” examines optics for damage, which is critical for high-energy shots. “There's just so many different skills and capabilities that we were able to bring to the table to make ignition happen.”
To accommodate the number of worthy proposals from the world’s researchers, the NIF runs day and night on a tight and continuously adjusting schedule.
Each shot is fired and monitored from a control room reminiscent of a NASA space launch. “The shot itself is only about 20-billionths of a second, but everything has to be performing optimally for that period of time,” Adams says.
It’s a simple idea that belies the constant computer monitoring and tweaking needed to keep the NIF firing. “By having very good monitoring in the environment, you can start to get a sense of the heartbeat of the NIF,” says Adams. “We’re so tightly coupled to the way the laser functions that we’re able to leverage system information to visualize issues. If there’s a capacitor that’s not charging fully, or a motor is starting to generate a few extra logs, we get an indication that a particular system requires a deeper look.”
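The “heartbeat” idea — flagging a component whose log volume drifts well above its historical baseline — can be sketched in a few lines. This is a generic illustration with invented component names, not NIF’s actual tooling:

```python
import statistics

def flag_noisy_components(log_counts: dict[str, list[int]],
                          sigma: float = 3.0) -> list[str]:
    """Return components whose latest log count sits far above its baseline.

    A crude version of 'a motor generating a few extra logs means
    that system requires a deeper look'.
    """
    flagged = []
    for name, counts in log_counts.items():
        history, latest = counts[:-1], counts[-1]
        mean = statistics.mean(history)
        stdev = statistics.pstdev(history) or 1.0  # guard against zero spread
        if latest > mean + sigma * stdev:
            flagged.append(name)
    return flagged

# Invented per-hour log counts; the last entry is the current reading.
history = {
    "capacitor_bank_07": [4, 5, 4, 6, 5, 5],   # steady
    "target_motor_112":  [2, 3, 2, 3, 2, 40],  # sudden spike
}
print(flag_noisy_components(history))  # ['target_motor_112']
```

Real deployments would use proper time-series baselines and seasonality, but the principle — correlate deviations in machine telemetry to direct human attention — is the one Adams describes.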
The computing work is done in a vast data center built around the principles of unbroken availability and high data throughput rates. “If you look across our environment right now, we’ve got 2,000 virtual machines running Oracle Linux and Microsoft Windows on Oracle Virtualization,” he says. “It’s all managed by Oracle Enterprise Manager.” NIF pumps out on average about 50 terabytes a year of both machine data and the data coming from the experiment itself. In terms of assuring NIF reliability, Adams’ team needs to see the entire machine as an ecosystem, not just bits of data about all those 60,000 control points.
“Our architecture gives us the ability to pull a lot of metrics about a running machine, especially around the context of an Oracle Database. We take that data and correlate that with log data, and we get a composite view of the full environment,” says Adams. “That's just invaluable as we assess how the environment is performing and work through any operational issues in terms of facility uptime or maintenance cycles and, of course, cybersecurity.”
Then it’s time to run another high-stakes experiment. “Each time a laser’s fired there’s a lot of nonrelational object data that gets produced by these scientific instruments,” he says. “Analysis systems pick up the experiment data and then generate representations of either X-rays, plasmas, or other phenomena, and we store that all in an Oracle Database.” These are used for in-database analytics or for sharing with other specialized software.
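The pattern Adams describes — instruments emit raw object data, analysis systems derive values from it, and both land in a relational store for in-database analytics — can be sketched generically. The schema, shot ID, and instrument dump below are invented, and Python’s built-in sqlite3 stands in for Oracle Database purely for illustration:

```python
import sqlite3

# In-memory database as a stand-in for the real relational store.
db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE shot_data (
        shot_id     TEXT,
        instrument  TEXT,   -- e.g. 'Dante', 'nToF', 'FODI'
        payload     BLOB,   -- raw object data from the instrument
        summary     REAL    -- a derived scalar for in-database analytics
    )
""")

# Hypothetical capture step: store the raw bytes plus a derived value.
raw = bytes(range(16))  # placeholder for an instrument's binary dump
db.execute("INSERT INTO shot_data VALUES (?, ?, ?, ?)",
           ("SHOT-0001", "nToF", raw, 1.5))

# Downstream analytics read the derived columns without re-parsing blobs.
row = db.execute("SELECT instrument, summary FROM shot_data").fetchone()
print(row)  # ('nToF', 1.5)
```

Keeping a queryable summary alongside the opaque payload is what lets analysis run inside the database rather than in every client tool.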
To consistently meet the needs of NIF researchers, Adams’ IT team of about 30 people is constantly on the lookout for an edge. For that, they maintain a close relationship with vendors such as the teams at Oracle that produce NIF’s database, virtual machines, IT management software, and Linux operating system. “The engineering support we received enabled us to resolve issues and keep the NIF firing,” says Adams.
For its part, Oracle tries to learn from the NIF’s ambitious use of its software. “Their operation is built around database performance, so they’re willing to test new features if they think it can help their mission,” says Honglin Su, vice president of product management for Oracle Linux and Virtualization. “We sometimes share early versions of upcoming releases to learn from their experience using it.”
For Adams and his team, the focus is always to help researchers meet their goals. “You’ve got to make sure that if plan A takes a hit, you can pivot very quickly into plan B, plan C,” he says. “We have solid partners like Oracle and a methodology of doing failure modes and effects analysis to assess what can go wrong and deliberately switch to an alternate running state that keeps the science going.”
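Failure modes and effects analysis (FMEA), the methodology Adams names, is a standard reliability technique: score each potential failure for severity, likelihood of occurrence, and difficulty of detection, multiply the scores into a risk priority number, and work the list from the top. A minimal sketch, with invented failure modes and scores:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int    # 1-10: impact if it happens
    occurrence: int  # 1-10: how likely it is to happen
    detection: int   # 1-10: how hard it is to catch beforehand

    @property
    def rpn(self) -> int:
        """Risk priority number: higher means address sooner."""
        return self.severity * self.occurrence * self.detection

# Invented examples, not NIF's actual analysis.
modes = [
    FailureMode("database node loss during shot", 9, 2, 3),
    FailureMode("diagnostic sensor dropout", 6, 4, 5),
    FailureMode("stale monitoring dashboard", 3, 5, 2),
]

for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{m.rpn:4d}  {m.name}")
```

Ranking by RPN is what turns “what can go wrong” into the ordered set of plan B and plan C states Adams describes switching into.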
Uptime, says Adams, is the name of the game. “The human challenge of being able to create something this big and this precise and being able to repeat it over and over again and continually hit the milestones and benchmarks that it does is a thrill,” he says.
In December 2022, a year and a half after that promising shot, and after so much tinkering and fine-tuning with one of the world’s most powerful, one-of-a-kind machines, fusion ignition happened. Now, says Adams, it’s time for the world’s engineers and physicists to take it from here. “We had to make sure that we proved the physics works, the math works,” he says. “Now industries and engineers will come together” to build on the NIF’s foundation. “That’s how you get from the first plane the Wright brothers flew to today.”
The NIF, of course, will continue to lead—moving, if possible, even faster. “I think we’ll see an increase in complexity of the type of shots and the demand to bring this energy, temperature, and pressure to bear on every type of experiment,” Adams says. “Right now, we put two [megajoules] in and got three out. Who’s to say we can’t put two in and get 200 out?”
Oracle database services and products offer customers cost-optimized and high-performance versions of Oracle Database, the world's leading converged, multi-model database management system, as well as in-memory, NoSQL, and MySQL databases.
A highly performant and secure operating environment, Oracle Linux delivers virtualization, management, automation, and cloud native computing tools, along with the operating system, in a single, easy-to-manage support offering.