The PUREST Water in the World
Ultrapure water for semiconductor manufacturing
I have been asked what Ultrapure water tastes like and whether it will slowly kill you or something. It’s water. By the time you even look at it, it’s no longer ultrapure.
It is the purest water you will ever know. And every day, chip factories are sloshing their wafers with it.
Ultrapure water, or UPW, is an industry term, and a descriptive one: water with purity requirements so strict that you're more likely to win the national lottery than to find a non-water molecule inside it.
Companies have contorted themselves into pretzels making ultrapure water. And the bar keeps getting higher year after year. How pure can you possibly get?
In this short video, we are going to look at how semiconductor companies make the world's purest water.
Ultrapure water is very pure water. I don't really have any other way of explaining it. It is used in a few industries with high quality control standards.
For instance, the pharmaceutical industry uses it to prepare injectable and inhaled therapies. Though I should note that while the term is the same, the purity standards differ between the two industries.
But it is in the semiconductor industry where the chemical is most widely used. Fabs use ultrapure water to rinse wafer surfaces, to dilute certain chemicals, and for immersion lithography.
The average fab uses about 2,000 gallons of ultrapure water each minute, 2-3 million gallons each day. Each wafer uses about 2,200 gallons of water throughout its production lifecycle - spending several hours at a time getting dunked in water-rinse systems.
Why It Matters
Why does water purity matter? Because impurities can affect the wafer in a number of ways.
They can land in between the lines on the wafer design and cause electrical short circuits;
Or they can land on the photoresist during the lithography phase and cause a projection error on the wafer - kind of like how a piece of dust on your camera lens casts a shadow.
Or they can cause little, nanosecond-long static shocks that affect the wafer's design.
All of this affects the wafer yield. Defects from killer particles historically have interfered with the yield learning process - the work of getting from 50 to 90 percent in yield. It was a critical problem for Intel scaling up its 14 nanometer process node.
What's in the Water?
With this in mind, ultrapure water producers classify the impurities they can expect to find and remove into several categories.
First are the particles. This is the hardest part of the whole process. Any item can shed particles, especially if they are being shaken, heated, or cracked.
Pipes and tubing are constantly shedding particles into flowing water - with random bursts that drive everyone crazy.
Second, bacteria. Keeping bacterial colonies under control has historically been difficult because, well, they grow. Modern cleanroom procedures however do a good enough job, and today's companies mostly only have to deal with such colonies in brand new facilities.
Third, you want to remove total organic carbon. It serves as food for bacterial colonies - see the preceding point - and can leave unwanted residues on the wafers.
Fourth, silica, or silicon dioxide. The compound is abundant and can be found in either dissolved or discrete form. The latter is particularly detrimental to wafer manufacturing.
And finally, there's all the other stuff: Metals, ions, and dissolved oxygen. Metals can diffuse into the silicon surface and cause issues. And dissolved oxygen in particular can make bubbles and oxidize various parts of the wafer.
Of course, removing every last thing from the water is impossible - especially at industrial scale. But reach for the stars, right? What producers can definitely do is filter out everything that can be measured with existing equipment.
Even back in the distant past of 2001 - when the leading edge process node was 150 nanometers - the limits for ultra-purity approached the cliffs of insanity.
In 2001, the definition for ultrapure water set the killer particle size at about 65 nanometers - roughly half the feature size.
That's about the same diameter as the COVID virus, and 1,500 times smaller than the diameter of a human hair. Each liter of ultrapure water may contain only 20 particles of that size.
For boron, sodium, potassium, and chloride ions, the expectations are even lower - from 5 to 50 parts per trillion.
For comparison, the odds of winning the jackpot in the UK National Lottery are 45,057,474 to one - about 22 parts in a billion, or 22,000 parts in a trillion.
You are some 450 times more likely to find a winning jackpot ticket than to come across a sodium ion in semiconductor-grade ultrapure water.
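The arithmetic behind that comparison is worth a quick sanity check. The lottery odds are from the text; the 50-parts-per-trillion figure is the upper end of the ion spec range above:

```python
# Convert the UK National Lottery jackpot odds into parts per trillion,
# then compare against a 50-parts-per-trillion ion specification.
jackpot_odds = 45_057_474                 # odds of winning: 1 in 45,057,474
ppt = 1e12 / jackpot_odds                 # probability as parts per trillion

print(f"{ppt:,.0f} parts per trillion")   # roughly 22,000 ppt, i.e. ~22 ppb
print(f"{ppt / 50:,.0f}x more likely")    # vs. a 50 ppt spec: ~440-450x
```

Against the tighter 5-ppt end of the range, the ratio would be ten times larger still.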
And remember, these were the purity standards from back in 2001. As the leading edge advances, the standards continue to ratchet down. More on that later.
Getting Raw Water
So how do you get to such insane purity levels? Modern ultrapure water systems have three main stages - which in turn have many little sub-stages of their own - as well as a distribution piping system connecting them.
You start with raw water, drawn from sources like the city water system, desalination plants, natural bodies of water, or an in-house recycling system.
The raw water source can make a big splash. In one infamous example in 2001, Intel suffered an incursion of organic carbons into their ultrapure water. They traced the source to 100-500 pounds of urea amidst 6 million pounds of raw water, or 0.006%.
The urea persisted through the filters into the ultrapure water at about 15 parts per billion. Again, lottery jackpot odds. But this number was still too high. Because when the urea was exposed to DUV light during the lithography phase, it decomposed into ammonia and created errors.
The urea had always been there. Probably due to runoff from nearby farms. But it occurred in unusually high concentrations in 2001 because the rains that summer were at a 20-year low, preventing dilution.
The first thing to do with the raw water is to put it through the pre-treatment stage. This is where machines and filters remove the majority of the water impurities - suspended solids, chlorine, and other items.
Generally these systems make extensive use of reverse osmosis - where pressure pushes water through a semi-permeable membrane. This removes the vast majority of contaminants, including organics, silica, salts, particles, bacteria, and viruses.
After that, the water is degassified and then hit with UV rays. The UV breaks the organics down, dissociating them into smaller compounds that downstream steps can remove.
Then the water is sent into the primary stage. This is often a deionization step - sometimes also referred to as ion exchange - where you remove as many ions as you possibly can.
After this, the water is degassified again and then goes through a polishing loop, which maintains the circulating water and keeps its quality high.
A critical feature is to keep a high flow rate of water through the pipes. A stagnant pipe can allow bacteria to grow, creating biofilms that mess with wafer quality.
I do want to note that semiconductor fabs generate their ultrapure water in-house. So setups and definitions can vary from fab to fab - combining, re-arranging, or repeating several processes.
For instance, some Japanese fabs include another round of reverse osmosis along with the ion-exchange step inside the primary stage.
Despite their diversity, almost all ultrapure water systems share the same processes: Ion exchange, Reverse osmosis, degassification, and UV irradiation.
Monitoring and filtering nanoparticles is difficult and is especially tricky within a solution rather than on a surface. How do you remove something you cannot easily detect? How do you close this gap in measurement capability - referred to as a “metrology gap”?
The most common thing engineers do is measure the water's electrical conductivity or resistivity. The two are reciprocals of each other, and the semiconductor industry tends to measure the latter.
Absolutely pure water at 25 degrees Celsius has a conductivity of 0.05501 microsiemens per centimeter, or a resistivity of 18.18 megohm-centimeters. This is the benchmark people are going for.
Contamination changes the water's physical attributes. For instance, sodium and chloride ions at a level of 0.1 parts per billion lower the water's resistivity from 18.18 to 18.11 megohm-centimeters.
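Because the units line up neatly, the conversion between the two measures is a plain reciprocal. A minimal sketch, using the benchmark numbers from the text:

```python
def resistivity_megohm_cm(conductivity_us_cm: float) -> float:
    """Resistivity in megohm-centimeters from conductivity in microsiemens/cm.

    The units are chosen so the conversion is a plain reciprocal:
    1 / (uS/cm) = MOhm*cm.
    """
    return 1.0 / conductivity_us_cm

def conductivity_us_cm(resistivity_mohm_cm: float) -> float:
    """The inverse conversion: conductivity in microsiemens/cm."""
    return 1.0 / resistivity_mohm_cm

pure = resistivity_megohm_cm(0.05501)          # theoretical pure water at 25 C
print(f"{pure:.2f} MOhm*cm")                   # ~18.18 MOhm*cm

# The 0.1 ppb contamination example above, expressed as conductivity:
print(f"{conductivity_us_cm(18.11):.5f} uS/cm")
```

Even that tiny shift from 18.18 down to 18.11 shows up as a measurable rise in conductivity, which is why probes can flag contamination in real time.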
So metal probes are installed throughout the ultrapure water piping system to provide real-time, continuous monitoring of water quality at each point in the system.
In the case of one Reverse Osmosis water filtration module, measurements show the resistivity slowly approaching the 18.18 number as it gets processed over time in the RO system.
The Big Metrology Gap
To measure the amount of particles in the water, companies shine a laser through it. The particles scatter the light, and instruments measure that scattering. As the leading edge advanced, companies had to use increasingly powerful lasers with shorter wavelengths.
But these tools were most effective for particles sized at about 30-100 nanometers. Once the killer particle size limit ratcheted down to 20 nanometers - a limit we hit roughly ten years ago - engineers realized that no tool existed that could consistently detect sub-10 nanometer particles in low quantities.
That doesn't mean they are totally blind. Existing tools can still detect these tiny, tiny particles - but only with lower detection efficiencies, or at higher particle concentrations.
Companies are investing in alternatives, and there are some promising candidates. Dynamic light scattering is an interesting one that measures the intensity fluctuations of light as it travels through the sample.
The problem is that the killer particle size limit is shrinking faster than these measurement technologies can advance. For now, this metrology gap cannot be closed with better instruments alone.
Today, the leading edge semiconductor industry standard is 10 particles larger than 3.5 nanometers for each milliliter of water at the point of entry into the tool. This is far beyond what's measurable. Thus, companies have resorted to a form of proactive risk control.
One example is the use of a power-law correlation to project the contamination level at the target size. First, they measure how many larger particles there are in the water further upstream in the filtration system.

For instance, maybe they find 30 particles of 50-nanometer size upstream. Based on the extrapolation, those 30 particles might imply 10 particles of 5-nanometer size.
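A minimal sketch of that kind of extrapolation, using the illustrative numbers above. The power-law form and the fitted exponent here are assumptions for illustration only - real fabs fit the correlation to their own filtration data:

```python
import math

def fit_exponent(d_ref: float, n_ref: float, d_tgt: float, n_tgt: float) -> float:
    """Solve for k in N(d) = n_ref * (d / d_ref)**k through two (size, count) points."""
    return math.log(n_tgt / n_ref) / math.log(d_tgt / d_ref)

def project_count(n_ref: float, d_ref: float, d_tgt: float, k: float) -> float:
    """Project the particle count at target size d_tgt from a reference measurement."""
    return n_ref * (d_tgt / d_ref) ** k

# Illustrative numbers from the text: 30 particles measured at 50 nm upstream
# are taken to imply about 10 particles at 5 nm.
k = fit_exponent(50, 30, 5, 10)
print(f"fitted exponent k = {k:.2f}")

# With that correlation in hand, project down to the 3.5 nm spec size:
print(f"projected count at 3.5 nm: {project_count(30, 50, 3.5, k):.1f}")
```

The point of the exercise is that the unmeasurable small-particle population is inferred from the measurable large-particle one, so the whole scheme is only as good as the fitted correlation.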
This practice is paired with rigorous maintenance of the high-efficiency filters and of known particle-shedding materials, like the resins used in the ion-exchange system.
The SEMI organization publishes standards for all of this - and serves as the source for much of this video - with SEMI C93 covering this particular topic.
Companies have been using this "proactive risk control" practice in their ultrapure water systems for over ten years now. Yields haven't crashed, so it seems to have worked.
In recent years, a new challenge has arrived as fab engineers now have to deal with nano-sized particle contamination on the wafer surface. A similar metrology gap, now applied to other parts of the manufacturing footprint.
In the process of researching this video, I read through entire books about water - written by teams of people. It is amazing to think that a significant number of people have spent their entire professional lives studying the physical traits and properties of something as seemingly mundane as really, really pure water.
It is thanks to them that we can build the cutting edge technologies we have today. They're the real MVPs.
I hope you enjoyed this video. I hope you think of it next time you’re in the tub and bathing yourself with that warm water full of killer particles, silica, urea, ions, bacteria and total organic carbons. Yeah. So relaxing.