Microfluidics and the Elusive Lab-on-a-Chip
Microfluidics describes a portfolio of technologies that move and process fluids through channels less than a millimeter across.
Its study has led to the creation of two multi-billion-dollar businesses today. Try to guess what they are.
One of the science’s big dreams has been to leverage these technologies to radically miniaturize and encapsulate the laboratory: the Lab-on-a-Chip.
People have been pouring a lot of time and resources into the space. There are thousands of papers published on microfluidics. And it does offer tantalizing promise.
But it is also littered with many commercial failures ... and one very big fraud.
In this video, we take a look at the microfluidics industry, its pursuit of a mythic ideal, and why that ideal has so far eluded it.
Let us start by talking about the special science behind fluids at the micro-level. Why do people care? There is still so much we do not know. But what we have learned already offers truly exciting possibilities.
For instance, there are two ways fluids can flow through a channel. First, a fluid can flow in a layered, or laminar, way. The streams run parallel to one another, so there is little mixing between them.
Then there is turbulent flow, where inertial forces cause the fluid to move in a near-random manner. Turbulence tends to hurt reading accuracy, so designers try to avoid it.
At very small dimensions - channel widths of micrometers to hundreds of micrometers - liquid flows tend to be laminar. So when two fluid streams join together in a micro-channel, they end up flowing side by side.
This makes mixing fluids a little more complicated - new structures have to be introduced, and mixing relies on diffusion - but the upside is that liquid flows and pathways are very predictable.
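Whether a flow runs laminar or turbulent is captured by the Reynolds number, the dimensionless ratio of inertial to viscous forces; below roughly 2,000, flow stays laminar. A back-of-the-envelope sketch in Python (the channel sizes and speeds are illustrative, not taken from any particular device):

```python
def reynolds_number(density, velocity, length, viscosity):
    """Re = rho * v * L / mu -- ratio of inertial to viscous forces.

    density in kg/m^3, velocity in m/s, characteristic length in m,
    dynamic viscosity in Pa*s.
    """
    return density * velocity * length / viscosity

# Water in a 100-micrometer microchannel flowing at 1 mm/s:
re_micro = reynolds_number(1000.0, 1e-3, 100e-6, 1e-3)  # ~0.1 -> deeply laminar

# Water in a 1-centimeter pipe flowing at 1 m/s:
re_macro = reynolds_number(1000.0, 1.0, 1e-2, 1e-3)     # ~10,000 -> turbulent
```

At micro scales the Reynolds number is so far below the laminar threshold that turbulence simply never appears, which is why the side-by-side streams described above are the norm.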
Interactions like these give microfluidics special mechanical advantages, in addition to the substantial economic advantages of shrinking and integrating discrete components at scale. This is why people have consistently considered microfluidics a top-10 revolutionary technology candidate.
This is a massive field. There have been over two thousand papers published on the subject over the past 20 years. It is truly sprawling, which makes it pretty hard to corral into a simple video.
What lies up ahead is a golden path, but keep in mind that there are many other interesting tidbits inside the woods beyond.
Researchers have been studying the behaviors of micro-sized amounts of liquids since the 1950s for inkjet technology.
The emergence of the inkjet printer represents the first real multibillion-dollar success in the microfluidics industry. But a number of other market and technology trends supercharged microfluidics research and pushed it towards the biology and health fields.
First, semiconductor manufacturing. As technologies like photolithography and chemical etching methods gained prominence, scientists started applying these methods to different science disciplines.
For instance, applying them to the world of mechanical engineering would give us MEMS - mechanical structures etched in silicon. And applying them to photonics would give us silicon photonics. Both fields have given us billion-dollar products.
MEMS techniques would in turn give birth to the modern microfluidics industry - as well as a cousin, BioMEMS. In the late 1970s, scientists explored using MEMS techniques to craft small structures in silicon for manipulating and analyzing liquid chemicals.
In 1979, researchers created the first microfluidic device: using photolithography, they fabricated a miniaturized gas analysis platform on a single silicon wafer. It didn't work, but it was a breakthrough.
To manipulate fluids across the entire system, scientists developed a portfolio of microfluidic devices. Working together, these micro-sized pumps, mixers, concentrators, valves and channels can manipulate fluids as needed.
Our understanding of these effects would lead to the industry's second - and probably biggest - commercial success yet: Test strips.
Sometimes also called "lateral flow assays".
I have seen it spelled - or misspelled? - as "test stripes" as well. Potato, potahto.
Paper test strips have been around since the 1960s. The first ones were for diabetes. You put a drop of blood on it, waited for it to react with the chemicals on the paper, and then reviewed the color of the paper after washing.
That’s fine. But in the mid-1980s, the industry commercialized capillary-driven test strips. These paper strips are made up of structures called fleeces that use the capillary effect to draw liquid across the strip.
The capillary effect is a long-known effect where liquids can be drawn up into a thin tube without the need for pumps or external power sources.
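The rise height is described by Jurin's law, h = 2γ·cos(θ) / (ρ·g·r): the thinner the tube, the higher the liquid climbs, which is exactly why sub-millimeter strip structures can pull fluid along with no pump at all. A rough sketch, using textbook values for water wetting glass:

```python
import math

def capillary_rise(surface_tension, contact_angle_deg, density, radius, g=9.81):
    """Jurin's law: h = 2 * gamma * cos(theta) / (rho * g * r).

    surface_tension in N/m, density in kg/m^3, tube radius in m.
    Returns the equilibrium rise height in meters.
    """
    return (2 * surface_tension * math.cos(math.radians(contact_angle_deg))
            / (density * g * radius))

# Water (gamma ~ 0.0728 N/m) fully wetting glass (theta ~ 0 degrees):
h_1mm   = capillary_rise(0.0728, 0.0, 1000.0, 1e-3)    # ~15 mm in a 1 mm tube
h_100um = capillary_rise(0.0728, 0.0, 1000.0, 100e-6)  # ~150 mm in a 100 um tube
```

Note the inverse scaling: shrink the radius tenfold and the rise height grows tenfold, so capillary forces that are negligible at macro scale dominate at micro scale.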
Different types of fleeces can be used to different effect. For instance, a separation fleece separates out blood cells. A reaction fleece contains suspended chemicals that do the testing - typically immobilized antibodies or the like that bind to the desired target.
These disposable, fast-reacting strips have been incredibly valuable as a diagnostic platform - particularly in the diabetes or pregnancy test markets. They are the other big commercial success to come out of the microfluidics space. The glucose test strip market alone is worth over $2 billion each year.
They are the gold standard in cheap and versatile diagnostics. They are largely self contained and do not need an external power source. This makes them ideally suited for low budget or more rugged situations.
I know. Your standard over-the-counter pregnancy test or blood glucose test might seem to be the farthest thing from a leading-edge 5 nanometer semiconductor. But I would indeed call them distant relatives!
The Next Step
For all their upsides, capillary-driven test strips aren’t perfect. One downside is that they cannot perform the full array of laboratory actions.
For instance, mixing. The liquid flows inside capillaries are laminar, which as I previously said means that they do not mix very well. If you want to mix things together then you have to actively intervene, which adds complexity and costs.
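How slow is diffusion-only mixing? The characteristic diffusion time scales with the square of the distance, t ≈ L²/2D, so even small channels take seconds to mix on their own. A sketch using a textbook diffusivity for small molecules in water (the distances are illustrative):

```python
def diffusion_time(distance, diffusivity):
    """Characteristic time t ~ L^2 / (2 * D) for a molecule to
    diffuse a distance L (in m), given diffusivity D (in m^2/s)."""
    return distance ** 2 / (2 * diffusivity)

D_SMALL_MOLECULE = 1e-9  # m^2/s, typical for small molecules in water

t_100um = diffusion_time(100e-6, D_SMALL_MOLECULE)  # ~5 s across 100 um
t_1mm   = diffusion_time(1e-3,   D_SMALL_MOLECULE)  # ~500 s across 1 mm
```

The quadratic scaling explains both the problem and the workaround: passive diffusion across even a millimeter takes minutes, but folding or splitting the streams to shorten the diffusion distance cuts the time dramatically - at the cost of extra structures on the chip.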
Another downside is that the readings lack pinpoint accuracy - because most of the time the result comes up as a simple color on a strip.
The vision was to use what we know about microfluidics to integrate together all the parts and performance of a wet bench, creating what is called a lab-on-a-chip.
With a lab-on-a-chip, we can take the test strip concept a step further to make it possible to run a variety of tests on small fluid samples right at the point of care. They would also be far cheaper to produce and much easier for untrained personnel to use.
Here’s an example use case. Over 40 million people around the world are HIV-positive. About 90% of those people have never even been tested for the virus before. Nor are these people able to receive accurate counts of the CD4 lymphocytes in their blood to track the infection's progression.
CD4 lymphocyte counts are most accurately tested using a procedure called flow cytometry. This fancy sounding technique shines a light through a fluid sample and uses photodetectors to count particles through measurements of scattered light.
I recall a derivative of this technique - called dynamic light scattering - being used to count particles and extrapolate purity in ultrapure water production.
Go check out that video if you're interested.
Anyway, this procedure is the gold standard. But it requires a fully equipped laboratory and trained technicians. Imagine if we could shrink the whole thing down to a single piece of silicon? Delivering proper health care would be immensely easier.
Building off the concept of the test strip, you start with a sample. That can be blood, tissue fluid, urine, milk, water, sludge, or what have you.
Most samples have to be prepared before they can be introduced to the lab-on-a-chip. For example, you might need to break down a cell before running a test - a procedure known as lysing - or to amplify a DNA sample using the polymerase chain reaction.
After that, the prepared sample's fluids are guided around the chip through micro-channels. This is where you need to perform laboratory actions like mixing, separating, or filtering.
For instance, mixing the prepared sample with various reactants so as to produce a result. The result is then interpreted and delivered to the end user.
Figuring out how to accurately perform these actions without requiring outside intervention has been one of the bigger challenges in the Lab-On-A-Chip industry.
What actually will go onto the lab-on-a-chip can be as varied as all the thousands of papers being published about it. It really does depend on the goal procedure.
However, we can broadly lump the lab's components into three separate categories: Actuators, sensors and circuits.
The actuators help produce the electrical or mechanical microfluidic forces that circulate the fluid samples around the lab.
The sensors measure the electrical, optical, magnetic, or thermal traits of the targeted samples. These can be performed by micro-electrodes or optical sensors.
Then you have the electronic circuits, which provide support. They can help amplify the signals or reduce noise to help make the test's results more clear. They also produce the read-out, coordinate the system's actions, and cook your mom dinner.
The microfluidic industry has been able to develop a series of ingenious techniques to achieve certain laboratory results right on the chip.
For instance, let's take the simple act of separating out cells in blood. This action can be useful for diagnosing sickle cell anemia, discovering parasites, or finding cancer cells.
First, we have dielectrophoresis, part of the dark arts of using electric fields to manipulate liquids - a field sometimes called digital microfluidics. In this case, you can use it to separate cells out of a salty medium. Scientists have used this to pull a parasite out of a blood sample.
It works but is slow and needs an external power source. Some actions require more power than others, but too much power runs the risk of damaging the chip itself.
Then there is Deterministic Lateral Displacement. You take a laminar flow of a fluid known to contain particles of different sizes.
Then you put down an array of obstacles in that flow. The flow separates into routes around the obstacles. And the remarkable thing is that all particles of a certain size will always choose the same route.
Accidentally discovered in 2004 by Lotien Richard Huang, the physics behind this method is apparently hard to describe. It works very fast and needs no power of its own, but is prone to clogging.
The list goes on and on. We have microfluidic structures designed to heat up DNA samples so as to drive the polymerase chain reaction. Light scattering effects. And optical tweezers. I love all of this stuff, but sadly we have to move on.
What the microfluidic community has been able to achieve so far with their technology is remarkable. There is no doubt of this. But we also have to acknowledge that it all falls far short of a fully functioning lab.
One appealing part of the concept is end-to-end integration. But right now a whole lot still needs to be done off-device.
In a previous video, I spoke about the challenge of packaging a MEMS sensor so that it could deliver an accurate result without causing damage to the system itself.
Labs-on-a-chip face a similar "real world to chip world" interface problem. How do we transition a sample from the real world to the chip?
I briefly mentioned preparatory procedures for fluid samples. Sure, the chip might be able to separate some cells out of blood. But can it also dilute or incubate? Prepping a sputum sample for a respiratory disease test requires vortex mixers, strainers, and a centrifuge. Without these, the lab-on-a-chip is not complete.
Just FYI. I do note that centrifuges based on microfluidic principles do exist. There is a cool one made in Japan that can do up to 100 crystallization experiments in parallel. It works, but it's not suitable for integration into a complete lab yet.
Systems for doing these preparation steps in turn need auxiliary equipment like microscopes, fluid pumps, computers, and spectrometers. You end up needing the lab anyway!
Thus the frequently cited joke - "It's not a lab on a chip. It's a chip in the lab!"
The technical challenges can probably be overcome, but will it be good and affordable enough to succeed in the highly regulated, very competitive health industry?
Such product development needs money and time. Universities are not really suited to do this. Research grants do not last long enough or offer enough money to fund such efforts.
And would winning the market even be worth it? While the $200 billion laboratory industry has a large Total Addressable Market, or TAM, the market for an individual test isn't particularly large.
The entire blood glucose monitoring industry - probably the largest single market, and one that includes more than just the test strips - is estimated to be worth some $10 billion in 2022. That's about the same size as the Apple AirPods market.
So you need to bundle together a bunch of tests. Not just a few dozen but hundreds. To tame this task, people have introduced the microfluidic large-scale integration platform, a riff on the VLSI paradigm that helped make semiconductor design so successful.
It uses combinations of micro-valves to build up more complex units like pumps and mixers, making continuous processing possible. The downside is, again, the aforementioned sampling and preparation problem. Not to mention the Theranos problem, where you need far more than a drop of blood. Woof.
A more pragmatic solution would be to integrate microfluidic principles into existing commercial products. For instance, the i-STAT device by Abbott is a point-of-care blood analyzer that provides lab-quality results in minutes.
The tool uses electrochemical detection principles to do its blood analysis tests. But newer versions integrate microfluidics to reduce the amount of blood required and add additional functionality.
The whole kerfuffle with Theranos has been a very public failure for the lab-on-a-chip industry - Theranos claimed its fraudulent product was built on this very technology. Investors now ask, "How is this different from Theranos?"
The lessons people should draw from the Theranos debacle are related to white collar crime, the culture of founder worship in startups, and the importance of peer review.
I do not think it is an indictment of microfluidics and the lab-on-a-chip industry in general. But there is a reason why millions of dollars and decades of study haven't yet brought out a competent candidate. The lab-on-a-chip itself remains elusive.