If you’ve been on the internet (ever), you’ve probably seen this term being thrown around a lot:
When you hear that buzzword, what comes to mind? Let me guess — maybe you thought of big-hitters like AI, alternative energy, or even colonizing the final frontier with space exploration. At least, those were some of the highly publicized, controversial, and obvious ones.
But what about the one that’s staring you in the face? No, I’m not talking about the consumer electronics revolution that was responsible for your handy-dandy PC or phone. I’m talking about the thousands of LEDs and electronic light sources that power your screen, and the billions more that you might not even notice all around you.
From lightbulbs to neon signs, to lasers built to shoot planes straight out of the sky — every time you’ve ever experienced light, you can thank the (insanely) underrated field of photonics.
Well, even if photonics is underrated today, it feels like the tip of a massive iceberg, a lot like the revolutions that came with factories, cars, and computers.
Out of its many applications, and the countless more we haven’t even discovered, some of the most interesting ones could impact medicine. And no, I’m not talking about getting more vitamin D from the sun.
So that begs the question, where in the world does the field of photonics intersect with medical imaging? The short answer is a lot of places.
Medical Imaging 101:
It all started with a “little” discovery. To be more specific, this massive technological revolution began with a finding that dates all the way back to 1880 — by some guy named Alexander Graham Bell (you might’ve heard of him before).
Bell was experimenting with transmitting sound over long distances through a device he called a “photophone.” Along the way, he observed that light absorbed by an object could heat it up quickly, make it expand, and produce sound waves in the process. Now, we just call that the photoacoustic (or optoacoustic) effect. Thanks for everything, Bell!
You see, the actual concept of photoacoustics had already been around for ages — the way we leveraged it ended up being the breakthrough. But before we get into the nitty-gritty of photoacoustics, let’s take a moment to think about how other forms of medical imaging work.
Take this as a rule of thumb:
“Every single form of medical imaging we know of (at least for now) has two major components — an energy or radiation source, along with a system to detect it.”
For now, let’s take light as an example — think of light as a combination of quadrillions of particles (photons) moving at…well, the speed of light. Now, even though the speed of light stays constant (in most situations), the energy each photon carries can vary a lot.
The higher a photon’s frequency (how quickly its electromagnetic field oscillates), the more energy it carries, and vice-versa. Photons at different frequencies carry different amounts of energy and have unique properties. You can visualize all of a photon’s possible energy states using a handy-dandy tool called the electromagnetic (EM) spectrum:
As you can see, forms of electromagnetic radiation like X-rays and gamma rays are just higher-energy versions of the same light that might come out of your nightlamp, while microwaves and radio waves sit at the lower-energy end of the spectrum.
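To put numbers to that idea, here’s a quick sketch using Planck’s relation (E = hc/λ), a standard physics formula. The example wavelengths are just illustrative:

```python
# Photon energy from wavelength via Planck's relation: E = h*c / wavelength
PLANCK_H = 6.626e-34          # Planck's constant, J*s
LIGHT_C = 2.998e8             # speed of light in vacuum, m/s
EV_PER_JOULE = 1 / 1.602e-19  # conversion factor from joules to electron-volts

def photon_energy_ev(wavelength_nm: float) -> float:
    """Energy of a single photon, in electron-volts."""
    wavelength_m = wavelength_nm * 1e-9
    return PLANCK_H * LIGHT_C / wavelength_m * EV_PER_JOULE

# Shorter wavelength means higher frequency means a more energetic photon
for name, wl in [("X-ray", 1.0), ("visible red", 650.0), ("microwave", 1e7)]:
    print(f"{name:12s} ({wl:>10.1f} nm): {photon_energy_ev(wl):.2e} eV")
```

Run it and you’ll see the X-ray photon carries thousands of times the energy of a visible one, which is exactly why the two behave so differently in our bodies.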
And it’s basically the same story for sound-based medical imaging methods like ultrasound — it just uses pressure waves (which is all sound really is) vibrating too quickly for us to hear, and reconstructs images of our bodies from them.
Now, both sound and light move at a pretty stable rate through air, but they travel (propagate) at slightly different speeds through different materials. Our bodies aren’t perfect transmitters either, so the waves that pass through us get absorbed and scattered, losing a bit of energy along the way.
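Here’s what those propagation speeds look like in practice. The values below are textbook ballpark figures, and the helper just computes how long a pressure wave born a few centimetres deep would take to reach the surface:

```python
# Approximate speed of sound in different media (ballpark textbook values)
SOUND_SPEED_M_S = {
    "air": 343.0,
    "water": 1480.0,
    "soft tissue": 1540.0,
    "bone": 3500.0,
}

def travel_time_us(depth_cm: float, medium: str) -> float:
    """One-way travel time (microseconds) for a pressure wave from a source at depth_cm."""
    return depth_cm * 1e-2 / SOUND_SPEED_M_S[medium] * 1e6

# A wave starting 3 cm deep in soft tissue reaches the surface in ~19.5 microseconds
for medium in SOUND_SPEED_M_S:
    print(f"{medium:12s}: {travel_time_us(3.0, medium):7.2f} us")
```

Those tiny timing differences between materials are exactly the kind of signal imaging algorithms feed on.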
Remember, we’re trying to image our bodies — not solid rocks. After all, if our bodies were uniform (homogeneous) all over, medical imaging wouldn’t really be that useful in the first place.
Fortunately, our bodies are chock full of organs, skin, tissue, and muscles! And you know what that means? Every single one of those elements in our bodies would react to the energy in its own way.
Then, you measure how that energy gets transferred around, and we do that through sensors. Whether it’s a piezoelectric crystal or a photodiode array converting those quadrillions of photons into an electric signal, medical imaging would be impossible without them:
All that’s left is for a computer to use (deviously complex) algorithms to convert signals from our sensors into an image. Actually, another way to think about it would be that the computer assigns different shades or colours to each piece of data it receives from the sensors — allowing us to visualize every part of our bodies as separate pieces.
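That shade-assignment idea only takes a few lines to sketch. The readings below are made-up numbers; the point is normalizing raw sensor values into 8-bit grayscale pixels:

```python
import numpy as np

# Made-up raw sensor readings on a tiny 2x2 grid
readings = np.array([[0.2, 1.7],
                     [3.1, 0.9]])

# Normalize to the 0-255 grayscale range: weakest signal -> black, strongest -> white
lo, hi = readings.min(), readings.max()
pixels = ((readings - lo) / (hi - lo) * 255).astype(np.uint8)
print(pixels)
```

Real reconstruction algorithms do far more than rescale, but at the very end, this kind of mapping is how numbers become an image you can look at.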
So, zooming out to diagnostic imaging as a whole — it’s really as simple as emitting some form of acoustic or EM radiation into our bodies, recording what passes through (or reflects back) with precise sensors, and translating that data into an image. No biggie.
“Wait,” you might ask, “so how does all of this fit in with photoacoustic imaging?” Amazing question!
Hearing Light — Then Seeing It Again
Now that we know (quite a bit) about light — let’s get right into lasers! Think of lasers (Light Amplification by Stimulated Emission of Radiation) as beams of photons that have been energized, synchronized, and converged down to a tiny focal point. Other than that, lasers emit the same kind of photons as your run-of-the-mill LED — or even a candle!
The reason you’ll see lasers used so often in photonics (and especially in photoacoustics) is that they’re focused. Again, leveraging photoacoustics means trying to produce a sound wave from light — but only lasers have a chance of being strong enough to heat our bodies up fast enough to do that:
But as it turns out, lasers that powerful can actually be a bit harmful (they’ll burn you to the bone). In photoacoustics, we fire the beam in rapid pulses (each lasting just a few nanoseconds) — that maintains the photoacoustic effect without…having to meet the secret service.
So when it comes to the process — you can probably guess how photoacoustic imaging works. It’s all in the magic of the photoacoustic effect! First, take any part of the body you want to analyze (or the entire body), and pulse it with a specific pattern of laser pulses:
Generally, these patterns really depend on the type of image you’re trying to construct — 2D would be different from 3D, which would be different from 4D (yup, that exists). Let’s say you were trying to take a 2D image of your arm from a top-view.
You would pulse lasers around your arm at a whole bunch of points along one plane and let an algorithm reconstruct the rest. If it was 3D, you would need to take a ginormous amount of cross-sectional scans and ‘stitch’ them together with another algorithm. The same goes for 4D imaging (which is just a timelapse of hundreds of 3D images).
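The ‘stitching’ step is easy to picture in code. This sketch fakes the 2D slices with random data (a stand-in for real reconstructions) and stacks them into 3D and 4D arrays:

```python
import numpy as np

# Hypothetical stand-in for one reconstructed 2D cross-section (64x64 pixels)
def reconstruct_slice(z_index: int, size: int = 64) -> np.ndarray:
    rng = np.random.default_rng(z_index)   # placeholder "scan" data
    return rng.random((size, size))

# 3D imaging: stack many cross-sectional scans along a new depth axis
slices = [reconstruct_slice(z) for z in range(100)]
volume = np.stack(slices, axis=0)          # shape: (depth, height, width)
print(volume.shape)                        # (100, 64, 64)

# 4D imaging: a time series of 3D volumes (two frames here for brevity)
frames = np.stack([volume, volume], axis=0)  # shape: (time, depth, height, width)
print(frames.shape)                          # (2, 100, 64, 64)
```

The real work hides inside producing each slice, of course — the stacking itself is the easy part.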
We’re just missing a single link — the acoustic part of the photoacoustic process. Remember, all that light our bodies absorb makes them expand ever so slightly and ever so quickly — but that’s all we need to create ultrasound. The only question left is how you detect it.
When our bodies heat up from light and expand (AKA thermoelastic expansion), they release pressure waves into the surrounding tissue — and that’s all sound is. But since those waves vibrate far too quickly for us to hear, optoacoustic imagers use sensors built to detect ultrasound, known as ultrasound transducers.
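For a sense of scale, the standard back-of-envelope relation for the initial photoacoustic pressure is p0 = Γ·μa·F (the Grüneisen parameter times the absorption coefficient times the laser fluence). The numbers below are typical ballpark values, not measurements:

```python
# Back-of-envelope photoacoustic initial pressure: p0 = Gamma * mu_a * F
# (the three numbers below are typical ballpark values for soft tissue)
GRUNEISEN = 0.2        # Grueneisen parameter of soft tissue (dimensionless)
MU_A_PER_CM = 1.0      # optical absorption coefficient of the target, 1/cm
FLUENCE_J_CM2 = 0.01   # laser fluence per pulse, J/cm^2 (i.e. 10 mJ/cm^2)

p0_j_cm3 = GRUNEISEN * MU_A_PER_CM * FLUENCE_J_CM2  # pressure in J/cm^3
p0_kpa = p0_j_cm3 * 1e6 / 1e3                       # 1 J/cm^3 equals 1 MPa; convert to kPa
print(f"initial pressure: {p0_kpa:.1f} kPa")        # a tiny but detectable pressure blip
```

A couple of kilopascals is a whisper compared to atmospheric pressure (~101 kPa), which is exactly why the sensors in the next section need to be so sensitive.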
Those sensors have to be built for the job — if not, they could register any sound in the room as an acoustic signal, rather than just the pressure waves that propagate from the laser pulses. Not to mention, those sensors need to be pretty sensitive to detect something so tiny.
Most of the time, the ultrasound sensors used in a photoacoustic setup are of a type called two-way piezoelectric transducers. They sound complex (scientists tend to do that a lot), but they’re basically just a bunch of microscopic crystals that respond to a range of vibrations — including sound.
The reason they’re two-way sensors is that they function in two ways — when exposed to pressure waves, the crystal produces an electric signal, but when exposed to current, the crystals produce ultrasound of their own:
And finally, it’s a computer’s job to take thousands of those electrical signals, traced back to thousands of different chromophores, and reconstruct them into a coherent image. Now, while we have dozens of algorithms to do this — most of which involve AI and some HEAVY calculus — they all rely on the principle of triangulation:
By comparing when and where ultrasound signals arrive at each sensor, algorithms make sense of where individual parts of our body are located, and they repeat the same process for progressively smaller features to add detail.
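One of the simplest reconstruction algorithms built on that idea is delay-and-sum: for every candidate pixel, add up what each sensor recorded at that pixel’s expected time of flight. This toy sketch (all geometry and sampling numbers are made up) plants one source and recovers its position:

```python
import numpy as np

# Toy delay-and-sum reconstruction: one point source, a line of sensors.
C = 1540.0   # speed of sound in soft tissue, m/s
FS = 20e6    # sensor sampling rate, Hz (made-up but realistic)
sensors = np.stack([np.linspace(-0.02, 0.02, 32), np.zeros(32)], axis=1)  # 32 sensors on a line
source = np.array([0.005, 0.015])  # true source position in metres

# Simulate the recordings: each sensor hears a short blip delayed by distance / C
n_samples = 1200
signals = np.zeros((len(sensors), n_samples))
for i, s in enumerate(sensors):
    t = np.linalg.norm(source - s) / C
    signals[i, int(round(t * FS))] = 1.0

# Delay-and-sum: at each candidate pixel, sum what every sensor recorded at
# that pixel's expected time of flight; the true source sums coherently.
xs = np.linspace(-0.02, 0.02, 81)
zs = np.linspace(0.005, 0.03, 51)
image = np.zeros((len(zs), len(xs)))
for iz, z in enumerate(zs):
    for ix, x in enumerate(xs):
        delays = np.linalg.norm(sensors - np.array([x, z]), axis=1) / C
        idx = np.round(delays * FS).astype(int)
        image[iz, ix] = signals[np.arange(len(sensors)), idx].sum()

iz, ix = np.unravel_index(np.argmax(image), image.shape)
print(f"brightest pixel at x={xs[ix]*1000:.1f} mm, z={zs[iz]*1000:.1f} mm")
```

The brightest pixel lands right on the planted source: only there do all 32 delayed signals line up and add together. Real scanners use far fancier variants (and far noisier data), but this is the skeleton.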
That’s all there is to it! While the details may be pretty complex, you learned about the overall process of imaging the body. Honestly, with this article alone, you have all the steps you would need if you wanted to build a photoacoustic imager of your very own!
But what’s even more promising are the applications. Now that you know the gist of photoacoustics, it’s not that hard to guess some of the ways people might already be using it to change lives.
The most obvious one’s probably how versatile photoacoustic imaging is. Even though a well-built imager might run you upwards of $100k (USD), it has the potential to replace almost every single form of medical imaging we have today.
When it comes to our bodies, emitting a laser beam (usually near the infrared end of the spectrum) only stimulates compounds (chromophores) or tissue that respond to that wavelength. When you shine light of a known wavelength at something, only chromophores that absorb at that wavelength are going to expand from the photoacoustic effect.
Imagine what you could do if you tuned the wavelength of the device to the responsive wavelength of a part of your body like your nerves or muscles — you’d be able to isolate just those parts in an image!
Or even better, you could inject a chemical into a patient’s body that responds to a specific wavelength and detect diseases from there. That alone could make PET scans obsolete — a technique known to be expensive and unsafe over the long term because of its use of radioactive substances.
Not to mention how photoacoustic imaging isn’t limited to black and white like X-ray or CT — it can create images in full colour to highlight features like the concentration of oxygenated hemoglobin in your blood.
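Here’s how that trick works in miniature. This sketch ‘unmixes’ oxy- and deoxy-hemoglobin from absorption measured at two wavelengths; the extinction coefficients are rough illustrative values, not tabulated data:

```python
import numpy as np

# Toy two-wavelength spectral unmixing for blood oxygenation (sO2).
# Extinction coefficients are rough illustrative values (cm^-1 per mol/L);
# real work uses tabulated oxy-/deoxy-hemoglobin spectra.
#               HbO2     Hb
E = np.array([[ 518.0, 1405.0],   # at 750 nm: deoxy-Hb absorbs more
              [1058.0,  691.0]])  # at 850 nm: oxy-Hb absorbs more

# Pretend these are the absorption coefficients inferred from photoacoustic
# signal amplitudes at the two wavelengths (made-up concentrations):
true_c = np.array([90e-6, 10e-6])  # mol/L of [HbO2, Hb], i.e. 90% oxygenated
mu_a = E @ true_c                  # forward model: mu_a(lambda) = E * c

# Unmixing is just solving the 2x2 linear system for the concentrations
c_hbo2, c_hb = np.linalg.solve(E, mu_a)
so2 = c_hbo2 / (c_hbo2 + c_hb)
print(f"estimated sO2: {so2:.0%}")  # recovers the 90% we planted
```

Because the two pigments absorb so differently at the two wavelengths, the system is well-conditioned and the oxygenation falls right out, no contrast agent or radiation required.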
Those pieces of information could be vital to locating diseases like cancer or heart problems where every other imaging tool would fail. That’s revolutionary:
Currently, X-ray, CT (a CAT scan is the same thing), and PET scans face similar problems of speed, safety, and precision. Those are problems photoacoustics doesn’t even face — and we haven’t even seen a large body of research on the technology yet.
But notice how I said that this has potential — photoacoustic imagers aren’t something hospitals use today — at least not as a way to actually diagnose diseases. Even if photoacoustics was the best medical technology in the world, there wouldn’t be any use if nobody saw its value, or just stuck with the status quo anyway.
Out of everything we’ve seen so far, it’s pretty crazy to realize just how tiny what we’ve talked about is in the grand scheme of photonics. It’s a subset of a subset of a subset of something massive.
But in the end, everything you’ve needed to see to get started is in front of you. Photoacoustics could change everything — not just in medicine, but the entire world as we know it. The question is:
“What are you going to do about it?”
Thanks for reading, and stay safe,
**Thanks for reading! If you see the potential I see in photoacoustics (or photonics in general), leave a private note on this article and I’ll reply as soon as I can — I’d love to have a discussion about our thoughts and ideas 👋**