Smartphone camera sensors explained – PhoneArena

Today's smartphones are so powerful that they could not only have launched every Apollo mission to the Moon, but also taken every photo and video from the Moon's surface and posted them on Instagram.

All joking aside, the technology behind modern smartphone cameras is so advanced, from the sensors to all the software algorithms, that one can use a smartphone to capture images and videos for almost any scenario imaginable.

There is an entire list of films shot entirely on smartphones, and many photography exhibitions and competitions also rely on the smartphone as their primary tool. These exhibits not only feature some well-known names in the industry, but the work itself is genuinely impressive.

What is behind all this? What makes smartphones good enough, even for professional photographers and filmmakers? What is a CMOS sensor and what exactly is a dual-layer transistor-pixel design? Today we will answer these questions.

A Brief History of the Camera Phone

We're not going to bore you with lots of historical details, but just for context: the first patent for a camera phone dates back to 1994, when four Nokia employees set out to create a phone with an integrated camera.

Oddly enough, the first camera phone wasn't made by Nokia. It was the Japanese Kyocera VP-210, called the Visual Phone, released in May 1999. In the intervening 25 years, things have evolved rapidly, and we now have 1-inch smartphone sensors, stacked sensors, dual-layer transistor-pixel designs, and more.

But what hasn't changed is the basic technology that captures light and turns it into an electrical signal to store it in your phone's memory or display it on the screen.

What is a CMOS sensor?

This technology was invented in 1968, and the abbreviation stands for complementary metal-oxide-semiconductor. It's not as complicated as it seems: photodiodes capture light and transform it into an electrical signal, and transistors then amplify that signal.

These are also called APS, or active pixel sensors, because each photodiode has a separate transistor that actively amplifies the signal. So we have a lens focusing the light, a photodiode capturing the light's photons and converting them into electrons (an electrical signal), and transistors amplifying this signal and sending it to the ISP (image signal processor) chip for further processing.
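To make that signal chain concrete, here is a minimal Python sketch of what a single active pixel does conceptually. The quantum efficiency, full-well capacity, and ADC bit depth are assumed, purely illustrative values rather than the specs of any real sensor:

```python
import random

# Illustrative constants (assumed values, not real sensor specs)
QUANTUM_EFFICIENCY = 0.6  # fraction of incident photons converted to electrons
FULL_WELL = 6000          # electrons the photodiode holds before saturating
ADC_BITS = 10             # bit depth of the analog-to-digital converter

def read_pixel(incident_photons: int) -> int:
    """Model one active pixel: photons -> electrons -> digital value for the ISP."""
    # Photodiode: each photon becomes an electron with probability QE
    electrons = sum(1 for _ in range(incident_photons)
                    if random.random() < QUANTUM_EFFICIENCY)
    # The photodiode saturates once its full-well capacity is reached
    electrons = min(electrons, FULL_WELL)
    # Per-pixel transistor and ADC: amplify and digitize the charge
    max_code = (1 << ADC_BITS) - 1
    return round(electrons / FULL_WELL * max_code)

print(read_pixel(3000))  # typically around 307 on a 0-1023 scale
```

Real pixels do all of this in analog silicon, of course; the point is simply the photons-to-electrons-to-digital-number chain that every CMOS sensor implements.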

So far, so good. Early CMOS sensors had their metal circuitry between the photodiode and the lens, obstructing the light and making the image darker. These were known as FSI (front-side illumination) sensors.

The next generation of CMOS sensors moved the photodiode above the wiring, directly under the lens, allowing more light to reach the diode. This design is known as BSI (back-side illumination). The iPhone 4 was one of the first phones with this type of sensor.
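As a rough back-of-the-envelope illustration of why the move to BSI mattered, the sketch below compares how much light reaches the photodiode under the two layouts. The transmission fractions are made-up assumptions, chosen only to show the direction of the effect:

```python
# Toy comparison of front-side (FSI) vs back-side (BSI) illumination.
# Transmission fractions are assumptions for illustration only.
incident_photons = 10_000

fsi_transmission = 0.60  # metal wiring above the photodiode blocks some light
bsi_transmission = 0.90  # wiring moved below the photodiode obstructs far less

fsi_captured = incident_photons * fsi_transmission
bsi_captured = incident_photons * bsi_transmission

print(f"FSI: {fsi_captured:.0f} photons reach the photodiode")
print(f"BSI: {bsi_captured:.0f} photons reach the photodiode")
print(f"BSI advantage: {bsi_captured / fsi_captured:.2f}x more light")
```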

Invention of the stacked CMOS sensor

Around 2008, Taku Umebayashi, an engineer at Sony Semiconductor Solutions Corporation, decided he could improve on this design and began working on a stacked CMOS sensor.

The idea was to completely separate the circuit section from the photodiode. Before that, the circuit was wrapped around the photodiode, taking up valuable space and forcing the photodiode to be smaller. The smaller the photodiode, the less light it can capture, resulting in reduced sensitivity to low light and also more noise from the circuitry around it.
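Under the standard shot-noise model, a pixel's signal-to-noise ratio grows with the square root of the photons it collects, so a photodiode squeezed by surrounding circuitry pays a direct noise penalty. Here is a small sketch of that relationship; the photon count and fill factors are assumed for illustration:

```python
import math

# Shot-noise model: SNR = N / sqrt(N) = sqrt(N), with N electrons collected.
photons_on_pixel = 5000  # photons falling on the whole pixel area (assumed)

for label, fill_factor in [("circuitry wrapped around the photodiode", 0.4),
                           ("stacked design, larger photodiode", 0.8)]:
    collected = photons_on_pixel * fill_factor
    snr_db = 20 * math.log10(math.sqrt(collected))
    print(f"{label}: {collected:.0f} photons, SNR ~ {snr_db:.1f} dB")
```

Doubling the collected light buys roughly 3 dB of SNR, which is exactly the kind of headroom a larger photodiode provides.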

The first commercially available stacked CMOS sensor was introduced by Sony in 2012 and revolutionized digital photography. Umebayashi received the 2016 National Invention Award from the Prime Minister of Japan and the 2020 Purple Ribbon Medal for his invention.

Another fun fact is that at the time the stacked CMOS sensor went into mass production, there were literally no customers for this technology. It was a huge gamble and Sony decided to put the stacked CMOS sensor in its Xperia Z range of smartphones in order to popularize it.

This gamble paid off. Soon after the launch of stacked CMOS, smartphone manufacturers recognized the superiority of the technology, and today most smartphones (including all iPhones) feature Sony image sensors. Sony is the absolute leader in CMOS sensor market share, with a huge 42% (2022 data), while Samsung trails far behind at 19%.

Dual-layer transistor-pixel design

Another buzzword, or should we say buzzphrase, was coined last year: the dual-layer transistor-pixel mobile sensor. Stacked CMOS sensors were good, but the transistors amplifying the signal still sat on the same layer as the photodiode, limiting the surface area available to capture light.

In 2021, Sony Semiconductor Solutions announced a new breakthrough in the field of CMOS: the world's first 2-layer transistor-pixel CMOS sensor. Engineers found a way to move the signal-amplifying transistors underneath the photodiode, leaving more surface area to capture light and improving parameters such as dynamic range, low-light sensitivity, and noise.

Here's an excerpt from Sony's site explaining the technology: “The 2-layer transistor pixel is the world's first stacked CMOS image sensor technology with a pixel structure that separates photodiodes and pixel transistors on different substrate layers, as opposed to the conventional style of having the two on the same substrate. This new structure has approximately double the saturation signal level compared to conventional image sensors, expands the dynamic range and reduces noise.”
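To see why “approximately double the saturation signal level” translates into more dynamic range, recall that dynamic range is commonly expressed as the ratio of the full-well (saturation) signal to the noise floor. A quick sketch of that arithmetic, using assumed figures rather than Sony's actual specifications:

```python
import math

def dynamic_range_db(full_well: float, noise_floor: float) -> float:
    """Dynamic range = 20 * log10(saturation signal / noise floor), in dB."""
    return 20 * math.log10(full_well / noise_floor)

# Assumed figures for illustration (not real sensor specs)
conventional = dynamic_range_db(full_well=6000, noise_floor=2.0)
dual_layer = dynamic_range_db(full_well=12000, noise_floor=2.0)

print(f"Conventional sensor: {conventional:.1f} dB")  # ~69.5 dB
print(f"Dual-layer sensor:   {dual_layer:.1f} dB")    # ~75.6 dB
```

Doubling the full-well capacity at the same noise floor adds about 6 dB, roughly one extra stop of highlight headroom.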

The technology, dubbed Exmor T, made its way to the Sony Xperia 1 V and subsequently to many other flagship phones (the Xiaomi 14 series, the OnePlus 12) under the rebranded Sony LYTIA sensor lineup.

Conclusion

As experts in the field of digital photography say, “the future is stacked!” This means that in the coming years we will see this technology in more and more flagship (and mid-range) phones. What happens next? Only time will tell!