What is color science and why does it matter?


Color science is the latest term to hit the internet, thrown around by anyone who has an opinion on cameras. These days, you can't get through any camera review, whether of an actual camera or of a smartphone, without someone mentioning how good or bad the color science is.

So what exactly is the color science of a camera and is it really as important as thecoolcameraguy69 says it is in some YouTube comment you saw online? Let's find out.

What is color?

Before we find out what color science is, we must learn what color is and how it comes about in a digital image. Color (or chroma or chrominance) is one component of every digital image; the other is luminance.

To capture color information, digital cameras usually use a color filter array (CFA) laid on top of the sensor. The most popular CFA is the Bayer filter, a 2x2 mosaic consisting of one red, one blue, and two green filters. The Bayer filter uses two green filters for luminance information, as our eyes are more sensitive to luminance and to the color green.

Bayer color filter

The Bayer CFA filters the light that passes through to the sensor, so each photosite (pixel) on the sensor records only one color value - red, blue, or green. This results in an incomplete picture, which is then completed using a process called demosaicing or debayering. In this process, an algorithm interpolates the missing color values of each pixel using information from neighboring pixels, creating a complete image.

This may seem like a bit of a hack, but that's because it is. Essentially, pretty much every digital image you've captured contains only one-third actual color information; the other two-thirds are interpolated, or guessed, to fill in the missing colors.
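To make that interpolation step more concrete, here is a minimal sketch of bilinear demosaicing over an RGGB Bayer mosaic, written in Python with NumPy. It is purely illustrative: the function name and the simple 3x3 averaging are my own simplification, and real camera pipelines use far more sophisticated, edge-aware algorithms.

```python
import numpy as np

def demosaic_bilinear(raw):
    """Very rough bilinear demosaic of an RGGB Bayer mosaic.

    raw: 2D array of sensor values, one color sample per photosite.
    Returns an H x W x 3 RGB image where the two missing channels at
    every pixel are filled in by averaging the known neighbors.
    """
    h, w = raw.shape
    rgb = np.zeros((h, w, 3), dtype=np.float32)

    # Masks marking which photosites carry R, G and B in an RGGB layout.
    r_mask = np.zeros((h, w), bool); r_mask[0::2, 0::2] = True
    b_mask = np.zeros((h, w), bool); b_mask[1::2, 1::2] = True
    g_mask = ~(r_mask | b_mask)

    for ch, mask in enumerate((r_mask, g_mask, b_mask)):
        plane = np.where(mask, raw, 0.0).astype(np.float32)
        counts = mask.astype(np.float32)
        # Average the known samples in each 3x3 neighborhood; this is the
        # "guess from the neighbors" step described above. Edges wrap
        # around with np.roll, which is good enough for a sketch.
        neighbor_sum = sum(
            np.roll(np.roll(plane, dy, 0), dx, 1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        )
        neighbor_cnt = sum(
            np.roll(np.roll(counts, dy, 0), dx, 1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        )
        rgb[..., ch] = np.where(
            mask, raw, neighbor_sum / np.maximum(neighbor_cnt, 1.0)
        )

    return rgb
```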

Between capturing the raw data and saving the image in a JPEG file, the camera software puts in a lot of work to render the final image from the various bits of data it receives. This is where color science comes in.

What is color science?

The color science of a camera is a catch-all term for how the camera software chooses to render the colors in the final image from the information it originally captured. Now, colors are only one aspect of the image but it's usually what decides the look of the image and what gives a particular camera its personality.

Part of the color science process is in the demosaicing that I mentioned earlier. A lot of time and effort goes into making a demosaicing algorithm that correctly guesses the right color value for each pixel in the image. One of the reasons images from modern digital cameras look better and have fewer artifacts than those from older cameras is the advancement of demosaicing algorithms.

Color wheel

Figuring out the color value of each pixel is only one part of the equation. The camera also has to adjust the white balance to remove any color cast that may inadvertently be captured due to the lighting. This is usually done by adjusting the color balance along two axes: blue-orange and green-magenta. Blue and orange sit on opposite sides of the color wheel, as do green and magenta, and together they form the four spokes of the color wheel. Adjusting the blue-orange level also makes the image look cooler or warmer to the eye, which is why it's called color temperature adjustment.
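As a rough illustration of that blue-orange adjustment, here is a minimal "gray world" white balance sketch in Python/NumPy. The assumption that the scene should average out to neutral gray is just one simple heuristic of my choosing; real cameras combine much smarter ones, but the mechanism of scaling the red and blue channels relative to green is the same.

```python
import numpy as np

def gray_world_white_balance(rgb):
    """Crude gray-world white balance for an 8-bit H x W x 3 image:
    assume the average color of the scene should be neutral, and scale
    the red and blue channels so their means match the green channel."""
    rgb = rgb.astype(np.float32)
    means = rgb.reshape(-1, 3).mean(axis=0)        # per-channel average
    gains = means[1] / np.maximum(means, 1e-6)     # normalize to green
    gains[1] = 1.0                                 # leave green untouched
    return np.clip(rgb * gains, 0, 255)
```

Tuning those red and blue gains slightly away from neutral, rather than exactly to it, is how a camera ends up with a consistently warmer or cooler default rendering.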

How a particular camera adjusts this depends on how the software is configured. Some cameras are designed to set the white balance accurately for every scene, while others are partial to a warmer or cooler tone. Companies often set their cameras to prefer a warmer color tone, as it generally makes images look more pleasing. In smartphones, this is easy to see in iPhone photos, which have a very distinctive bias towards the orange-green part of the color wheel, making skin tones pop and colors generally look warmer.

Typical warm iPhone image

Lastly, there are the usual picture settings, which include color saturation, brightness/gamma, sharpness, contrast, and tone-mapping. Each of these contributes to the particular look and personality of the final image.
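To show how few numbers are actually involved, here is a hedged sketch of applying a gamma / contrast / saturation "look" to an image in Python/NumPy. The parameter values and the Rec. 709 luma weights are illustrative assumptions, not any manufacturer's actual tuning.

```python
import numpy as np

def apply_look(rgb, gamma=2.2, contrast=1.0, saturation=1.0):
    """Apply a simple gamma / contrast / saturation look to an RGB image
    with values in [0, 1]. Values above 1.0 for contrast and saturation
    make the image punchier; values below flatten it."""
    x = np.clip(rgb.astype(np.float32), 0.0, 1.0)

    x = x ** (1.0 / gamma)                            # brightness response
    x = np.clip((x - 0.5) * contrast + 0.5, 0, 1)     # contrast around mid-gray

    # Saturation: push each pixel away from (or toward) its own gray value.
    luma = x @ np.array([0.2126, 0.7152, 0.0722], dtype=np.float32)
    x = np.clip(luma[..., None] + (x - luma[..., None]) * saturation, 0, 1)
    return x
```

Bumping saturation and contrast up gives the punchy look described below, while pulling them toward 1.0 with a gentler gamma yields a flatter, more natural rendering.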

For example, Samsung phones in the past tended to bump up the saturation and contrast of their images, although in recent times Samsung has veered towards increased brightness instead. Huawei cameras have overly exaggerated contrast and sharpness, which makes images pop a bit on the phone screen but rather unpleasant on a big screen. Pixel phones have a very distinctive gamma and contrast curve along with aggressive HDR+ tone-mapping: images are generally underexposed to preserve highlight detail, with increased saturation and contrast and a cooler color temperature. This creates the very appealing Pixel look people are so fond of but can't always explain why they like.

Typical Pixel image

Meanwhile, Apple generally chooses conservative contrast, gamma, and sharpness values for its images. This is why many people consider iPhone images to look comparatively more natural, although that has changed a bit in recent times, as the iPhones launched in the past couple of years have more aggressive saturation and contrast along with the aforementioned warm color tone. The 2018 iPhones also have aggressive tone-mapping due to the Smart HDR feature.

Some newer smartphones also use artificial intelligence to apply color settings on a scene-by-scene basis. While phones previously applied a preset color profile to every image, many newer phones, such as those from Huawei and LG, can detect the subject of the scene and adjust the saturation, contrast, and sharpness curves to suit that particular subject. They can also selectively boost colors from a particular part of the spectrum, such as green for predominantly plant-based subjects, orange for food, and blue for daylight landscapes, and apply more aggressive tone-mapping for backlit or high-contrast subjects.

Comically saturated Master AI mode on the P20 Pro
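Conceptually, this scene-aware tuning boils down to a lookup from a detected scene label to a set of adjustments. The sketch below is entirely hypothetical; the labels and values are made up for illustration and do not come from any shipping camera firmware.

```python
# Hypothetical scene-to-look table; every label and number here is invented.
SCENE_PRESETS = {
    "foliage":   {"saturation": 1.3,  "contrast": 1.1,  "boost_channel": "green"},
    "food":      {"saturation": 1.2,  "contrast": 1.0,  "boost_channel": "orange"},
    "landscape": {"saturation": 1.15, "contrast": 1.05, "boost_channel": "blue"},
    "backlit":   {"saturation": 1.0,  "contrast": 0.9,  "tone_map": "aggressive"},
}

def pick_preset(scene_label):
    """Fall back to a neutral look when the scene classifier is unsure."""
    return SCENE_PRESETS.get(scene_label, {"saturation": 1.0, "contrast": 1.0})
```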

As you can see, you can pretty much break down the overall final look of any smartphone's image to a set of parameters. Once you understand the different parameters at play, you can figure out what each manufacturer is increasing or decreasing to get to that specific look that is typical of their brand. A lot of the big brands have now settled on a particular look that is easily identifiable while many of the new brands can still be seen flip-flopping from one look to another between product launches.

I should mention that the camera software makes many more adjustments to the final image beyond what's covered in this article. Exposure compensation, noise reduction, lens correction, and stabilization also go into the final image, but they are outside the purview of this article.

How much does color science matter?

So now that we know what color science generally refers to, to what extent does it really matter?

Actually, quite a bit, especially on smartphones. The thing about a camera's color science is that it only comes into effect when shooting JPEG or recording video. If you shoot in RAW, the images are essentially unprocessed and you can make them look however you want. This is why talk of color science with proper cameras is largely meaningless: you'd usually be shooting in RAW (as you should), bypassing all the color science the manufacturer has baked in.
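As a small example of what bypassing the baked-in color science looks like in practice, here is how you might develop the same RAW file two different ways using the rawpy library in Python. The file name and white balance gains are arbitrary placeholders.

```python
import rawpy

# Two renderings of the same capture; the camera's JPEG pipeline never runs.
with rawpy.imread("shot.dng") as raw:               # placeholder file name
    as_shot = raw.postprocess(use_camera_wb=True)   # keep the camera's white balance
    warmer = raw.postprocess(                       # or pick your own R, G, B, G gains
        user_wb=[2.4, 1.0, 1.4, 1.0],
        no_auto_bright=True,
    )
```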

However, on smartphones, we usually shoot in JPEG, so the color science of a given smartphone manufacturer really matters. With JPEG, you are handed a fully baked image that is compressed and lacks the information needed to make any drastic changes in post, so you had better be fond of the images coming straight out of the camera.


Besides, the photography workflow on a smartphone is completely different from that of a DSLR or any proper camera. You usually shoot the image and upload it instantly to a messaging app or social network, often with no adjustments. People who shoot on smartphones are also generally photography novices. As such, a good out-of-the-box image matters more to smartphone photographers than to DSLR users.

This is why images from iPhones and Pixels are generally sought after. Both are capable of churning out excellent quality JPEGs straight out of the camera that require minimal adjustment and are ready for sharing. It's also why Sony smartphone images have been blasted in the past: their color science was rather poor.

As I said, the color science of a camera only comes into the picture (geddit?) if you shoot in JPEG; if you shoot in RAW, you can make images from different phones look pretty much identical. However, RAW is simply not a practical workflow for smartphones, and not something most people are interested in (no matter how much Sony fans tell you to use the manual mode). If you have the opportunity, time, and patience to shoot in manual mode and then process the RAW file on your computer, then you are wasting your time doing it on a smartphone instead of on a proper camera. It's like choosing to go hunting with a Swiss army knife.

So, color science is, like, super important, then?

Well, yes and no. Color is one part of the image, but there are other things to consider, such as sensor and lens quality. A good sensor will capture more detail with less noise in both the shadows and the highlights, and a good lens will produce sharper results with less distortion. These two elements contribute more to the final image than any color science. It's why you can take the RAW file and often come up with a better image than what the manufacturer could produce in the JPEG, provided you have the knowledge, skill, and patience.

So the primary concern should always be to get a camera with a good sensor and lens, which, thankfully, is becoming quite common these days. What isn't common, however, is the good taste that goes into color science.

What separates the images of a good camera from a subjectively bad one these days is quite often the color science. A lot of phones use basically the same Sony sensor, or a close variant of it, yet you see a notable difference in image quality between, say, Google and LG cameras. Color science is as much art as it is science, and as with any art, a good dose of taste is involved in getting the various parameters just right to produce an image that's universally appreciated. Of course, there is no perfect image, but you can still come up with something that most people, if not everyone, will appreciate.


Also, it must be said that good color science is not the same as accurate colors, and vice versa. Accurate colors are great to have if that's what you want, and they are essential on high-end cameras used for professional client work. However, most people shooting on smartphones don't want accurate colors, as the images are mostly for casual viewing and sharing. An accurate image leaves more leeway for manual editing, but people generally prefer images that look good straight from the camera and are ready for immediate sharing. This is why even iPhone images these days don't look as natural as they used to.

So a more natural image doesn't mean better color science, nor is a more vibrant image necessarily bad. Color science is very subjective and there really isn't a definitive good or bad (within reason). It's a matter of taste, and some may prefer one particular look over another. Over the years, manufacturers have learned what most people like, and that's what you usually see on phones these days, although some are still learning, which is why you see the occasional misstep.

Now that you know what color science generally refers to, which smartphone brand do you think has the best image processing right now and which do you think needs the most improvement? Let us know in the comments below.

