A few months ago, I started working on an application called Eternal Minis. It’s a social network where warmongers can share their painted miniatures. If you don’t understand what I’m talking about, check out Wikipedia.
Anyway, the app allows users to add the paints they’ve used to a posted mini. This is very useful information if you want to learn how someone achieved a specific color effect. However, it can be a bit tedious to add all the colors you’ve used, especially when there are dozens of them! My idea was to help users by suggesting the first color. Maybe it’d help them get started, and they’d keep adding other paints after that.
My first idea was to take the RGB components of a color picked from the photo (using Palette), and then compute a distance between this color and the predefined paints included in the app. Even though this may sound like a good idea, a short distance between RGB components isn’t necessarily perceptually meaningful.
So I started to search for actual color comparison algorithms. What I found is that you can calculate a Euclidean distance between two colors in virtually any color format. So you can do the following with RGB components.
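As a sketch of that idea, here’s the Euclidean distance computed directly on RGB components (the class and method names, and the packed `0xRRGGBB` int format, are my own choices for illustration):

```java
// Sketch: Euclidean distance between two colors in RGB space.
// Colors are packed as 0xRRGGBB ints, like Android's Color class uses.
public class RgbDistance {
    static double distance(int rgb1, int rgb2) {
        // Unpack the red, green and blue channels of each color.
        int r1 = (rgb1 >> 16) & 0xFF, g1 = (rgb1 >> 8) & 0xFF, b1 = rgb1 & 0xFF;
        int r2 = (rgb2 >> 16) & 0xFF, g2 = (rgb2 >> 8) & 0xFF, b2 = rgb2 & 0xFF;
        // Straight-line distance in the RGB cube.
        int dr = r1 - r2, dg = g1 - g2, db = b1 - b2;
        return Math.sqrt(dr * dr + dg * dg + db * db);
    }
}
```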
However, RGB components, even if they’re great for computers, may not be very relevant to how humans perceive colors. Actually, if you take a look at the Wikipedia article on Color difference, you can read about the work of the International Commission on Illumination (CIE). The CIE76 formula for color distance looks like this:
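CIE76 defines the difference between two colors as the Euclidean distance between their Lab components:

```latex
\Delta E^*_{ab} = \sqrt{(L^*_2 - L^*_1)^2 + (a^*_2 - a^*_1)^2 + (b^*_2 - b^*_1)^2}
```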
You’ll notice that the color components are called L, a and b. That’s because CIE colors are defined in the Lab color space, which was designed to approximate how humans perceive colors.
So we can simply convert RGB colors to the Lab space, and then use the Euclidean distance to find a relevant paint color from a mini’s picture!
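To give an idea of what that conversion involves, here’s a sketch of the standard sRGB → XYZ → Lab path with a D65 white point (helper names are mine; as we’ll see below, Android already ships this conversion, so you wouldn’t write it by hand):

```java
// Sketch: converting an sRGB color to CIE Lab via the XYZ color space,
// using the standard D65 reference white.
public class RgbToLab {
    static double[] rgbToLab(int r, int g, int b) {
        // 1. Normalize to [0, 1] and undo the sRGB gamma curve.
        double rl = linearize(r / 255.0);
        double gl = linearize(g / 255.0);
        double bl = linearize(b / 255.0);
        // 2. Linear RGB -> XYZ (standard sRGB matrix).
        double x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl;
        double y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl;
        double z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl;
        // 3. XYZ -> Lab, relative to the D65 white point.
        double fx = f(x / 0.95047), fy = f(y / 1.0), fz = f(z / 1.08883);
        return new double[] { 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz) };
    }

    private static double linearize(double c) {
        return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
    }

    private static double f(double t) {
        return t > 0.008856 ? Math.cbrt(t) : 7.787 * t + 16.0 / 116.0;
    }
}
```

As a sanity check, pure white (255, 255, 255) should land very close to L = 100, a = 0, b = 0, and pure black at L = 0.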
Make the robots work!
As an Android developer, the good news is: Google has already done the heavy lifting. The ColorUtils class has an RGBToLAB method (which fills an array with the Lab components), and a distanceEuclidean method as well.
So, all you have to do is pick a color using Palette, and make the comparison with ColorUtils.
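That comparison step can be sketched in plain Java like this. The paint colors below are made-up Lab values, not the app’s real catalog; on Android you’d get the Lab arrays from `ColorUtils.RGBToLAB` and could use `ColorUtils.distanceEuclidean` instead of the hand-rolled helper here:

```java
// Sketch: finding the predefined paint closest to a picked color,
// working entirely in Lab space with the CIE76 (Euclidean) distance.
public class PaintMatcher {
    static double distanceEuclidean(double[] lab1, double[] lab2) {
        double dl = lab1[0] - lab2[0];
        double da = lab1[1] - lab2[1];
        double db = lab1[2] - lab2[2];
        return Math.sqrt(dl * dl + da * da + db * db);
    }

    /** Returns the index of the paint with the smallest distance to pickedLab. */
    static int closestPaint(double[] pickedLab, double[][] paintLabs) {
        int best = 0;
        double bestDistance = Double.MAX_VALUE;
        for (int i = 0; i < paintLabs.length; i++) {
            double d = distanceEuclidean(pickedLab, paintLabs[i]);
            if (d < bestDistance) {
                bestDistance = d;
                best = i;
            }
        }
        return best;
    }
}
```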
What I actually do in Eternal Minis is loop over all vibrant colors in all swatches, and keep the color with the lowest distance to any paint color included in the app. Then I suggest that the user add the matching paint. And here’s the result:
So what’s next?
The approximation works fairly well, but there are a lot of things to consider beyond the color comparison. Photos are often taken with a white or black background, for example, so you don’t want to suggest that color to the user. There’s still a lot of work to do to make painters’ lives easier!