Plausible reconstruction of Instagram-like filters
People love Instagram filters. They try to replicate them again and again. And again and again. The problem with these attempts is that people usually hand-pick a color correction that only vaguely resembles what the original filters do. I found it much more interesting to try to reproduce the filters using more reliable methods and math. And it seems this is the only attempt to recreate the color filters really accurately.
For example, one of the following images was obtained by applying the Clarendon filter to the original image in Instagram itself, and the other by applying the reconstructed filter. Try to guess which one is the reconstruction.
For comparison, this is the result of applying the same filter from a commercial set of "Instagram-like filters", which is easy to find online:
The reconstruction is based on three-dimensional color lookup tables (3D color LUTs) and their two-dimensional representation, Hald images. The basic idea is very simple: a Hald image with a uniform color distribution is processed with the original filter whose color transformations need to be reproduced. The processed Hald image can then be used to approximate the color transformations of the original filter very accurately.
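To illustrate the idea (this is a minimal sketch of the principle, not the repository's actual code), the following numpy snippet builds an identity lookup table from a uniform color grid, simulates a "filter" by modifying the table, and then approximates that filter on arbitrary pixels with a nearest-neighbor lookup. The table size and the toy filter are my own assumptions.

```python
import numpy as np

def identity_lut(n=25):
    """Identity 3D LUT: lut[r, g, b] maps every color to itself."""
    v = np.linspace(0.0, 1.0, n)
    r, g, b = np.meshgrid(v, v, v, indexing="ij")
    return np.stack([r, g, b], axis=-1)  # shape (n, n, n, 3)

def apply_lut(pixels, lut):
    """Approximate a filter via nearest-neighbor LUT lookup.

    pixels: float array of shape (..., 3) with values in [0, 1].
    """
    n = lut.shape[0]
    idx = np.clip(np.rint(pixels * (n - 1)).astype(int), 0, n - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

# A hypothetical "filter": boost red, darken blue.  In the real workflow
# this table would come from the filtered Hald image instead.
lut = identity_lut()
lut[..., 0] = np.clip(lut[..., 0] * 1.2, 0.0, 1.0)
lut[..., 2] *= 0.8

img = np.array([[[0.5, 0.5, 0.5]]])  # one mid-gray pixel
print(apply_lut(img, lut))           # approximately [0.6, 0.5, 0.4]
```

A real implementation would interpolate between the nearest table entries rather than snapping to one, but the principle is the same.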
The resulting Hald image can be used in a variety of programs and libraries, such as GraphicsMagick or Photoshop. It can also be used in macOS and iOS applications via the CocoaLUT library. In addition, the Hald image can be converted to the .cube 3D LUT format, which is very common in video processing applications. And a small spoiler: 3D color LUT support will appear in the next version of Pillow, the imaging library for Python.
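For instance, with a Pillow version that includes 3D LUT support, a lookup table can be applied through `ImageFilter.Color3DLUT`. The toy "warming" transform below is my own example for illustration, not one of the reconstructed Instagram filters:

```python
from PIL import Image, ImageFilter

# Build a 3D LUT with Pillow's Color3DLUT.  The generate() callback
# receives (r, g, b) coordinates in [0, 1]; here a toy warming filter
# that boosts red and dims blue (my own illustrative transform).
lut = ImageFilter.Color3DLUT.generate(
    size=17,
    callback=lambda r, g, b: (min(r * 1.1, 1.0), g, b * 0.9),
)

img = Image.new("RGB", (4, 4), (128, 128, 128))
out = img.filter(lut)
print(out.getpixel((0, 0)))  # red raised, green unchanged, blue lowered
```

In the real workflow the table values would be filled from the filtered Hald image instead of a callback.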
This method can only capture uniform color transformations. Any scratches, gradients, vignetting, and other textures superimposed on the image will not make it into the reconstruction and will even interfere with it. The reconstruction is also not very faithful if the source filter behaves differently in different areas of the image.
Nevertheless, in my opinion, this method restores 32 of the 40 Instagram filters very faithfully (except for vignetting, which cannot easily be applied after processing) and, with varying success, gets close to the remaining eight.
To generate and convert Hald images, you will need a Python interpreter with pip.
$ git clone https://github.com/homm/color-filters-reconstruction.git
$ cd color-filters-reconstruction
$ pip install -r requirements.txt
Once you have the Hald images, you will no longer need any software from this repository, only a library that can apply them. This can be GraphicsMagick, which has bindings for most popular languages, including Python, Ruby, PHP, and JavaScript, as well as a command-line interface.
First you need to create an identity image. Just run:

$ ./bin/generate.py
You will see a file called hald.5.png. The number in the filename is the square root of the lookup table's side length; that is, 5 means the file contains a table of 25 × 25 × 25 elements.
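For reference, a plain identity Hald image (without the replication and 8 × 8 cells that hald.5.png adds) can be sketched in a few lines of numpy; saving it as a PNG is left out. The channel ordering below follows the common Hald CLUT convention (red varies fastest) and is my assumption:

```python
import numpy as np

def identity_hald(level=5):
    """Plain identity Hald image: level 5 gives a 125x125 image
    holding a 25 x 25 x 25 lookup table, one pixel per cell."""
    n = level * level                       # 25 values per channel
    v = np.rint(np.linspace(0, 255, n)).astype(np.uint8)
    # blue varies slowest, red fastest (common Hald CLUT convention)
    b, g, r = np.meshgrid(v, v, v, indexing="ij")
    rgb = np.stack([r, g, b], axis=-1).reshape(-1, 3)
    side = level ** 3                       # 125 pixels per side
    return rgb.reshape(side, side, 3)

hald = identity_hald()
print(hald.shape)  # (125, 125, 3)
```

The repository's own file pads and replicates this layout, as described next.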
This file differs slightly from the usual representation of Hald images. It replicates the table 4 times and adds margins. In addition, each table cell occupies not one pixel but an 8 × 8 pixel square. All this is done to withstand various distortions introduced both by the original filter and by JPEG compression.
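Conceptually, when such a file is read back, each 8 × 8 cell is reduced to a single averaged color, which is what makes the format robust to distortion. Here is a simplified sketch of that averaging, assuming one replica with 8 × 8 cells and ignoring the margins (the real script also reconciles the four replicas):

```python
import numpy as np

def cells_to_lut(processed, cell=8):
    """Collapse an image of constant cell x cell squares into one
    averaged color per cell, suppressing per-pixel noise."""
    h, w, _ = processed.shape
    blocks = processed.reshape(h // cell, cell, w // cell, cell, 3)
    return blocks.mean(axis=(1, 3))  # shape (h//cell, w//cell, 3)

# Toy check: a 16x16 image made of 8x8 constant cells averages
# back to exactly one color per cell.
demo = np.zeros((16, 16, 3))
demo[:8, 8:] = [1.0, 0.5, 0.25]
print(cells_to_lut(demo)[0, 1])  # [1.0, 0.5, 0.25]
```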
Process the identity image with the original filter. In the case of Instagram, the identity image needs to be transferred to a mobile device and published with the filter of interest applied. The processed image will then appear in your photo stream; download it back from the device.
It is important that the resolution of the resulting filtered image is exactly the same as that of the original identity image.
Convert the filtered image into a true Hald image:
$ ./bin/convert.py raw/1.Clarendon.jpg halds/
halds/ is the directory where the resulting filter will be written.
You are awesome! The resulting filter can immediately be applied to any other images.
$ gm convert sample.jpg -hald-clut halds/1.Clarendon.png out.jpeg
Although the default settings produce high-quality Hald filters in most cases, there are situations when this is not enough.
Some unwanted effects can appear if the source filter has strong local distortions or noticeable gradients in the center of the image. The most noticeable artifact is flat single-color banding. Here, for example, are the original image and the image obtained with the reconstructed Hudson filter, where these problems are most visible:
# Create a Hald image from the image processed by Instagram
$ ./bin/convert.py raw/15.Hudson.jpg halds/

# Apply the Hald image to another image
$ gm convert girl.jpg -hald-clut halds/15.Hudson.png girl.15.jpg
In the processed image, objects look flat and posterized: the face, the hair, the chairs in the background. Although posterization is a fairly common effect in image processing, it was not part of the original Hudson filter.
If you take a closer look at the identity image processed with the Hudson filter, you will notice that it is quite noisy. This is the source of the problem.
Fortunately, you can ask the convert.py utility to apply a three-dimensional Gaussian blur to the lookup table during conversion, which reduces the noise. To do this, install the SciPy package (included in the default macOS Python distribution).
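Conceptually, the smoothing looks like the sketch below: treat the LUT as a 4-D array of shape (25, 25, 25, 3) and blur along the three color axes only. The sigma value and the synthetic noise are my own choices for illustration, not necessarily what convert.py uses:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Identity 3D LUT of shape (25, 25, 25, 3) plus synthetic noise,
# imitating the noise picked up from a filtered Hald image.
v = np.linspace(0.0, 1.0, 25)
lut = np.stack(np.meshgrid(v, v, v, indexing="ij"), axis=-1)
noisy = lut + np.random.default_rng(0).normal(scale=0.05, size=lut.shape)

# Blur the three color axes; sigma=0 on the last axis keeps the
# R, G, and B channels independent of each other.
smooth = gaussian_filter(noisy, sigma=(1.0, 1.0, 1.0, 0), mode="nearest")

# Smoothing brings the table closer to the clean one.
print(np.abs(smooth - lut).mean() < np.abs(noisy - lut).mean())  # True
```

Too much blur would wash out the filter's own gradations, so the amount of smoothing is a trade-off.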
# The next line must be executed only once
$ pip install scipy
$ ./bin/convert.py raw/15.Hudson.jpg halds/ --smooth
$ gm convert girl.jpg -hald-clut halds/15.Hudson.png girl.15.fixed.jpg
As you can see, all the unpleasant effects are gone. You can find the other options of convert.py by running ./bin/convert.py --help.
Good luck with reverse engineering!