Given my background in computer graphics, I was able to easily establish 462 points of correspondence between the two images in Autodesk Maya by creating a polygonal mesh composed of points and triangles. The triangle topology was laid out manually, drawing on my previous experience in character modeling. I then exported the geometry as an OBJ file and read it into my Python script.
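Reading the exported geometry back into Python can be sketched as a small OBJ parser. This is a minimal, illustrative version: the assumption that only the x and y coordinates of each `v` line are needed (the mesh is flat in image space) is mine, not stated in the original pipeline.

```python
import numpy as np

def load_obj_points(path):
    """Parse 'v x y z' vertex lines from a Wavefront OBJ file,
    keeping only the x and y coordinates (hypothetical helper)."""
    points = []
    with open(path) as f:
        for line in f:
            if line.startswith("v "):          # vertex position line
                _, x, y, z = line.split()[:4]
                points.append((float(x), float(y)))
    return np.array(points)                    # shape (N, 2)
```

The face (`f`) lines of the OBJ would be parsed the same way to recover the triangle topology.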
One of the goals of this project is to learn the low-level computer graphics involved in morphing between the faces of two people. Since I already had the 3D meshes and textures in Maya, it was fairly trivial to produce the final face-morphing result within the software, which I could then use as a target to compare my results against. Here are the results of the morph between Gael Garcia Bernal and myself.
Here we show the points of correspondence between the two faces.
We start with linear transformations of point positions, since those are the easiest. For example, a rotation by an angle θ (such as 45°) can be represented by the following 2D rotation matrix: $$ \begin{bmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} x \cos \theta - y \sin \theta \\ x \sin \theta + y \cos \theta \end{bmatrix} $$
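The rotation matrix above translates directly into NumPy; here is the 45° case applied to a point on the x-axis:

```python
import numpy as np

theta = np.deg2rad(45)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

p = np.array([1.0, 0.0])      # point on the x-axis
p_rotated = R @ p             # [cos 45°, sin 45°] ≈ [0.707, 0.707]
```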
The goal of this section is to create a halfway point between my face and Gael's face. We start by computing the correspondences for both faces, then average the x and y coordinates of the two point sets to create the geometry of the midway face.
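The averaging step is a one-liner once both point sets are (N, 2) arrays; writing it with a weight `t` also anticipates the morph sequence later, where `t` sweeps from 0 to 1:

```python
import numpy as np

def midway_shape(pts_a, pts_b, t=0.5):
    """Weighted average of corresponding points; t=0.5 gives the midway face."""
    return (1 - t) * pts_a + t * pts_b
```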
Then, for each triangle in the mesh, we compute the inverse warp and reassemble the result into an image using polygon masking. I used bilinear_interpolate to get the pixel values for the transformed triangles.
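The two core pieces of that step can be sketched as follows: solving for the affine map that takes a destination triangle back to its source triangle, and a bilinear sampler in the spirit of the `bilinear_interpolate` mentioned above. These are illustrative reconstructions, not the project's exact code.

```python
import numpy as np

def affine_from_triangles(src_tri, dst_tri):
    """Solve for the (2, 3) matrix A such that A @ [x, y, 1]
    maps points of dst_tri onto the corresponding points of src_tri."""
    dst_h = np.hstack([dst_tri, np.ones((3, 1))])   # homogeneous (3, 3)
    return np.linalg.solve(dst_h, src_tri).T        # inverse-warp matrix

def bilinear_interpolate(img, xs, ys):
    """Sample a grayscale image (H, W) at fractional coordinates (xs, ys)."""
    x0, y0 = np.floor(xs).astype(int), np.floor(ys).astype(int)
    x1, y1 = x0 + 1, y0 + 1
    x0 = np.clip(x0, 0, img.shape[1] - 1); x1 = np.clip(x1, 0, img.shape[1] - 1)
    y0 = np.clip(y0, 0, img.shape[0] - 1); y1 = np.clip(y1, 0, img.shape[0] - 1)
    wx, wy = xs - np.floor(xs), ys - np.floor(ys)
    top = img[y0, x0] * (1 - wx) + img[y0, x1] * wx   # blend along x
    bot = img[y1, x0] * (1 - wx) + img[y1, x1] * wx
    return top * (1 - wy) + bot * wy                  # blend along y
```

For each destination triangle, the pixel coordinates inside its polygon mask are pushed through `affine_from_triangles` and then sampled with `bilinear_interpolate`.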
A linear interpolation between the source and target triangles was parameterized by 't', and 45 transition frames were evaluated over the interval. This is the final result of the morphing sequence:
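The sequence can be sketched as a loop over 45 values of `t`: warp both images to the intermediate geometry, then cross-dissolve the warped textures with the same `t`. Here `warp_fn` is a stand-in for the triangle-by-triangle inverse warp described above.

```python
import numpy as np

def morph_sequence(img_a, img_b, pts_a, pts_b, warp_fn, n_frames=45):
    frames = []
    for t in np.linspace(0.0, 1.0, n_frames):
        shape_t = (1 - t) * pts_a + t * pts_b             # intermediate geometry
        warped_a = warp_fn(img_a, pts_a, shape_t)         # warp A into shape_t
        warped_b = warp_fn(img_b, pts_b, shape_t)         # warp B into shape_t
        frames.append((1 - t) * warped_a + t * warped_b)  # cross-dissolve
    return frames
```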
The Danes facial dataset is composed of 40 individuals, each with images and points of correspondence. If we simply average the pixel values across all of the images in this dataset, we see only some very rough structure:
Instead, if we use the points of correspondence for the entire dataset to average the images, we get a much more realistic looking face. Here is an example of the points of correspondence on two individuals:
If we then get the average position for those points of correspondence, warp the texture in the triangles appropriately, and then average the resulting textures for all the people in the dataset, we get a MUCH more realistic looking face:
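Those three steps can be sketched in a few lines: average the point sets to get the mean shape, warp every image onto that shape, and average the warped textures. Again, `warp_fn` stands in for the triangle-based inverse warp from earlier.

```python
import numpy as np

def mean_face(images, point_sets, warp_fn):
    """Average geometry, then average textures warped into that geometry."""
    mean_shape = np.mean(point_sets, axis=0)          # average (N, 2) shape
    warped = [warp_fn(img, pts, mean_shape)
              for img, pts in zip(images, point_sets)]
    return np.mean(warped, axis=0), mean_shape
```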
Here are some examples of what an individual face looks like warped to the mean face:
This is what the texture of my face looks like when warped into the average geometry.
This is what it looks like when you warp the average Danes face into my geometry.
Now that we have a linear interpolation between my face and the average face of the Danes dataset, we can extrapolate from the mean face to create caricatures. Here is an example of what happens when you set the parameter 't' to 1.5 (The morph is most distinctive when the gif transitions back to the beginning):
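Extrapolation reuses the same interpolation formula with `t` outside [0, 1]: at t=1 we recover my shape exactly, and t=1.5 pushes my geometry 50% further away from the mean, exaggerating whatever makes it distinctive.

```python
import numpy as np

def caricature_shape(my_pts, mean_pts, t=1.5):
    """t=1 reproduces my shape; t>1 exaggerates its deviation from the mean."""
    return mean_pts + t * (my_pts - mean_pts)
```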
Using Google's Imagen, I created a small dataset of 5 images of older men:
As before, we found points of correspondence for the dataset:
I then morphed my texture and geometry towards the mean face of the synthetic older dataset. Here are the results: