All they’ll need is this system from Berkeley computer scientist Richard Zhang, which allows a soulless silicon sentience to “hallucinate” colors into any monochrome image.
It uses what’s called a convolutional neural network – a type of computer vision system that mimics low-level visual systems in our own brains in order to perceive patterns and categorize objects.
Trained by examining millions of images of, well, just about everything, Zhang’s network recognizes objects in black-and-white photos and colors them the way it thinks they ought to be.
Grass is usually green, right? So when the network thinks it recognizes grass, it colors that region green.
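To make that flow concrete, here’s a toy sketch in Python. The real system runs a deep CNN over the image’s lightness channel and predicts chrominance values per pixel; everything below (the palette values, the brightness-threshold “classifier”) is a made-up stand-in just to show the shape of the pipeline: grayscale in, per-pixel color guess out.

```python
import numpy as np

# Hypothetical stand-in for CNN colorization. A real network predicts
# chrominance (a, b) for each pixel from the lightness (L) channel;
# here a crude brightness threshold fakes that prediction.

PALETTE = {
    "grass": (-40, 45),   # greenish (a, b) chrominance, values invented
    "sky":   (-5, -35),   # bluish chrominance, values invented
}

def classify_pixel(lightness):
    # Stand-in for the network's per-pixel recognition step.
    return "sky" if lightness > 60 else "grass"

def colorize(L):
    """Take an HxW lightness image (0-100), return an HxWx3 Lab image."""
    H, W = L.shape
    lab = np.zeros((H, W, 3))
    lab[..., 0] = L  # keep the original lightness
    for i in range(H):
        for j in range(W):
            a, b = PALETTE[classify_pixel(L[i, j])]
            lab[i, j, 1] = a
            lab[i, j, 2] = b
    return lab

gray = np.array([[30.0, 80.0],
                 [25.0, 90.0]])
result = colorize(gray)
print(result[0, 0])  # dark pixel gets the "grass" chrominance
```

The point isn’t the threshold, of course; it’s that the color never existed in the input. Every (a, b) value is the model’s guess about what such a region usually looks like.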
In the paper describing the system, Zhang calls this recognition-and-color-assignment process “hallucination,” and really, the term is apt: the network is seeing things that aren’t actually there.
Zhang and his colleagues tested the system’s efficacy by asking people to choose between two color versions of a monochrome image: the original and the fruit of the neural network’s labor.
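Scoring that kind of two-alternative test is simple: count how often participants picked the machine’s version over the real photograph. A quick sketch, with entirely made-up trial data:

```python
# Hypothetical tally for the forced-choice test described above: each
# trial records which version the participant believed was the original.
# The data here is invented for illustration.
choices = ["generated", "real", "generated", "real", "real", "generated"]

# "Fooling rate": fraction of trials where the network's colorization
# was mistaken for the real thing. 50% would mean people are guessing.
fool_rate = choices.count("generated") / len(choices)
print(f"fooling rate: {fool_rate:.0%}")  # prints "fooling rate: 50%"
```

A rate near zero means the fakes are obvious; a rate near 50% means the colorized images are indistinguishable from the originals.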
The paper has plenty of technical detail, but also lots of interesting examples of how and when the system failed, when it was most and least convincing, and all that.