Artificial images: seeing is no longer believing

Loom.ai can generate a 3D avatar from a single image

“Pics or it didn’t happen” – it’s a common retort when someone tells a tale that sounds exaggerated. Usually, supplying a picture or video of the event is enough to convince your audience that you’re telling the truth. However, we’ve been living in the age of Photoshop for a while now, and it has (or really should have!) become a habit to check Snopes and other sites before believing even simple images [1] – they even have a tag for images debunked because of photoshopping.

It doesn’t even take sophisticated image manipulation skills – a genuine photograph overlaid with out-of-context text can quickly become a viral meme. Pervasiveness is easily mistaken for fact, particularly on social media.

Most people will see a photograph and believe it – it takes real effort to approach every image with scepticism and analyse whether it has been digitally altered. Mostly we rely on credible sources to do this checking for us, but even the media and individuals we respect can make mistakes.

Right now, we only have to worry about a relatively small number of images created to support otherwise unbelievable stories. We are moving beyond photographs serving as the “extraordinary evidence required for extraordinary claims”. However, this is only going to get worse.

I’ve blogged before about machine learning enabling image creation from text descriptions. More recently, there have been developments in generating 3D avatars from a single image of the subject, with startup Loom.ai attracting significant funding. This is an interesting step forward. Why would paparazzi stalk celebrities when, from a single publicity still, they can place that individual in any situation to support their stories? A combination of image generation, avatar creation, smoothing techniques and scene description with machine learning will make waiting for that perfect shot redundant. More importantly, it will become easier and easier for people to be fooled by fabricated evidence.

Don’t believe something just because it confirms your own ideas – think critically about all information presented to you. Bear in mind that images and videos can be manipulated online just as easily as text, and the manipulation can be very difficult to spot. Machine learning is already writing articles, and it will very soon be adding images to them without human intervention – moving from stock photography to completely artificially generated images.

As information overload grows, we need to take a step back and rely on our intelligence and senses rather than simply accepting digital information as truth.

[1] Let alone the more “out there” claims of supersized insects, mythical beasts, etc.
