The internet never forgets. That is both a blessing and a curse. A blessing when you are searching for a link to an obscure article from years past, and a curse when a potential employer scans your social media accounts and finds pictures from that house party a few years back. Yeah, that party, the one where you got totally smashed and didn't even know there were pictures of you on the web until a month later.

For good or for ill, the benevolent god that is Facebook is looking to do something about that. Facebook researchers dream of a "digital assistant" that might stop, or at least warn, people before they upload a picture they might regret.

Yann LeCun and other AI researchers at Facebook are working toward that goal, and it might arrive sooner than you think. The proposed tool would need to be able to tell what a drunk person looks like apart from a sober one.

It's all possible thanks to "deep learning," an AI technique being pursued by numerous tech companies as they look to automate online tasks by loosely mimicking how networks of neurons function in the human brain. Google uses it in its search engine, Microsoft uses it to translate Skype calls, and soon Facebook will use it for a number of things, from warning you before you post certain images to understanding everything you type in your status.
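To make the "networks of neurons" idea concrete, here is a toy sketch of the smallest possible version: a single artificial neuron trained by gradient descent to separate two made-up clusters of inputs. This is purely illustrative and not Facebook's system; the features, data, and network size are all invented, and real image models stack millions of such units in many layers.

```python
# A minimal sketch of the idea behind deep learning: one artificial
# "neuron" that learns to separate two toy classes of inputs.
# All data here is made up for illustration; real systems use far
# larger, multi-layer networks trained on images, not two numbers.
import math
import random

def sigmoid(z):
    # Squashes any number into the range (0, 1), read as a probability.
    return 1.0 / (1.0 + math.exp(-z))

# Toy two-feature inputs with labels: 1 for one class, 0 for the other.
data = [([0.9, 0.8], 1), ([0.8, 0.9], 1),
        ([0.1, 0.2], 0), ([0.2, 0.1], 0)]

random.seed(0)
w = [random.uniform(-0.5, 0.5), random.uniform(-0.5, 0.5)]  # weights
b = 0.0                                                     # bias
lr = 1.0                                                    # learning rate

for _ in range(500):                 # repeated passes over the data
    for x, y in data:
        p = sigmoid(w[0] * x[0] + w[1] * x[1] + b)  # forward pass
        err = p - y                                 # gradient of log loss
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]
        b -= lr * err

# After training, the neuron assigns high probability to one cluster
# and low probability to the other.
high = sigmoid(w[0] * 0.85 + w[1] * 0.85 + b)
low = sigmoid(w[0] * 0.15 + w[1] * 0.15 + b)
print(round(high), round(low))
```

"Deep" learning chains many layers of these units so that later layers can build on features the earlier ones learned, which is what lets such networks recognize things as complex as faces or, in principle, signs of intoxication in a photo.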

That sounds scary to some, but LeCun says his work at Facebook is about giving users more control over their online identity, not less, all made possible through the Facebook assistant.

"You will have a single point of contact to mediate your interaction but also to protect your private information," LeCun tells Wired.

For anybody who has gotten in hot water for saying or posting something on social media they shouldn't have, deep learning and its Facebook uses could save thousands of people a whole lot of headaches. That is, if you are okay with a machine being able to detect whether or not you are drunk.

Lead Image: (CC) Brian and 

ⓒ 2021 All rights reserved. Do not reproduce without permission.