As artificial intelligence (AI) continues to evolve, we gain a deeper understanding of its capabilities and of how it could be used in our daily lives. However, not all AI products are beneficial; some could do more harm than good.

For instance, an AI camera dubbed "NUCA," made by two German artists, can digitally strip away a person's shirt or any other clothing in real time. It may seem like a magic editing tool for creatives, but it also points to the dark side of how far AI can go.

What Can NUCA Do?

(Photo: Benedikt Groß)
With deepfake photos alarming the public, the usefulness of AI is overshadowed by its dangers, as in the case of the NUCA camera, which removes a subject's clothing in an instant.

Artists Mathias Vef and Benedikt Groß created NUCA to highlight how AI technology can be misused to infringe on personal privacy. The camera, developed with 3D design tools, uses a smartphone to capture images, which are then processed by AI algorithms in the cloud.

According to Fox News, the AI reconstructs what it predicts a person's naked body would look like based on data about their gender, face, age, and body shape. What is especially concerning is the ease and speed with which this can be done: NUCA requires minimal technical knowledge and completes its process in about 10 seconds.

Related Article: AI-Altered Nude Photos of High School Students Alarm Parents in Canada

The Rise of Deepfake Technologies

While deepfake imagery, particularly nude photos of celebrities, has circulated on adult websites for quite some time, NUCA is a notable breakthrough in terms of accessibility and efficiency.

Traditional methods for creating fake nudes involve sophisticated editing skills and significant time investment. In contrast, NUCA democratizes this ability, reducing the time and skill barrier to just seconds and a simple operation, thus amplifying the potential for misuse.

NUCA's Potential for Misuse and Ethical Concerns

The immediate concern with technologies like NUCA is the potential for harm. Because it can digitally "undress" someone without their consent, the technology easily lends itself to malicious uses such as blackmail or cyberbullying. The artists hope that by exposing this capability, they will ignite a public debate on the ethical trajectory of AI development. However, while they do not intend for NUCA to be used commercially, the technology it represents could be replicated and used unethically by others.

AI's Impact on Society

The broader implications of such technologies are profound. As AI capabilities grow, so does their potential to disrupt social norms and personal security. 

Deepfakes are becoming increasingly realistic, blurring the lines between truth and digital fabrication. This progression threatens to complicate legal and social frameworks, making it challenging to distinguish between real and fabricated content.

It has even reached the point where pedophiles are turning to AI to create deepfake nudes of children that they can use for extortion.

The Brighter Side of AI Cameras

Despite the commotion over AI cameras producing deepfake images, these devices were created with positive purposes in mind.

For instance, Canadian telecom company Rogers uses AI cameras to monitor and help prevent wildfires brought on by climate change.

Even the French police are exploring the use of AI cameras to spot dangerous activities that might disrupt the upcoming 2024 Paris Olympics.

Read Also: UK Looks to Crack Down on Sexually Explicit Deepfake Images

Joseph Henry

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.