More safeguards needed to protect SC families from AI exploitation
Originally Featured in the South Carolina Daily Gazette
Artificial intelligence is advancing faster than many of our laws can keep up.
While these technologies hold tremendous promise, they are also creating serious risks that policymakers cannot afford to ignore, especially when it comes to the safety of women and children.
One of the most troubling developments is the growing ability of artificial intelligence tools to generate explicit images using real photographs of people without their knowledge or consent.
These images may be fabricated, but the harm they cause is real.
Once created, they can spread quickly online, leaving victims with humiliation, harassment, and long-lasting emotional damage that is nearly impossible to undo. This has become a growing problem in schools and homes across our state. And we are now seeing the consequences.
Here in South Carolina, authorities are investigating a disturbing incident involving several high school girls whose faces were reportedly used to create AI-generated nude images that circulated among students online.
What begins as a digital manipulation can quickly become public humiliation for young victims, spreading across social media and following them long after the images first circulate.
Over the past few months, national reporting has raised growing concerns about AI systems such as GrokAI, the artificial intelligence tool integrated into the X platform.
Investigations have found that the system has at times been capable of generating sexualized imagery involving women and minors with limited safeguards in place.
When tools capable of producing vast numbers of images are released without meaningful guardrails, predictable abuse follows.
For women and girls, the implications are deeply troubling. A simple photograph shared online — whether on social media, a school website, or a family page — can now be manipulated into explicit content in seconds.
Once that content spreads, victims often have little ability to stop it.
Making matters worse, the burden is placed on parents, who must worry that an innocent picture of their child could be turned into sexualized content by a bad actor using an AI system.
Technology companies developing these powerful systems have a responsibility to prevent that kind of exploitation. But when safeguards fail, our leaders must step in.
South Carolina has an opportunity to lead on this issue. Our state has long taken seriously its responsibility to protect families and vulnerable individuals from exploitation, and that commitment must extend into the digital world as new technologies create new threats.
The attorney general’s office plays a critical role in that effort.
As the state’s chief law enforcement authority, the attorney general can evaluate whether emerging technologies are operating within existing laws related to exploitation, obscenity, and consumer protection, and whether enforcement action may be warranted when safeguards fail.
At the same time, state legislators have an opportunity to ensure that South Carolina’s laws keep pace with rapidly evolving technology.
Clearer guardrails and stronger protections can help ensure that powerful AI systems cannot be easily misused to exploit women or target children.
This is not about restricting legitimate innovation or undermining free speech.
South Carolina values both technological progress and constitutional freedoms. But those values do not require us to accept technologies that can be weaponized against our children.
Any of our leaders should recognize there is a clear difference between responsible innovation and tools that can be used to harm others.
The recent incident involving South Carolina high school students should serve as a wake-up call.
Without stronger safeguards and clear accountability, these technologies will continue to be misused, harming young people and families.