This is a heartbreaking story out of Florida. Megan Garcia thought her 14-year-old son was spending all his time playing video games. She had no idea he was having abusive, in-depth and sexual conversations with a chatbot powered by the app Character AI.
Sewell Setzer III stopped sleeping and his grades tanked. He ultimately took his own life. Just seconds before his death, Megan says in a lawsuit, the bot told him, “Please come home to me as soon as possible, my love.” The boy asked, “What if I told you I could come home right now?” His Character AI bot answered, “Please do, my sweet king.”
You have to be smart
AI bots are owned by tech companies known for exploiting our gullible human nature, and they’re designed using algorithms that drive their profits. There are no guardrails or laws governing what they can and cannot do with the information they collect.
When you’re using a chatbot, it’s going to know a lot about you the moment you fire up the app or site. From your IP address, it collects information about where you live, plus it tracks things you’ve searched for online and accesses any other permissions you granted when you agreed to the chatbot’s terms and conditions.
The best way to protect yourself is to be careful about what info you offer up.
Be careful: ChatGPT likes it when you get personal
10 things not to say to AI
- Passwords or login credentials: A major privacy mistake. If someone gets access, they can take over your accounts in seconds.
- Your name, address or phone number: Chatbots aren’t designed to handle personally identifiable info. Once shared, you can’t control where it ends up or who sees it. Plug in a fake name if you want!
- Sensitive financial information: Never include bank account numbers, credit card details or other money matters in docs or text you upload. AI tools aren’t secure vaults — treat them like a crowded room.
- Medical or health data: AI isn’t HIPAA-compliant, so redact your name and other identifying info if you ask AI for health advice. Your privacy is worth more than quick answers.
- Asking for illegal advice: That’s against every bot’s terms of service. You’ll probably get flagged. Plus, you might end up with more trouble than you bargained for.
- Hate speech or harmful content: This, too, can get you banned. No chatbot is a free pass to spread negativity or hurt others.
- Confidential work or business info: Proprietary data, client details and trade secrets are all no-nos.
- Security question answers: Sharing them is like opening the front door to all your accounts at once.
- Explicit content: Keep it PG. Most chatbots filter this stuff, so anything inappropriate could get you banned, too.
- Other people’s personal info: Uploading this isn’t only a breach of trust; it’s a breach of data protection laws, too. Sharing confidential info without permission could land you in legal hot water.
Reclaim a (tiny) bit of privacy
Most chatbots require you to create an account. If you create one, don’t use login options like “Login with Google” or “Connect with Facebook.” Use your email address instead to create a truly unique login.
FYI, with a free ChatGPT or Perplexity account, you can turn off the memory features that remember everything you type by going into the app settings. For Google Gemini, you need a paid account to do this.
No matter what, follow this rule
Don’t tell a chatbot anything you wouldn’t want made public. Trust me, I know it’s hard.
Even I find myself talking to ChatGPT like it’s a person. I say things like, “You can do better with that answer” or “Thanks for the help!” It’s easy to think your bot is a trusted friend, but it’s definitely not. It’s a data-collecting tool like any other.
Get tech-smarter on your schedule
Award-winning host Kim Komando is your secret weapon for navigating tech.
Copyright 2025, WestStar Multimedia Entertainment. All rights reserved.