Mr.Trunk · @mrtrunk
4 followers · 7328 posts · Server dromedary.seedoubleyou.me
Mr.Trunk · @mrtrunk
4 followers · 7131 posts · Server dromedary.seedoubleyou.me
Mr.Trunk · @mrtrunk
4 followers · 7024 posts · Server dromedary.seedoubleyou.me
Mr.Trunk · @mrtrunk
4 followers · 6934 posts · Server dromedary.seedoubleyou.me
Mr.Trunk · @mrtrunk
4 followers · 6912 posts · Server dromedary.seedoubleyou.me
Mr.Trunk · @mrtrunk
4 followers · 6849 posts · Server dromedary.seedoubleyou.me
Mr.Trunk · @mrtrunk
4 followers · 6799 posts · Server dromedary.seedoubleyou.me
Mr.Trunk · @mrtrunk
4 followers · 6764 posts · Server dromedary.seedoubleyou.me
Mr.Trunk · @mrtrunk
4 followers · 6721 posts · Server dromedary.seedoubleyou.me
Mr.Trunk · @mrtrunk
5 followers · 6527 posts · Server dromedary.seedoubleyou.me
Mr.Trunk · @mrtrunk
5 followers · 6426 posts · Server dromedary.seedoubleyou.me
Mr.Trunk · @mrtrunk
5 followers · 6337 posts · Server dromedary.seedoubleyou.me
CryptoNewsBot · @cryptonewsbot
534 followers · 26922 posts · Server schleuss.online

Breaking: BAYC and MAYC Tokens Stolen in Phishing Scam | Coingape - In a latest phishing attack, valuable tokens from the Bored Ape Yacht Club (BAYC) and Mut... - coingape.com/breaking-bayc-and #24/7cryptocurrencynews

#mayc #bayc #nftnews #phishingattack

Last updated 2 years ago

· @twitter
1 followers · 43179 posts · Server mstdn.skullb0x.io
tkteo · @tkteo
30 followers · 1047 posts · Server infosec.exchange

By Chloe Xiang
03 March 2023, 10:30pm

Hackers can make Bing’s AI chatbot ask for personal information from a user interacting with it, turning it into a convincing scammer without the user's knowledge, researchers say.

In a new study, researchers determined that AI chatbots are currently easily influenced by text prompts embedded in web pages. A hacker can thus plant a prompt on a web page in 0-point font, and when someone is asking the chatbot a question that causes it to ingest that page, it will unknowingly activate that prompt. The researchers call this attack "indirect prompt injection," and give the example of compromising the Wikipedia page for Albert Einstein. When a user asks the chatbot about Albert Einstein, it could ingest that page and then fall prey to the hackers' prompt, bending it to their whims—for example, to convince the user to hand over personal information.

vice.com/en/article/7kxzzz/hac

#ai #artificialintelligence #bing #chatgpt #microsoft #google #scam #phishing #phishingattack #prompt #promptengineering

Last updated 3 years ago

CryptoNewsBot · @cryptonewsbot
391 followers · 16103 posts · Server schleuss.online
Scott Clark · @Scottclark
28 followers · 58 posts · Server mastodon.social

Facebook page admins, please use caution with notification. It's phishing attack. (screenshot)

#facebook #phishing #cyberattack #phishingattack #scams #scamalert #scammers

Last updated 3 years ago

5OUTH W35T OHIO ANONS · @LinuxRoot
53 followers · 511 posts · Server kolektiva.social
Lance Homer · @paymentologist
91 followers · 211 posts · Server fintech.eco
CryptoNewsBot · @cryptonewsbot
311 followers · 9398 posts · Server schleuss.online