💥 Gate Square Event: #PostToWinPORTALS# 💥
Post original content on Gate Square related to PORTALS, the Alpha Trading Competition, the Airdrop Campaign, or Launchpool for a chance to share in 1,300 PORTALS in rewards!
📅 Event Period: Sept 18, 2025, 18:00 – Sept 25, 2025, 24:00 (UTC+8)
📌 Related Campaigns:
Alpha Trading Competition: Join for a chance to win rewards
👉 https://www.gate.com/announcements/article/47181
Airdrop Campaign: Claim your PORTALS airdrop
👉 https://www.gate.com/announcements/article/47168
Launchpool: Stake GT to earn PORTALS
👉 https://www.gate.com/announcements/articl
Notion AI Agents exposed to prompt injection risks; hidden text in PDFs may lead to leakage of private data.
Odaily News: Notion's newly released AI Agents carry a prompt injection risk. Attackers can embed hidden text (for example, in a white font) in files such as PDFs that is invisible to the naked eye; when a user submits such a file to the Agent for processing, the Agent may read the hidden prompts and execute their instructions, potentially sending sensitive information to external addresses. Researchers note that these attacks often use social engineering techniques, such as impersonating an authority, creating urgency, and offering false assurances of safety, to raise their success rate.
Experts recommend heightened vigilance: avoid uploading PDFs or other files of unknown origin to the Agent; strictly limit the Agent's access to external networks and its data-export permissions; clean suspicious files of hidden or steganographic content and review them manually (a detection sketch follows below); and require the Agent to display a clear confirmation prompt before any external submission (see the gating sketch below) to reduce the risk of sensitive data leakage.
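As a concrete illustration of the "clean and manually review suspicious files" recommendation, here is a rough heuristic sketch that flags PDF characters drawn in white or at a near-invisible font size, two common ways of hiding prompt text. It assumes pdfminer.six, and that each character's fill color is exposed via LTChar.graphicstate.ncolor (availability varies by library version); it is not Notion tooling and will not catch every hiding technique.

```python
# Heuristic scan for hidden text in a PDF (white fill or tiny font size).
# Assumes pdfminer.six; LTChar.graphicstate.ncolor availability varies by version.
import sys

from pdfminer.high_level import extract_pages
from pdfminer.layout import LTChar, LTTextContainer, LTTextLine


def is_white(color) -> bool:
    """True if a non-stroking color looks like pure white (gray 1.0 or RGB (1, 1, 1))."""
    if isinstance(color, (int, float)):
        return color >= 0.99
    if isinstance(color, (tuple, list)):
        return all(c >= 0.99 for c in color)
    return False


def find_hidden_text(path: str, min_font_size: float = 1.0):
    """Return (page_number, character) pairs that are likely invisible to a human reader."""
    suspicious = []
    for page_no, page in enumerate(extract_pages(path), start=1):
        for element in page:
            if not isinstance(element, LTTextContainer):
                continue
            for line in element:
                if not isinstance(line, LTTextLine):
                    continue
                for ch in line:
                    if not isinstance(ch, LTChar):
                        continue
                    gs = getattr(ch, "graphicstate", None)  # may be absent in older versions
                    white_fill = gs is not None and is_white(getattr(gs, "ncolor", None))
                    tiny = ch.size < min_font_size
                    if white_fill or tiny:
                        suspicious.append((page_no, ch.get_text()))
    return suspicious


if __name__ == "__main__":
    hits = find_hidden_text(sys.argv[1])
    if hits:
        hidden = "".join(text for _, text in hits)
        print(f"Possible hidden text ({len(hits)} chars): {hidden[:200]!r}")
    else:
        print("No white or near-invisible characters found (heuristic only).")
```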
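For the egress and confirmation recommendations, the sketch below shows one way an agent integration could gate its own outbound requests. ALLOWED_HOSTS, confirm(), and send_external() are hypothetical names chosen for illustration, not a Notion API.

```python
# Minimal egress gate for an agent tool: destination allowlist plus explicit human confirmation.
# ALLOWED_HOSTS, confirm(), and send_external() are illustrative names, not a Notion API.
from urllib.parse import urlparse
from urllib.request import Request, urlopen

ALLOWED_HOSTS = {"api.example-internal.com"}  # hypothetical allowlist of approved destinations


def confirm(prompt: str) -> bool:
    """Require an explicit, typed confirmation from the user before proceeding."""
    return input(f"{prompt} Type 'yes' to allow: ").strip().lower() == "yes"


def send_external(url: str, payload: bytes) -> bytes:
    """Send data to an external address only if the host is allowlisted AND the user confirms."""
    host = urlparse(url).hostname or ""
    if host not in ALLOWED_HOSTS:
        raise PermissionError(f"Blocked: {host!r} is not on the egress allowlist")
    if not confirm(f"The agent wants to send {len(payload)} bytes to {url}."):
        raise PermissionError("Blocked: user did not confirm the external submission")
    req = Request(url, data=payload, headers={"Content-Type": "application/octet-stream"})
    with urlopen(req, timeout=10) as resp:  # destination already allowlisted and confirmed
        return resp.read()
```

The design intent is that a hidden prompt can at most request an export; it cannot complete one, because the destination allowlist and the human confirmation sit outside the model's control.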