Meta unveiled four custom Broadcom-built AI inference chips (MTIA 300/400/450/500) designed for ranking, recommendation, and generative AI workloads, with plans to deploy multiple gigawatts of capacity starting in 2027. The chips use a modular chiplet architecture with RISC-V cores and HBM stacks, and successive generations claim performance competitive with or superior to commercial alternatives such as Nvidia's offerings.
custom-asic
ai-inference
chip-design
meta
broadcom
hardware
chiplet-architecture
hbm-memory
risc-v
genai
data-center
Meta
Broadcom
MTIA 300
MTIA 400
MTIA 450
MTIA 500