bug-bounty (529)
xss (292)
rce (171)
google (143)
account-takeover (120)
bragging-post (118)
exploit (118)
facebook (117)
privilege-escalation (107)
malware (103)
microsoft (99)
open-source (94)
authentication-bypass (90)
csrf (89)
cve (82)
access-control (76)
stored-xss (75)
ai-agents (65)
web-security (64)
phishing (63)
reflected-xss (63)
writeup (56)
input-validation (52)
reverse-engineering (51)
ssrf (51)
sql-injection (50)
cross-site-scripting (50)
information-disclosure (49)
smart-contract (49)
defi (48)
tool (48)
api-security (46)
apple (45)
ethereum (45)
privacy (45)
vulnerability-disclosure (44)
opinion (39)
browser (39)
web-application (38)
ai-security (38)
llm (38)
web3 (37)
burp-suite (37)
remote-code-execution (36)
automation (36)
race-condition (36)
responsible-disclosure (35)
supply-chain (35)
dos (34)
oauth (34)
0
2/10
opinion
A philosophical essay arguing that complex systems (such as climate, economics, and human language) require billion-parameter AI models as their theories, because their minimal compressed description is simply very large, unlike the elegantly compact theories that sufficed for merely complicated systems. The author contends that modern deep learning finally provides the tools to operationalize theories of complex phenomena that were previously beyond reach.
Sean Linehan
Santa Fe Institute
David Deutsch
Noam Chomsky