bug-bounty (529)
xss (292)
rce (171)
google (148)
account-takeover (120)
exploit (119)
facebook (118)
bragging-post (118)
privilege-escalation (104)
malware (104)
microsoft (100)
open-source (94)
authentication-bypass (90)
csrf (89)
cve (82)
access-control (75)
stored-xss (75)
ai-agents (65)
web-security (64)
phishing (63)
reflected-xss (63)
writeup (56)
input-validation (52)
reverse-engineering (51)
ssrf (51)
cross-site-scripting (50)
sql-injection (50)
smart-contract (49)
information-disclosure (49)
defi (48)
tool (47)
apple (46)
api-security (46)
ethereum (45)
privacy (44)
vulnerability-disclosure (44)
browser (39)
opinion (39)
llm (38)
web-application (38)
ai-security (37)
web3 (37)
burp-suite (37)
automation (36)
remote-code-execution (36)
race-condition (36)
supply-chain (35)
responsible-disclosure (35)
dos (34)
lfi (34)
0
2/10
opinion
A philosophical essay arguing that complex systems (like climate, economics, and human language) require billion-parameter AI models as theories, because the shortest faithful description of such systems is simply very large, unlike the elegantly compact theories that sufficed for merely complicated systems. The author contends that modern deep learning finally provides the tools to operationalize theories of complex phenomena that were previously out of reach.
Sean Linehan
Santa Fe Institute
David Deutsch
Noam Chomsky