Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "AI is not the problem, people are. Or, more specifically , greed, stupidity and …" (ytc_UgwyTGYMt…)
- "BLESS 💥🌈💥💥 ALL ♾️ NEGATIVITY 🖤🖤🖤 💞 WITH 💞 LOVE 💝💝💝 BECAUSE 🎶 NEGATIVITY 🖤 IS PAR…" (ytc_UgyTpFj8I…)
- "Regarding consciousness in AI: essentially, it’s the same neurons and electrical…" (ytc_Ugxkwb29c…)
- "I believe AI would examine every input whereas humans might miss or not consider…" (rdc_i2vhhec)
- "That's really interesting. What other dater are they scrapping? And for what oth…" (ytr_Ugz9lhHrf…)
- "An original work inspired by other art is different than a derivative work train…" (ytc_UgxWt-DmV…)
- "I feel like a good use for generative ai would be games. Imagine a almost infini…" (ytc_UgzEk5GLm…)
- "If it was up to me to destroy AI servers, I would do it tbh…" (ytr_UgztSfi-Q…)
Comment
"By early 2030, the robot economy has filled up the old SEZs, the new SEZs, and large parts of the ocean. The only place left to go is the human-controlled areas [...] Eventually it finds the remaining humans too much of an impediment: in mid-2030, the AI releases a dozen quiet-spreading biological weapons in major cities, lets them silently infect almost everyone, then triggers them with a chemical spray. Most are dead within hours; the few survivors (e.g. preppers in bunkers, sailors on submarines) are mopped up by drones." The report says mid-2030, not mid-2030s!
youtube · AI Governance · 2025-08-02T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
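Each coded comment carries four dimensions (Responsibility, Reasoning, Policy, Emotion) drawn from small label sets. The sketch below is a hypothetical validator, not part of the coding tool itself; the label sets are only those observed in the sample output on this page, and the full codebook may include more values.

```python
# Label sets observed in the sample output on this page.
# Hypothetical helper for illustration; the real codebook may be larger.
OBSERVED_LABELS = {
    "responsibility": {"none", "user", "ai_itself", "distributed",
                       "government", "developer"},
    "reasoning": {"unclear", "virtue", "consequentialist", "mixed"},
    "policy": {"unclear", "none", "liability", "ban"},
    "emotion": {"indifference", "resignation", "fear", "outrage", "approval"},
}

def validate_coding(row: dict) -> list[str]:
    """Return the dimension names whose value falls outside the observed set."""
    return [dim for dim, allowed in OBSERVED_LABELS.items()
            if row.get(dim) not in allowed]

# The coding shown in the table above passes cleanly:
print(validate_coding({"responsibility": "ai_itself",
                       "reasoning": "consequentialist",
                       "policy": "unclear",
                       "emotion": "fear"}))  # []
```

A check like this is useful because LLM-coded batches occasionally emit labels outside the codebook, and flagging them per dimension makes re-coding targeted rather than wholesale.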
Raw LLM Response
```json
[
  {"id": "ytc_UgytdS5JtwheHqMQR914AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyZ3qAi5RCMPYFwHuR4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyBfFrYpwz6dasvpMN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzJdRJ6mfAXxzQzv-14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugyg9GCT1L8sAM-A9kd4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzhUyiizREp60-Df-l4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz_AJnCsHp923jfsT54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyGo-GdkXrRb_XARTd4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugw3SC1FFBRnf-V_5l94AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgzGnX2hijTt0phypn94AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]
```
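Because the raw response is a JSON array keyed by comment ID, the "look up by comment ID" feature reduces to parsing the array and building an index. A minimal sketch, assuming the response shape shown above (the two-element excerpt here is hypothetical sample data, not the tool's actual pipeline):

```python
import json

# Hypothetical excerpt of a raw LLM batch response, in the same shape
# as the full array shown above: one object per coded comment.
raw_response = '''[
  {"id": "ytc_UgyBfFrYpwz6dasvpMN4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugyg9GCT1L8sAM-A9kd4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "liability", "emotion": "outrage"}
]'''

codings = json.loads(raw_response)

# Index by comment ID so any coded comment can be fetched in O(1).
by_id = {row["id"]: row for row in codings}

record = by_id["ytc_UgyBfFrYpwz6dasvpMN4AaABAg"]
print(record["emotion"])  # fear
```

Keeping the raw array alongside the index preserves the model's original output order, which matters if a batch needs to be replayed or diffed against a re-coding run.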