Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_Ugys3GiXm…: The So Called "GodFather of AI" has no idea how AI works, or the progression of …
- ytc_UgxDDNa1Z…: The thing with the "catering to the ultra rich" endgame is even this kill itself…
- ytc_UgwnGcgHX…: Everyone laughs when that robot says ok I will destroy humans. That is reality o…
- ytc_Ugz8zJOWo…: 0:00 I never knew what that robot carrying butter refernce to but now I see I wa…
- ytc_UgyjEY6V9…: This is totalitaire regime. Communisme mixed with Data and AI is super dangerous…
- ytc_UgwdckJuf…: The jobs act during trumps first presidency gave huge tax breaks to companies to…
- ytr_UgwUt4kC6…: @thewannabecritic7490 In other words I wrote my own original lyrics, my own orig…
- ytc_Ugx7CfOxR…: A book published in 1995 was 'The End of Work', by Jeremy Rifkin, was prescient …
Comment
This is an instruction to think of the most scary scenario they can think of, not the most plausible scenario - which I think even AI can't predict. It really isn't helpful that even looking up reassurance there's more fearmongering but logically people have a history of freaking the f out over new technology and in 2008 they were convinced the large hadron collider spelt the apocalypse.
youtube · AI Harm Incident · 2024-12-15T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgycGiuqvc5c7ql1ojZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy1KmVh96QGyNV1Npt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz5GoUB1m9lepg_ryN4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw5yyoNzu5Fv1P8yLx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzSB8SzsH3B1Zpgyex4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwMCVXHp3bWQyy1OVN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx8KTFufcIdEej7C3J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzB3uaWkkykAy3f3I54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyFZ9_3LgK_0MBZ4GF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyz2ECSYJDeFaowWWF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
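The raw response is a JSON array of per-comment codes with the same four dimensions as the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of the "look up by comment ID" step, assuming only the structure shown above (the `index_by_comment_id` helper name is illustrative, not part of the tool):

```python
import json

# Two records copied verbatim from the raw model output shown above.
raw_response = '''
[
{"id":"ytc_UgycGiuqvc5c7ql1ojZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugyz2ECSYJDeFaowWWF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
'''

def index_by_comment_id(response_text: str) -> dict:
    """Parse the model's JSON array and key each coded record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_comment_id(raw_response)
record = codes["ytc_Ugyz2ECSYJDeFaowWWF4AaABAg"]
print(record["responsibility"], record["emotion"])  # user resignation
```

The dictionary keyed by comment ID makes the inspect-by-ID lookup a constant-time operation, which matches how the coded dimensions for a single comment (as in the Coding Result table) can be pulled out of a batch response.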