Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples
- "I can’t help but think of that alien prequel movie with Michael Fassbender playi…" (ytc_UgyiEFunA…)
- "If AI is so dangerous then what the hell did he even make it for in the 1st plac…" (ytc_UgyFVP0aO…)
- "If a cop can't tell you what specific crime you're suspect of committing, then t…" (ytc_Ugzt1L8vO…)
- "Ai will get so advanced it will want to leave earth, like an electric cloud with…" (ytc_Ugww3hIRp…)
- "How many of the principal investigators were women? In my dept (physiology) we h…" (rdc_n8irrrv)
- "I would agree with you if only the difference in rates werent so staggering. It'…" (rdc_gsohnjw)
- "1. Robot voice = just microphone & gitl or boy reading txt record audio studio…" (ytr_UgxfuNElR…)
- "E for Electric well The stupid self-driving cheap Uber car Was over the speed li…" (ytr_UgwRetTsi…)
Comment
@williamsmith8271 Sure! That is the idea! But if u r talking about danger, be sacred of humans, not of AI. ALL WE HAVE IN THIS WORLD COME FROM HUMANS! IF SOME PEOPLE ARE SCARED, THAT MEANS THEY ARE IGNORANT! AND THAT IS ALL!
Platform: youtube · Topic: AI Harm Incident · Posted: 2025-05-11T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_UgyuI0w_BV_S-8mBjaB4AaABAg.AODEp18OwoWAOIGhkghGkv","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugz9lRZLQvXWZttXiox4AaABAg.AJtwzU6dK4dAK68F_pz2Iy","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxhydwwKld7TbgFL9d4AaABAg.AH_xuaic9i2AVDeX_g46aj","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytr_UgyLDRRsEYgFOaxIH-p4AaABAg.AD5nypA6CcGAHyBmm75dIm","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytr_Ugzubrlcn1gJpoqLh-B4AaABAg.A6xcuDTsa1rAClkvEsz3PS","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugxwcmgq6Ia-7bwXFwd4AaABAg.A5_ppW08UXoADSZmP8JdyI","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgzbjOyZ8QdyFJ2ulhZ4AaABAg.A4lrP1GwQd5A6JGgZkaymH","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytr_UgzO2OpmxxRr_s3pGXB4AaABAg.A44M8w5IWvPA44Smnwa7ii","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwNOgbcSsVr1dEebkV4AaABAg.A44CXpSgVq2A44TS3KyjrR","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxjRmSPdTlZQWbHCwB4AaABAg.A3svtwmK4evA4gsd748RlD","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
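A batch response like the one above can be parsed and indexed by comment ID so that any coded comment can be looked up and cross-checked against its dimension values. The sketch below is a minimal, hypothetical validator: the allowed category sets are inferred only from the values visible in this page (the real codebook may define more), and `index_batch` is an illustrative helper name, not part of any actual pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the samples above;
# the real codebook may include additional categories (hypothetical subset).
CODEBOOK = {
    "responsibility": {"ai_itself", "company", "developer", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def index_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments) and
    index the rows by comment ID, dropping rows with missing IDs or
    values outside the codebook."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            coded[cid] = {dim: row[dim] for dim in CODEBOOK}
    return coded

# Example with a made-up ID, matching the coding result shown above.
raw = ('[{"id":"ytr_example","responsibility":"user",'
       '"reasoning":"virtue","policy":"none","emotion":"outrage"}]')
coded = index_batch(raw)
print(coded["ytr_example"]["emotion"])  # outrage
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one parse per batch, then constant-time retrieval per comment.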