Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
19:45 Cory actually making AI horny is one of the best things i've ever seen…
ytc_Ugy2Y9ngd…
Blue blood my ass my friend drew stick figures with those goofy eyes but now she…
ytc_UgyXUGwXy…
Me when i hear a robot voice on the other end of the line: "I wanna talk to a hu…
ytc_Ugyxk30w0…
Fail safes need to be built into AI to allow us to always control, guardrails et…
ytc_UgxSTW5_4…
This is the industrial revolution of our times.
Creative manual labor will slow…
rdc_lgsglqw
@GeretTGH All your retoric is "Good point, BUT..." followed by some indirect bac…
ytr_UgzZ93FKC…
If shooting up school children didnt make America ban guns, it is extremely unli…
ytc_UgwYgHj5G…
I know how pieces move and still blunder some if I'm not focused, or distracted.…
ytr_Ugy2gU0N5…
Comment
Has an ai ever willfully murdered a person yet?
The answer is "no".
When a factory worker gets wrapped up in a high speed steel coiler, did the factory kill the worker, or was the environment constrained in such a way to allow an outcome?
Perhaps the factory worker jumped into the steel coiler.
The factory itself doesn't have agency.
youtube
AI Governance
2025-07-23T21:1…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzNPmyw_laJcS7Nn1x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwKzSGeodaU59sfUMJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyw6VE8mNO5kszjx_l4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyA2vdV8zHEs4V8kZB4AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwEgH-RODTwgNnD1bJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxNpApf6JEQzAC-zrR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw3nEHyR98K17toanB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxjFRMbUz68OXxTukx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxtaMXr7A0YSx1PuIV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwM2yTT6x7JmiKonnl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
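The raw response above is a JSON array of per-comment codes across four dimensions. A minimal sketch of how such a response could be parsed and validated before storage — note the allowed value sets below are inferred only from the sample shown here, so the actual codebook may contain additional categories:

```python
import json

# Allowed code values, inferred from the sample response above.
# The real codebook may define more categories (an assumption).
SCHEMA = {
    "responsibility": {"company", "government", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting
    any value outside the known schema."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        codes = {}
        for dim, allowed in SCHEMA.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
            codes[dim] = value
        coded[cid] = codes
    return coded
```

Validating at parse time means a malformed or hallucinated label from the model surfaces as an error immediately, rather than silently entering the coded dataset.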