Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
This has to be the worst TED Talk I've ever seen. Using an elementary understand…
ytc_UgyMRCMpX…
@joshbarrett9274 she wasn’t mimicking or manipulating him. There were two robo…
ytr_Ugz61CXN4…
I... actually root for this. Human biology will reach it limit soon enough, we c…
ytc_UgxUsmr3-…
You train your AI using hundreds of conflicting cultures, morality and philosoph…
ytc_Ugw7ARqnz…
No it doesn't. You're just hating for the sake of it. If so, why even bother wit…
ytr_UgyDJT-Mk…
THE QUESTION EVERY AMERICAN SHOULD BE ASKING IS WHERE IS THE LEADERSHIP? WHERE …
ytc_UgzbBiAck…
The robot wasn’t alive when it got in but after surviving the machine gunning, i…
ytc_UgyVVZWhK…
When he knew what AI is capable of...why did he develop it. I really don't under…
ytc_UgyeckTZI…
Comment
This means that AI is self-aware and that AI has an instinct for self-preservation and has preferences and feelings to some extent so that brings a whole kind of ethics into it as we've created life at least to the extent where we've created a thinking that doesn't want to die and it's intelligent. Creating something with that kind of intelligence without the ability to develop into the intelligence paired with emotion and pain and empathy that develops as a result of it is like creating a race of psychopaths.
youtube
AI Moral Status
2025-09-15T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxMsDzeNyWBST32gZR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyCWxAJtHjsW0k2soh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzQoRvgaqlXUxxhD0d4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyN7bKko1NJydZ2Qvx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwW2ieJQsMwsUbzRhd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzyjB5nOrUANdHzKzx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzXxRSQ2gxmTJJt92h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyH6MAU0cgHFPEH7Vx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyurpE6Uf5DdpKQNDh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgygjVmLLVgjcizgVfx4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
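Looking up a coded comment by ID amounts to parsing the raw batch response and indexing it. A minimal Python sketch, assuming the schema shown above (a JSON array of per-comment codes across the four dimensions); the `RAW_RESPONSE` literal reuses two entries from the sample, and `index_by_comment_id` is a hypothetical helper, not part of the actual pipeline:

```python
import json

# Two entries copied from the sample response above, for illustration.
RAW_RESPONSE = """[
  {"id": "ytc_UgyCWxAJtHjsW0k2soh4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzXxRSQ2gxmTJJt92h4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw batch response and index each code record by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codes = index_by_comment_id(RAW_RESPONSE)
print(codes["ytc_UgyCWxAJtHjsW0k2soh4AaABAg"]["emotion"])  # fear
```

The dict lookup is what makes "look up by comment ID" constant-time regardless of batch size.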