Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "because Robots cannot fight back and the next generations of work morality has d…" (ytc_UgzSCV5qM…)
- "I don't understand one thing. Why is he referring to Google as a person ? Does h…" (ytc_Ugwkw47n7…)
- "Elon wants to further his Artificial intelligence agenda (which by his own words…" (ytc_UgxUgpEm4…)
- "This is scary… when one robot learns something new it will be sent to the AI clo…" (ytc_UgzXZpnx5…)
- "Anyone heard that life imitates art? What "art" have you watched or read about…" (ytc_Ugy_SBrEQ…)
- "Encouraging curiosity may be a better way to align hypothetical Human-like AI. …" (ytr_UgwSAEAsi…)
- "Imagine billions of jobless adults sitting around doing I don't know what? Playi…" (ytc_UgwptMO9k…)
- "Wow and it's so hard to tell 😅 You can really see the diff of a 175k grands robo…" (ytc_UgzXaegM9…)
Comment
at 1:05:00 he says that the little robot would become scared... but that's just a state of being it entered as a response to a certain circumstance. I guess its coded to preserve its own life and is increasing its chance to survive by running away, but without the emotional aspect is it actually scared of not existing anymore? Not without consciousness its not.
youtube · AI Governance · 2025-08-12T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
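Each coding is a flat record over the four dimensions in the table above. A minimal sketch of checking such a record, assuming the value sets inferred from the responses shown on this page (not an authoritative codebook — the function and variable names are illustrative):

```python
# Hypothetical validator for one coding record. The allowed value sets are
# inferred from the sample LLM responses on this page, not from a codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "distributed",
                       "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist",
                  "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"approval", "mixed", "outrage", "fear", "resignation",
                "indifference"},
}


def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record looks well-formed."""
    problems = []
    # Comment IDs on this page start with "ytc_" (comments) or "ytr_" (replies).
    if not str(record.get("id", "")).startswith(("ytc_", "ytr_")):
        problems.append("missing or unrecognized comment id")
    for dim, allowed in ALLOWED.items():
        if record.get(dim) not in allowed:
            problems.append(f"bad value for {dim}: {record.get(dim)!r}")
    return problems


print(validate({"id": "ytc_UgzZy50m2phuiJXS6T14AaABAg",
                "responsibility": "developer", "reasoning": "deontological",
                "policy": "unclear", "emotion": "mixed"}))  # []
```

A record that passes returns an empty problem list; anything off-schema is reported per dimension, which makes batch QA of the raw responses straightforward.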
Raw LLM Response
```json
[
{"id":"ytc_Ugz9iLS6a4yte2KkQ-t4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugz0TlxiTT4zV8WfL-p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzYRN_PSLqvkirF0Yd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy5gP0gu3S7XlbEosF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwWBASr7EhRxxGmOUJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxtzUUB9CbMYxRkOVl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx5Axv5hOcI6f0Vls94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz6OFuRM0BzVOvt6Xd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzZy50m2phuiJXS6T14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz2Lsq2mbbzSbozLNV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
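The raw response is a JSON array of per-comment records, so looking a coding up by comment ID reduces to parsing the array and indexing it. A minimal sketch, using two records copied from the response above (the variable names are illustrative, not from any specific codebase):

```python
import json

# Hypothetical sketch: parse a raw batch response like the one above and
# index the codings by comment ID for fast lookup.
raw_response = """
[
  {"id": "ytc_Ugz9iLS6a4yte2KkQ-t4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgzZy50m2phuiJXS6T14AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"}
]
"""

# Build an id -> record map; one dict pass replaces a linear scan per lookup.
codings = {record["id"]: record for record in json.loads(raw_response)}

# Look up the coding for a single comment by its full ID.
coding = codings["ytc_UgzZy50m2phuiJXS6T14AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer mixed
```

Note that the lookup needs the full comment ID; the truncated IDs shown in the sample list above are display previews only.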