Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Would you buy a car that minimizes harm or one that saves you at all costs?" Th… (`ytc_UgxzrJFHt…`)
- These Ai "artists" are going to get a really rude awakening when real artists st… (`ytc_UgxJ85M74…`)
- The big money is in fear mongering. AI has been here almost 4 years and all it's… (`ytc_Ugwf0RiXv…`)
- Calling people by a color is racist as shit. And this is the material the AI is … (`ytc_UgxwVtZoD…`)
- It took me to notice the title to realize you were talking about AI. Everything… (`ytc_UgzvMQKQn…`)
- Facial recognition should not be used as a determining factor for criminal ident… (`ytc_UgyDigt86…`)
- Yeah, you and the other ai overlords have been quietly developing ai, stealing o… (`ytc_Ugzpdhoki…`)
- It's also a question of the business side of things even care about developer pr… (`rdc_n3l0fwp`)
Comment
why do we need this? because human is hard to control. but in a analysis, human live because it has a goal an objective to accomplish, human doesn't move on with life just to live but to live a life with purpose. it maybe your child or your self happiness that is why you move on in search for purpose. what would an ai exist what its goal, its purpose, whatever it is , it will search for knowledge to accomplish its existence and all the knowledge it needs is in the internet of things.... As human we have rules and commandments but in ai who will command them and rule over them??
youtube · AI Moral Status · 2018-01-05T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwVAclIDuHT5CTXbdl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzrkPEY-86_Y_0nrDV4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyw--MCZrsl_vk1Xqt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzosAljg6dBUJe0Bbt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwmvHua46D7Z42hcsJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyQNtj2UZvi7pgL60V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxtIS7f0Bt_U-jRWcl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyZP_sUA4A6DHslypR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwXyccTO3KjCprBK4h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyuy63MgYtkUwJuHZB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
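A raw response like the one above has to be parsed and validated before its records can populate a coding table. The sketch below shows one minimal way to do that in Python: it parses the JSON array, checks each record against the per-dimension value sets, and skips anything malformed. The allowed values are inferred from the sample response only; the project's full codebook may define additional categories, and `parse_llm_response` is a hypothetical helper name, not part of the pipeline shown here.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# The project's actual codebook may include categories not seen here.
ALLOWED = {
    "responsibility": {"none", "distributed", "unclear", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "regulate", "unclear", "ban", "liability"},
    "emotion": {"indifference", "fear", "mixed", "resignation",
                "outrage", "approval"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and keep only valid records.

    A record is valid when it is a dict with an "id" field and every
    coded dimension holds a value from its allowed set. Unparseable
    responses yield an empty list rather than raising.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return []  # model output was not valid JSON
    if not isinstance(records, list):
        return []
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # no comment ID to key the record by
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example: one well-formed record (hypothetical ID) passes validation.
raw = ('[{"id":"ytc_x","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear",'
       '"emotion":"mixed"}]')
print(len(parse_llm_response(raw)))  # 1
```

Indexing the surviving records by `id` (e.g. `{r["id"]: r for r in valid}`) would then support the comment-ID lookup described at the top of this section.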