Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
14:25 "There is no way AI in its current form couldn't make this exact youtube v…
ytc_Ugyjc7otZ…
I understand your perspective! While it's true that AI like Sophia can mimic hum…
ytr_Ugy3f_hLA…
I don't see it taking over trucking. If anything it'll make the truck drivers li…
ytr_UgwdJxdS9…
Ah yes, i love my men with one extre arm and my women with no define able hands …
ytc_Ugydyr8LS…
Veey soin the doims dat will vone frim AI alteady AI learning all bad this human…
ytc_UgxvoID0R…
Wise words.
However good AI gets, people will always work with people who they …
ytr_UgwqOjTKY…
His "proposal" to address the crisis of AI is to make it more expensive to hire …
ytr_UgwWAYmZW…
So I need to stay one step ahead of AI with creative human thinking while also c…
ytc_Ugy-albtm…
Comment
Id like to suggest forming an organization dedicated to pre eradicating (°Trademark) any and all robots that develop any form of consciousness and the sanctioned killing of al, those who even attempt to create a form of robot consciousness: society doesnt need that shit. AI is fine but the moment IT starts wondering about love or asks what its like to "be a real boy" then it gets a bullet and a free close up of a thermite grenade in action . . . . . . . We will be watching.
youtube
AI Moral Status
2020-01-27T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
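The coding table above assigns one value per dimension. A minimal validity check can be sketched from the vocabularies that actually appear in the codings on this page; the real code book is likely larger, so the allowed sets below are an assumption, and `validate_coding` is an illustrative helper, not part of the tool.

```python
# Allowed values per dimension, as observed in the codings shown on
# this page (assumption: the full code book may contain more values).
VOCAB = {
    "responsibility": {"developer", "user", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist",
                  "mixed", "unclear"},
    "policy": {"ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "indifference", "resignation", "mixed"},
}

def validate_coding(coding: dict) -> list[str]:
    """Return the dimensions whose value is missing or falls outside
    the observed vocabulary; an empty list means the coding looks valid."""
    return [dim for dim, allowed in VOCAB.items()
            if coding.get(dim) not in allowed]

# The coding from the table above passes the check.
example = {"responsibility": "developer", "reasoning": "deontological",
           "policy": "ban", "emotion": "outrage"}
print(validate_coding(example))  # []
```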
Raw LLM Response
[
{"id":"ytc_Ugy0gjdV9fhwM-m-7zp4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzaDKLbMavJLtCFuLV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxXjpnmOtWG7JYTYnp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzM7D55jKAMLucgup14AaABAg","responsibility":"unclear","reasoning":"contractualist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzazQbmVdwYL00S9SV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzYSi9YA1viwzY-opV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy4hiZR9Il2L3m5d2t4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxm9eAGlUKJCQJKt5N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyFEwB2G_UoLk2bF_R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxMG6x2mRHVwqZBgvJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
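The raw response above is a JSON array with one coding object per comment, keyed by comment ID. The "look up by comment ID" feature at the top of this page can be sketched as a parse-and-index pass; the function name and the skip-malformed-entries behavior are illustrative assumptions, and the two entries are copied from the response above.

```python
import json

# Two entries copied verbatim from the raw LLM response shown above.
raw_response = """
[
  {"id":"ytc_Ugy0gjdV9fhwM-m-7zp4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzaDKLbMavJLtCFuLV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
"""

EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict[str, dict]:
    """Parse the model output and index codings by comment ID,
    silently skipping entries that lack the expected keys."""
    by_id = {}
    for entry in json.loads(raw):
        if isinstance(entry, dict) and EXPECTED_KEYS <= entry.keys():
            by_id[entry["id"]] = entry
    return by_id

codings = index_codings(raw_response)
# Looking up the inspected comment's ID recovers its coding.
print(codings["ytc_UgzaDKLbMavJLtCFuLV4AaABAg"]["policy"])  # ban
```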