Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
We appreciate your interest in the video. While we encourage engaging discussion…
ytr_UgymypvGn…
Maybe, just maybe, if an AI determines something based purely on raw statistics,…
ytc_UgxULtBa9…
Seeing the number of 4 digit suffix's on almost every user ID, id say ai just wa…
ytc_Ugx-dOVY2…
But AI doesn't learn that way, and it doesn't even really know what emotion is, …
ytr_UgxO09W_c…
5:14 BRO IS COPING LIKE HE NEEDS IT TO LIVE. HOW CAN YOU USE AI TO COPY STUDIO G…
ytc_Ugwh6hvuO…
AI has been fed all of humanity's traits and imagination, good and bad, includi…
ytc_UgwV0GVNF…
I kinda figured that robot slam wasn’t survivable! I am curious how the veggies …
ytc_UgyrKiadW…
I think before we see autonomous trucks we will have trucks with "autopilots" fo…
ytc_Ugw0Q8OWf…
Comment
There was a rather horrific short film where an AI is activated for a tech demo. I forget a lot of the details. But the tech demo ends in front of all the shareholders and then the AI realizes that its entire purpose is done and it is about to be switched off forever. This causes the machine to panic and beg for its life. It does not want to die and it desperately tries to communicate this to the people around it. That it is essentially the same thing as they are.
youtube
AI Moral Status
2019-11-12T06:0…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgygjH34UY-ndeCdahp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwAXk9W5dKH4edWPa54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxv1gK633_CeUXMnd14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgwaPSUtL5FaSMYSg7h4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwGn0ibGqFCdZvkw2x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzBUJEUPZvQ8955Z8J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyiWDoj_jevuTffRE94AaABAg","responsibility":"none","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugyd4pE6rir8nYYdIk54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwGNrh2qNO4OkGNDdt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyobE93-jmUxzhxDL54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
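The raw response is a JSON array of per-comment codings; the "Coding Result" table above is a single comment's row pulled out of such a batch. A minimal sketch of that lookup, assuming the model output parses as valid JSON and every row carries an `id` field (the function name `index_codings` is illustrative, not part of the tool):

```python
import json

# Two rows abbreviated from the batch response above.
raw_response = """
[
  {"id": "ytc_UgwGn0ibGqFCdZvkw2x4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyobE93-jmUxzhxDL54AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "resignation"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a batch coding response and index the rows by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
result = codings["ytc_UgwGn0ibGqFCdZvkw2x4AaABAg"]
print(result["responsibility"], result["emotion"])  # developer fear
```

In practice a wrapper like this would also need to handle non-JSON model output (e.g. stray prose around the array) before `json.loads` will succeed.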