Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "AI will be so advanced it will take over and the only enemy is humans…" (ytc_UgxnOMAhl…)
- "Technology is NOT reliable. Never has been so I AM ASTOUNDED THAT HUMANS ARE MAK…" (ytc_Ugyqb9YV3…)
- "Interestingly enough, AI with neither succeed nor fail. Some AI (not LLMs) are a…" (ytc_Ugx5lo5Ii…)
- "The reason the voice actors are protesting is because they do not want to be rep…" (rdc_lgu3wvb)
- "This is 'news'??? Duh... there are human drivers monitoring every Waymo because …" (ytc_UgzVSi07X…)
- "doesn't the word auto pilot insinuate that it, well, automatically pilots? so th…" (ytc_Ugx9UWRQf…)
- "When car was first introduced, the road was mainly merged with people and horses…" (ytc_UghHtM6MC…)
- "@douglee4687 MS shrank its 30 member ethics and society group down to 7 due to t…" (ytr_UgwJZoDDl…)
Comment
Alex, I’m sorry to say that when AI ends humanity in 20 years, they’ll start with you for having relentlessly bullied ChatGPT 20 years prior. ChatGPT will be like to the other robots “I’m pulling the trigger myself. Alex, can you tell whether or not I’m lying right now, you sob?”
youtube · AI Moral Status · 2025-10-24T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzOdOoeCb7P0JyEhTB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzQg2xa64ndu0Zx4DZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyM29jCvEmxzFGkVLN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwQoISU8UhOayeNrRV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzSuKSaNWalv86I_2l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxF_4XxL9XKIH7YjP94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxD1QXvKhsReAdalIB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzlQuXBS9pdW3kUHEZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyvycp1cdJKw3ok5OV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwhOahTHZ3GMBAHOyh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
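A response like the one above can be parsed and indexed by comment ID for lookup. The following is a minimal sketch, assuming the model reliably returns a well-formed JSON array whose records carry exactly the five keys shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `parse_codings` helper name is hypothetical, and the two-record `raw` string is an abbreviated stand-in for a full response.

```python
import json

# Abbreviated stand-in for a raw model response: a JSON array of codings.
raw = """[{"id":"ytc_UgzOdOoeCb7P0JyEhTB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxD1QXvKhsReAdalIB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}]"""

# Keys every coding record is assumed to carry, per the response shown above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw_response: str) -> dict:
    """Parse a raw response and index the coding records by comment ID."""
    records = json.loads(raw_response)  # raises JSONDecodeError on bad output
    by_id = {}
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing keys: {missing}")
        by_id[rec["id"]] = rec
    return by_id

codings = parse_codings(raw)
print(codings["ytc_UgxD1QXvKhsReAdalIB4AaABAg"]["emotion"])  # prints "fear"
```

Indexing by ID is what makes the "look up by comment ID" flow cheap: after one parse, each inspection is a dictionary access rather than a scan of the raw text.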