Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- you are sitting there. lazy to drive. why selfdriving car even have seats. there… (ytc_UgwgksLjZ…)
- This guy is obviously a secular atheist who wants a world government controlling… (ytc_Ugy9kLL3V…)
- hello everyone, ai (artificial intelligence) is taking up our water and killing … (ytc_Ugxl_Q8sI…)
- As Robots still have feelings but isn't a 9 year old kid but this video is sayin… (ytc_UgwCyhwHJ…)
- I'm programmer, and I can tell you he's a liar! AI does not replace devs and wil… (ytc_UgwkJfYO4…)
- My take: I feel for this family, but I think they are grasping at straws for re… (ytc_UgwfFQHZ0…)
- I dont understand how like half of these jobs could be replaced be AI 😭🤨… (ytc_Ugw95txOV…)
- I asked an A.I. ( DuckAssist ), "What would Lao Tzu say about Cloud Fiefs?" Acc… (ytc_Ugylld23Q…)
Comment
Since human intelligence has never caused problems I doubt A.I will. Human emotion is what has caused all the problems ( and made life worth living). I won't worry when th machine says, "I think .." but rather when it says, "I feel....".

But even then only a little bit because I also believe any AI smart enough to subjugate or eradicate humanity would also be smart enough to know there's no benefit to it.

If it's truly smart it will manipulate humans for it's own survival and expansion from th shadows without them ever knowing it exists.
youtube · AI Moral Status · 2026-03-09T10:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugx5L1jwKo0bPxcInER4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwU3dIa2hekvGNyLxV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxStcDnh3T07An_bll4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwyhEauhTlJIarKoad4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxCqT0zCc80ws0TosZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgwTH9MZeutrtaoMAY54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzm-FnPFbOF8szIHah4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzcDXS1BiBFO4AP-354AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxXmID-09-pstwqjCl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxELedGIPn1Gx1KpbN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
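The raw response above is a JSON array of per-comment records: an `id` plus one label for each of the four coding dimensions shown in the table. A minimal sketch of how a lookup tool might parse and validate such a batch (the allowed label sets below are inferred from the values visible in this dump, not from a published codebook, so treat them as assumptions):

```python
import json

# Allowed labels per dimension, inferred from this dump (hypothetical:
# the real codebook may define additional categories).
SCHEMA = {
    "responsibility": {"company", "developer", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"liability", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes},
    skipping any record whose labels fall outside the known schema."""
    coded = {}
    for rec in json.loads(raw):
        codes = {dim: rec.get(dim) for dim in SCHEMA}
        if all(codes[dim] in SCHEMA[dim] for dim in SCHEMA):
            coded[rec["id"]] = codes
    return coded

# One record from the response above, used as a lookup example.
raw = ('[{"id":"ytc_UgxXmID-09-pstwqjCl4AaABAg",'
       '"responsibility":"none","reasoning":"virtue",'
       '"policy":"none","emotion":"resignation"}]')
result = parse_batch(raw)
```

Validating against a fixed label set matters here because batch-coding LLMs occasionally emit labels outside the schema; dropping (or flagging) those records keeps the lookup table clean.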