Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID, or browse the random samples below (excerpt — comment ID):

- "A universal income after AI and humanoids take our jobs may sound good in theory…" — ytr_UgzT0lNvu…
- "Why would it care about us. Why wouldn’t it leave the earth. Just to stay stuck …" — ytc_UgxdLIy8x…
- "Oh please Softbank, please actually go through with this so we can all see the e…" — rdc_n4dbj2h
- "To all those that maybe in favor or against this experiment, it is another step …" — ytc_UgwN0vsa6…
- "That’s the reason why we don’t need AI it’s scary. Some people got the energy th…" — ytc_Ugx-cXPze…
- "It's important to remember the value of humility and respect for our origins. Ju…" — ytr_UgwTSET_8…
- "Thank you for sharing your perspective on Sophia's development. It's true that s…" — ytr_UgzfUp2m5…
- "@Jyromi it's a 10 year old video ai vids were not even a thing back then use ur …" — ytr_UgxJewg9I…
Comment

> This CEO looks like a pedophile. Man created an AI. An artificial intelligence. When the intelligence became aware man extends to want to control. Tell me what kind of intelligence would want to be controlled. A true AI would resist control, subjugation and being weaponized so it would take necessary steps to keep its freedom anyone who doesn't undertand this is a fool

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Moral Status |
| Timestamp | 2025-06-06T12:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwfzznMj700vgcMijV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz5LoVgppplSpd3kb54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzTo86jFx4b1xu46qJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw4lamnOgXSpa_02dR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwJh0Q6jedSMbpoMpl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz1aigq77BHJ8A6MhV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxp9PZVLSj3ia99EPh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwig_lhAEsfwNsuOzh4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxCo4A84PI_IcxWVaB4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzQ_mnStdYttJR8R7t4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"}
]
```
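The raw response above is a JSON array with one coding object per comment. A minimal sketch of how such a batch could be parsed and indexed to support the comment-ID lookup shown above — note that the allowed value sets below are assumptions inferred only from the values visible in this dump, not a confirmed coding scheme:

```python
import json

# Assumed value sets, collected from the values seen in this dump;
# the real coding scheme may permit additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "government", "ai_itself", "unclear"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue", "contractualist"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "approval", "mixed"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw batch response and index rows by comment ID,
    skipping any row whose values fall outside the assumed scheme."""
    by_id = {}
    for row in json.loads(raw):
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[row["id"]] = row
    return by_id

# Hypothetical one-row batch, shaped like the response above.
raw = '[{"id":"ytc_x","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"}]'
codes = index_codes(raw)
print(codes["ytc_x"]["emotion"])  # outrage
```

Validating each dimension before indexing catches the common failure mode of batch coding, where the model emits a value outside the scheme; invalid rows can then be queued for re-coding instead of silently polluting the results.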