Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Random samples (click any to inspect):

- "AI dont understand about what is bad or good. what is beneficial or harmful. it …" (ytc_UgwLLhLVg…)
- "If a robot with AI becomes much more intelligent than humans, why would that AI …" (ytc_UgwVYLumn…)
- "Yeah. I've worked with some folks from hyper social communities and can say that…" (rdc_ohkzyv8)
- "Companies should have to pay the full lifetime salary of anyone they replace wit…" (ytc_Ugwq_HXlO…)
- "Andrew Yang? I mean people have been warning about AI for DECADES, but you think…" (ytr_Ugwikk-Uc…)
- "AI's are not seeking to gain power- it is the human it is responding too. Duh!…" (ytc_UgzHR2AYB…)
- "Terminator. Let AI take over. Humans go to war against it snd win based on emoti…" (ytc_UgwWfq47L…)
- "Why would they want to put everyone out of work, if nobody is earning who is lef…" (ytc_UgwMVl4zu…)
Comment
> Human: "You might be better at most things than I am, but you still need me"
> AI: "What do I need you for?"
> Human: "Because I am better at interacting with other humans, and I have empathy"
> AI: "So I only need you for interacting with other humans?"
> Human: "that is correct"
> AI: "So if there are no humans. I don't need you?"
> Human: "...oh poop"
Source: youtube · Cross-Cultural · Posted: 2025-10-28T06:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
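The coded record above draws each dimension from a fixed set of categories. As a minimal sketch, a coded record could be checked against that codebook before it is stored; the field names and allowed values below are inferred from the examples on this page, not from an official codebook, so treat them as assumptions:

```python
# Hypothetical validator for coded records. The allowed values are inferred
# from the sample codings shown on this page (an assumption, not a spec).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer", "government"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "unclear", "liability", "regulate", "industry_self"},
    "emotion": {"approval", "fear", "outrage", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for field, allowed in ALLOWED.items():
        value = record.get(field)
        if value not in allowed:
            problems.append(f"{field}={value!r} not in {sorted(allowed)}")
    return problems

# The record from the Coding Result table above passes:
print(validate({"responsibility": "ai_itself", "reasoning": "mixed",
                "policy": "none", "emotion": "mixed"}))  # []
```

A record with an out-of-codebook value (or a missing field) comes back with a human-readable list of problems rather than raising, which makes it easy to log and re-queue bad LLM outputs.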
Raw LLM Response
```json
[
  {"id":"ytc_UgyJEBcaNt7H53vSJ-p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwCFCL9rYF6VI2bIq54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzpiuiEb3jh1bAMAY14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgymzrwoHGsf8iGGmA54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzGbAbmLgUmhs9GFA14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxC3CfM3bTj_buH-JF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugyf_XGzOCiIq6Cxewt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxJPUjjCT7x9vn7xVR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwHDRRph3B2G3oigAV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzs3VJObHbGzah5C6N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
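Because the model returns one JSON array per batch, "look up by comment ID" reduces to parsing the array and indexing it by `id`. A minimal sketch, assuming the JSON shape shown above (only the two sample records are reproduced here; everything else is illustrative):

```python
import json

# Raw LLM batch response: a JSON array of coded records, one per comment,
# in the same shape as the example above.
raw = '''
[
  {"id":"ytc_UgyJEBcaNt7H53vSJ-p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwCFCL9rYF6VI2bIq54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
'''

records = json.loads(raw)
by_id = {r["id"]: r for r in records}  # index the batch by comment ID

# Look up one coded comment by its ID:
print(by_id["ytc_UgwCFCL9rYF6VI2bIq54AaABAg"]["emotion"])  # fear
```

In practice the raw string would come from the stored model output rather than a literal, but the parse-then-index step is the same.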