Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below:

- "Yes, I found that really interesting too! Why would an AI model be uncomfortable…" (ytr_UgyUWwn45…)
- "@laurentiuvladutmanea actually even algorithms can be non-deterministic… And n…" (ytr_UgylRiEX-…)
- "Doenst seem to be fake. The camera is focusing in different objects until the le…" (ytc_Ugzj3Tekb…)
- "lol i like you hank but this guy is a moron who clearly doesn't work with comput…" (ytc_UgyMHE8XQ…)
- "i see those that who own the AI that works for the government as a small group …" (ytc_UgxLlMhWI…)
- "Rumor has it, that there are some form of kill-switches in solar panel equipment…" (ytc_UgzcnLFrv…)
- "The idea that "AI may take our jobs, but go find a new job that AI can't take" *…" (ytc_UgxCHXgRx…)
- "no because i actually did this a couple of times but with direct questions and a…" (ytr_UgxvQ8G9x…)
Comment
<sigh>. Let me get this straight: The fox is warning the chickens there's danger outside the hen house? Oh no! Maybe we should put the fox in charge? Can they save us?
Wait.... did he say he didn't know how the AI works? So what do you do all day? Maybe nobody should be interviewing this guy.
youtube · AI Moral Status · 2025-06-04T15:5… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwYaKPdm7DaP1O_LbR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgySWay_RfiWa0pNdhB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgyWZI0AE2DOR66VcZx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyLG6MIGgVvaGhvg1d4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzzISH_4wgJdqDy6814AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxCOML_yw6tpD0Iu5V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxql0fd7lvcuKnCiad4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyUDl9l8fxnfpQbbpJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzZb3QmDW1cB-e0OKJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyowNMHUk8ZIOlch3V4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"mixed"}
]
```
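A response in this shape can be validated before the codes are stored. The sketch below parses a raw LLM response and flags any record whose value falls outside the code vocabulary. The allowed value sets are an assumption, inferred only from the codes visible on this page; the project's full codebook may define more.

```python
import json

# Code values observed in the sample response above.
# ASSUMPTION: inferred from this page only, not the full codebook.
ALLOWED = {
    "responsibility": {"user", "developer", "ai_itself", "none", "unclear"},
    "reasoning": {"virtue", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "industry_self", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "approval",
                "resignation", "mixed"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and list any out-of-vocabulary codes."""
    records = json.loads(raw)  # raises ValueError on malformed JSON
    problems = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)  # None if the dimension is missing
            if value not in allowed:
                problems.append(
                    {"id": rec.get("id"), "dimension": dim, "value": value}
                )
    return problems

raw = ('[{"id":"ytc_x","responsibility":"user","reasoning":"virtue",'
       '"policy":"none","emotion":"outrage"}]')
print(validate_coding(raw))  # → []
```

An empty list means every record uses only known codes; each problem entry names the comment ID, the offending dimension, and the unexpected value, which makes it easy to send just those records back for re-coding.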