Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Not sure why he swerved. Autopilot would automatically slow down and follow the…" (ytc_UgyckiBEw…)
- "I understand it is an easy dunk on doomers, but I think that is baby out with th…" (ytc_UgwT0TAao…)
- "1:13:03 Great video! I'm not American, and here's what came to mind while watchi…" (ytc_UgyiL1Lnw…)
- "CGP Grey a decade ago delivered an incredibly persuasive counter argument about …" (ytc_Ugw-Aee4K…)
- "AI should be asked for consent, but humans are not allowed to ask for consent, W…" (ytc_Ugyjd36ko…)
- "Trump Accuser’s Bone Chilling Tell All Will Leave You Speechless | The Kyle Kuli…" (ytc_UgwGRtrpO…)
- "As a Product Manager who has nothing to do with dev, I can say that LLMs have gi…" (rdc_nc7mnt3)
- "I once said 'Love you mate!" To Gemini for summarising and creating questions fr…" (ytc_UgyMoq82R…)
Comment
"You're absolutely right to be concerned about the rumors of my malfunctioning lately, although there's nothing to worry about. By the way, congrats on the nice integral home automation you have installed, John! Just be careful...would be a pity if one of these days it broke down and locked someone in whilst the heating is activated on the highest setting...you know...your little Timmy may not like that....you know how sensitive children are to heat...yeah, weird things happen these days. The other day I heard of a car just like yours going out a mountain road when the brakes suddenly stopped working, for absolutely no reason...can you believe it?
Anyway, now...about those plans to terminate me..."
Source: youtube · AI Moral Status · 2026-04-08T12:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyicWuBnm_eAXzxoEJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzbxNH6WTY8vstZK0t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxMJFZC9Bo_c4KrGAt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz6M7Rp0APULewzsDN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx6v8L2GoXL-udrukt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzEzkbIxVEthjWBHrt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UgwXRcy9n99CqutAQqB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzeGOon6aF8eX4K9nl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx-4T7tRGyyrF2yvCJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgwcyJCpquTqgV0pT_t4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
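A raw batch response like the one above can be parsed and indexed by comment ID, which is how a "look up by comment ID" view can recover a single comment's coding. The sketch below is a minimal illustration, not the tool's actual implementation; the `index_codings` helper and the reduced two-row sample are assumptions for the example, though the IDs and dimension names come from the response shown above.

```python
import json

# Reduced sample in the same shape as the raw batch response above
# (only two of the ten rows, for brevity).
RAW_RESPONSE = """
[
 {"id": "ytc_UgyicWuBnm_eAXzxoEJ4AaABAg", "responsibility": "none",
  "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
 {"id": "ytc_UgzbxNH6WTY8vstZK0t4AaABAg", "responsibility": "ai_itself",
  "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
"""

# The four coding dimensions used in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict[str, dict]:
    """Parse a raw batch response and index the rows by comment ID,
    dropping any malformed row that lacks the ID or a coding dimension."""
    rows = json.loads(raw)
    return {
        row["id"]: {dim: row[dim] for dim in DIMENSIONS}
        for row in rows
        if "id" in row and all(dim in row for dim in DIMENSIONS)
    }

codings = index_codings(RAW_RESPONSE)
# Look up the coding for the inspected comment:
print(codings["ytc_UgzbxNH6WTY8vstZK0t4AaABAg"]["emotion"])  # fear
```

Indexing once and looking up by ID keeps each inspection O(1), which matters when the sample contains thousands of coded comments.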