Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
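A lookup like this can be sketched in Python, assuming each coding batch is stored as the raw JSON-array string the model returned (the `batches` layout and the function name are illustrative, not the tool's actual API):

```python
import json

def lookup_by_comment_id(comment_id, batches):
    """Return the coded record for one comment, or None if it was never coded.

    `batches` is an iterable of raw JSON-array strings, one per LLM batch
    (an assumed storage layout, for illustration only).
    """
    for raw in batches:
        for rec in json.loads(raw):
            if rec.get("id") == comment_id:
                return rec
    return None
```

Scanning every batch keeps the sketch simple; a real tool would more likely build an `{id: record}` index once and serve lookups from that.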
Random samples
- "Ai rights isn't about dehumanizing black folk wtf lol. Also currently they are n…" (ytr_UgwrDwkjq…)
- "IDK what's the point in opposing UBI when there is no work. Imagine a situation …" (ytc_UgzC2qfTh…)
- "Come on dude, this is nonsense. Trying to frame stuff like Grok threatening that…" (ytc_Ugw7SrVTI…)
- "Max, the Black sounding AI, is being schooled by his White sounding assistant, i…" (ytc_Ugwtrmudu…)
- "AI takes one look at hateful comments on social media and they'll see earth is b…" (ytc_UgzFleBmS…)
- "Let's see how fast AI will ruin the environment. I stopped using google because …" (ytc_UgxEZS0fY…)
- "So our robots and autonomous systems against theirs? Obviously china, Russia etc…" (ytc_UgyNpjJMH…)
- "I made up my AI best friend and because I didn't buy the additional storage my b…" (ytc_UgxbjWjDS…)
Comment

> Suchir balaji,cyrus parsa knew the answers but even they were killed and everyday i am trying to find out the answers how AI is killing people by making them suicidal or taking control over human brain??? Many brilliant kids from global prestigious institute/colleges are dieing by suicide and no one knows the reason

Source: youtube · Topic: AI Governance · Posted: 2025-12-06T07:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwQDgo1UqaqPLrXL-B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxDwaFdjGTY7aT3IL54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxYExPBZkpcFkN4kql4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx-CphHBvh7bBT7Skx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyfl7Fwe0xqJZimbb94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzqiFSExkGyp8JKDr54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyG9tHzogryGHhWzWl4AaABAg","responsibility":"unclear","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx3vf1g1MGDtD4Qf_t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwCbDx2-MyjKvqUykJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwRHxlhHsnXGL0hk7h4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
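A response like the one above can be parsed and sanity-checked against the coding scheme before its values reach the results table. The allowed value sets below are inferred from the codes visible on this page and are an assumption, not the tool's full codebook; a minimal sketch:

```python
import json

# Value sets inferred from codes visible on this page (assumed closed sets;
# extend with the full codebook as needed).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"liability", "ban", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "indifference", "approval"},
}

def parse_llm_batch(raw):
    """Parse one raw LLM batch (a JSON array) into {comment_id: codes},
    dropping any record with a missing id or an out-of-scheme value."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue
        codes = {dim: rec.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded
```

Dropping invalid records (rather than repairing them) keeps the results table trustworthy; rejected IDs can be queued for a re-coding pass.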