Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "So Tesla autopilot it's useless, I could Fail catastrophically without a warning…" (ytc_UgzuuRwUw…)
- "I want to know if A.I is taking everything all the jobs what will Happen to the …" (ytc_Ugxo0XY-q…)
- "I think the only solution is to separate ourselves now and leave the Matrix by l…" (ytc_UgxpbbvyG…)
- "Boycott all companies replacing humans for AI. They don't care about us so why s…" (ytc_Ugx0vlyh8…)
- "Controversial take here. Senior lead FE here, ai has made our teams 2x more prod…" (ytc_UgzmtzENO…)
- "Those were the most ridiculous Doctor Who costumes. I love them. Also, regardi…" (ytc_UgxQxFonb…)
- "Self driving cars are so far off. Elon is a con man. Do not trust self driving t…" (ytc_UgyRSVtvY…)
- "So Stuart says we need AI to be human-centric and for it to define what we want …" (ytc_Ugx3d1sPe…)
Comment
The part about the call center agent getting bored is where I would have to disagree. The AI agent doesn't act out of emotion to put an end to the conversation; it just acts because it is trained to recognize "just chatting" convos. Also, it wouldn't get annoyed if it doesn't know what being annoyed means.
Source: youtube · Topic: AI Governance · Posted: 2026-01-08T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyxNXt8GC8vqpRyz6p4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwbmhQkDJxGqYZw0SB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxcbV-i9sLUyucckRp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxwPX_MYSRjreCrAiB4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyD_QaIQVOBwXsTaK14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwE6iiJHBbU9k8J16F4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzSh1y1AALQzh7GxhZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzKiU8-n0papzp4ZbR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx3L1gawHCykcIzrll4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwNCBqY9LTM5oQccEJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
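The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions shown in the table. A minimal sketch of how such a batch could be parsed and indexed by comment ID — note that the allowed value sets below are inferred only from the records visible in this sample, not from the tool's actual codebook, and the `validate_batch` helper is a hypothetical name:

```python
import json

# Allowed values per coding dimension. ASSUMPTION: these sets are inferred
# from this one sample batch; the real codebook may define more categories.
SCHEMA = {
    "responsibility": {"ai_itself", "user", "developer", "distributed", "none"},
    "reasoning": {"mixed", "consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "industry_self", "liability"},
    "emotion": {"indifference", "fear", "mixed", "approval", "resignation", "outrage"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment ID."""
    coded = {}
    for record in json.loads(raw):
        cid = record.get("id", "")
        if not cid.startswith("ytc_"):
            raise ValueError(f"unexpected comment ID: {cid!r}")
        for dim, allowed in SCHEMA.items():
            if record.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value for {dim}: {record.get(dim)!r}")
        # Keep only the known dimensions, dropping any extra keys.
        coded[cid] = {dim: record[dim] for dim in SCHEMA}
    return coded

raw = ('[{"id":"ytc_UgyxNXt8GC8vqpRyz6p4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"mixed","policy":"none","emotion":"indifference"}]')
batch = validate_batch(raw)
print(batch["ytc_UgyxNXt8GC8vqpRyz6p4AaABAg"]["emotion"])  # indifference
```

Indexing by ID is what makes the "inspect by comment ID" workflow cheap: a single parse of the batch, then constant-time lookups for each comment a reviewer opens.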