Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Warns us about the dangers of ai and go on to create a direct mainline to our br…" (ytc_UgyGxcz1D…)
- "Lol... the girl who stole her channel name from pokemon complain about AI gettin…" (ytc_UgxjmZ8ke…)
- "> But people are also heavily using AI at work... Mandated from the top lol …" (rdc_oby1bh7)
- "im so glad lavendertowne is speaking up against ai also, to help filter out ai t…" (ytc_UgwqsgVHD…)
- "I did not believe the other robots those are people dressed up in robot costumes…" (ytc_Ugzk7rDDn…)
- ""I'm 77 and I don't want to think about the horrors my generation have inflicted…" (ytc_Ugyd7-G37…)
- "Its like some terrible SNL skit. Police sitting around a computer making an ai c…" (ytc_Ugw4TqoCP…)
- "I'm waiting for HarmonyCloak to come out before I upload ANY MORE mfin music. De…" (ytr_Ugx5dQPIL…)
Comment
@NotTheEnd7766 AI research is killing people RIGHT NOW. It's dirtying their air, giving their bosses an excuse to fire them, and making their media and information diets worse. We don't need to be worrying about some hypothetical far future, we need to be holding these companies accountable for their behavior now.
AI needs regulation but this is sure as hell not it. It's a distraction promoted by a nonprofit staffed by AI industry employees. "What if the AI wanted to kill you" is a question intended to stop you from thinking about the fact that they are currently, very slowly, killing you right now.
Platform: youtube · Topic: AI Governance · Posted: 2025-08-27T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_Ugx5YCRGCoCkjdOM2m14AaABAg.AMId3fhlf7CAMO6veh1ih0","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytr_Ugx62XveXjdXxjqCsVp4AaABAg.AMIcmnCICcDAMMdvHjlh23","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwkpDcIxweX1zW-J2h4AaABAg.AMIbIuqtBGuAMIdcDqIvkS","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxPJj4BpTqnkrE_nO54AaABAg.AMIZAzLJO7LAMN_UFOapNE","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwbER0RFf0wFJX3rAR4AaABAg.AMIXJ4MQKW8AMIYUakQJOq","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytr_UgwbER0RFf0wFJX3rAR4AaABAg.AMIXJ4MQKW8AMIqA8QeEHG","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgwbER0RFf0wFJX3rAR4AaABAg.AMIXJ4MQKW8AMIxtkGAAk4","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgwbER0RFf0wFJX3rAR4AaABAg.AMIXJ4MQKW8AMIyqskZDn4","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwbER0RFf0wFJX3rAR4AaABAg.AMIXJ4MQKW8AMKRlvcv37B","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgwOGSW5jAljACTEphh4AaABAg.AMIUm4ROxlIAMK7iytDiE1","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]