Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
I’ve asked this for so long, people worry about AI and robots taking our jobs… B…
ytc_Ugx3cXk-1…
lol lol. Let me laugh. I can’t believe human being are so stupid. He talks about…
ytc_Ugyg7mUfV…
😳 ...ever think, that the AI is... thinking for itself? ..but hiding that fact?…
ytc_Ugyw_EA5v…
The same thing happened to my friend. He didn’t send anything involving nudity e…
ytc_UgwIuS6yb…
This woman's life was unnecessarily lost. It's the Uber system at fault. Algor…
ytc_UgzQH8950…
13:04 actually its already happening right now, i work in AI ops for content mod…
ytc_UgwGYo1oL…
Ai is crafted to agree.
Ai isn't for those who know nothing about this and can…
ytc_Ugyr6kAOF…
As an engineer type and more of a logical creative, I have so much respect for a…
ytc_Ugz0Y_FnK…
Comment
Are you political types every going to realize that ALL CORPORATIONS DO WHATEVER MAXIMIZES PROFIT. It literally explains most of what is wrong in this world whether you are talking about AI, forever chemicals, social media and porn addiction, war, political corruption or anything else really. We built a system that is flat our REQUIRED to do what is most profitable regardless of whether that's good for actual people, social cohesion, the environment, or anything else. Meanwhile, we have so called "journalists" that basically never point that out. We shouldn't be surprised when companies do horrible things in the name of profit. It's quite literally baked into the system.
youtube
2025-10-30T18:4…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
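The coding result above follows a four-dimension schema (responsibility, reasoning, policy, emotion). A minimal validation sketch for one coded record, using only the values observed in this page's sample batch — the full codebook may define additional values, so the allowed sets below are assumptions:

```python
# Allowed values per dimension, inferred from the sample batch on this page;
# the actual codebook may include values not seen here (assumption).
SCHEMA = {
    "responsibility": {"company", "ai_itself", "user", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "mixed", "unclear"},
}

def validate(code: dict) -> list[str]:
    """Return a list of validation errors for one coded comment (empty if valid)."""
    errors = []
    if not code.get("id", "").startswith("ytc_"):
        errors.append("id missing or not a YouTube comment ID")
    for dim, allowed in SCHEMA.items():
        if code.get(dim) not in allowed:
            errors.append(f"{dim}: unexpected value {code.get(dim)!r}")
    return errors

example = {
    "id": "ytc_UgxksrJoBoKiVEnm7Wl4AaABAg",
    "responsibility": "company",
    "reasoning": "consequentialist",
    "policy": "regulate",
    "emotion": "outrage",
}
print(validate(example))  # []
```

A check like this catches malformed model output (a missing dimension or an out-of-vocabulary label) before it reaches the coded dataset.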
Raw LLM Response
[
{"id":"ytc_UgxtAV6Zj07wZ2cB1WN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx3wktZzqpxgXBjkiN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz47h6jell9PzcRMfB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzDvOUlySO0KwrkyR94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz6DzrUQrzV0jAlMrd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgywCOvJhAtXxRkECNZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxksrJoBoKiVEnm7Wl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw9NweYqokNQ67y8E14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwqBy0EcuaDj5jVGfp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwZk9yURRtRPYF9BLZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]