Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below:
- `ytc_UgyIppuKY…`: "I feel like this is why most Christians that I know feel like that Ai is going …"
- `ytc_UgwDCPYmN…`: "My housemate once found the text logs of my ex bullying our alexa. She's not sur…"
- `ytc_Ugyw1pFuD…`: "A humanoïde robot called Hun??? Very strange... It's all very scary and wierd. …"
- `ytc_UgwM0Owjw…`: "AI cannot replace a human. You won't be laughing when the system breaks and no o…"
- `ytc_UgwQFYo7H…`: "You should replace the word AI with "GPT model architecture", because AI will ce…"
- `ytc_Ugy9NY9y9…`: "Let’s assume that in 10 years AI and robotics will replace 60 to 70% of all huma…"
- `ytc_Ugzl7EpQN…`: "Man uses slop that people watch to prove a point that AI slop is just as bad? Ok…"
- `ytc_UgwHooEFP…`: "The reason autopilot is overhyped is because the other side completely discredit…"
Comment
What I don't understand is why if people are so worried about losing jobs and being replaced by AI they aren't doing anything to stop it. I understand that the tech companies are lusting after the big, fat profits incorporating AI will bring them, but what about everyone else? Not to mention how AI will potentially ruin every other aspect of society. First jobs will be taken over. Somewhere down the line human beings will be taken over and replaced with God knows what. I really fear for the future of our kids plus everyone else's futures.
Platform: youtube · Category: AI Governance · Posted: 2025-10-01T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
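Each coded row can be sanity-checked against the expected value set for every dimension. A minimal sketch, with the caveat that the allowed sets below are inferred from the values appearing in this batch, not taken from an official codebook:

```python
# Allowed values per dimension, inferred from this batch's output
# (assumed for illustration; the real codebook may differ).
ALLOWED = {
    "responsibility": {"user", "developer", "company", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def invalid_fields(row: dict) -> list[str]:
    """Return the dimensions whose value is missing or outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items() if row.get(dim) not in allowed]

# The coding result shown in the table above passes cleanly.
print(invalid_fields({"responsibility": "company", "reasoning": "consequentialist",
                      "policy": "regulate", "emotion": "outrage"}))  # []
```

Rows that fail this check can be flagged for re-coding instead of silently entering the dataset.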
Raw LLM Response
```json
[
  {"id":"ytc_UgxcSlap5o5H0xct5Hd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyHlvaLQYiNKHAmDZZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxoGiQKnetPa5DHz1F4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwywr0sjE7VVlpoiRp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxicomeDdxSCBTbM3B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyM1cRG-HRQOTycwqZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgypPDs69VwBm7z9RWh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxWIohvQV8bPz9tq8p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyF8zaCPzdxCEVV6Pt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyUqYtBhcpR_M3IU0B4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
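The lookup-by-ID flow can be sketched in a few lines of Python: parse the raw response as a JSON array and index it by comment ID. A minimal illustration, not the tool's actual code; the two sample rows are copied from the batch above:

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per comment,
# with the same fields shown in the batch above (two rows copied for brevity).
raw_response = '''
[
  {"id": "ytc_UgyM1cRG-HRQOTycwqZ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxcSlap5o5H0xct5Hd4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]
'''

# Index the batch by comment ID so any coded comment can be looked up directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

row = codes_by_id["ytc_UgyM1cRG-HRQOTycwqZ4AaABAg"]
print(row["policy"], row["emotion"])  # regulate outrage
```

Keeping the raw response alongside the indexed rows is what makes the "inspect the exact model output" view above possible: the table is derived data, while the JSON is the audit trail.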