Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- You know I’m actually weirdly positive about it, cus all that Ai bros are doing … (ytc_Ugxv4z3hw…)
- "Autonomous weapons could save lives", "Human soldiers can miss the target." Sta… (ytc_Ugy7d9hEK…)
- I'm really struggling with using the word 'thinking' with the current LLM genera… (ytc_UgwspGQFa…)
- Okay I have am idea, hear me out. Remove the moderators that act that way, and w… (ytc_Ugw1qY_it…)
- that girl one that begged you not to turn her off sounds like a real woman...may… (ytc_Ugy4OOxRJ…)
- Let's not blame AI on this. This is a scapegoat. Corporate Greed is what is resp… (ytc_UgxKr3iNK…)
- I agree school system need a full revaluation we need more physical activity in … (ytc_UgxqPC60E…)
- The "diffusion of accountability" framing is more useful than the standard AI sa… (rdc_ohz2w3t)
Comment
Who’s going to want to watch a machine be a “better” person? How can you matter to something that doesn’t innately care? AI can live a million of your lives in a second but who will appreciate being with that? What can AI experience; nothing. Not on its greatest day can it have meaning. It has no spirit. It has no hope. It can’t eat or process food. It constantly requires a grid, data and updates. It could care less if the world is destroyed tomorrow. It’s the devil’s humanity.
Source: youtube · Cross-Cultural · 2025-10-15T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyEGssUIBjDkvZ5m894AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx4GLuCSL-P1h8flbV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy6AoKr2dGqb744ryx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwNOQtV9p0Ph8SMFZt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxnFGtg3o6KRyz-Bu94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxmzPxIGqnla4eabEB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwxN8waU2EXkBfDFxF4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxkPn-5XrKP5pigPQF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy0J_yEZlsSSnJS81l4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx0sCUof0EIMn-OOdZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
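The raw response is a JSON array with one record per comment, carrying the same four dimensions shown in the Coding Result table. A minimal sketch of how such a response could be parsed and validated (the `parse_codes` helper and its error handling are illustrative assumptions, not the actual pipeline code):

```python
import json

# The four coding dimensions that appear in both the result table
# and the raw LLM response above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM response into {comment_id: codes}.

    Raises ValueError if a record is missing its id or any dimension,
    so malformed model output fails loudly instead of silently.
    """
    records = json.loads(raw)
    codes: dict[str, dict[str, str]] = {}
    for rec in records:
        missing = [d for d in DIMENSIONS if d not in rec]
        if "id" not in rec or missing:
            raise ValueError(f"malformed record: {rec!r} (missing {missing})")
        codes[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return codes

# Example with a single record in the same shape as the response above
# (the id "ytc_x" is a placeholder, not a real comment id):
sample = ('[{"id":"ytc_x","responsibility":"company",'
          '"reasoning":"consequentialist","policy":"regulate",'
          '"emotion":"outrage"}]')
parsed = parse_codes(sample)
```

Keying the result by comment id makes the "Look up by comment ID" view a plain dictionary lookup, and validating every record up front means a truncated or partially coded batch is caught at parse time.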