Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
We’ve learned nothing from the cautionary tales of War games, Terminator movies, 2001 A.I. artificial intelligence, I am robot. Someone thought up the concepts and considered the consequences and told the story. And yet here we are. When they started this technology they made the decision to not think about what could happen because of ambition selfishness and pride. Look at what we can do, who cares what may happen, we’ll cross that bridge when something happens. Smart people are stupid.
youtube
AI Governance
2025-07-25T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxCQpTtIyhmspKxf2V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzE30HAQluEyeGI76N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwSKdDu-d0ICscxxa14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzJEgqRWMfXhDW3_v94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugya8v2yMMt97FfcZ6B4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgySE7v1HeRT1kHCJJV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz5u-yAazoHemX9Ob94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzbL2un8iU8RthS7WF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwhT4v_y3p7DhfdMJ14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgyJYsIe1ixFEq2L9HN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
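Each raw response is a JSON array of per-comment code records, one object per comment ID, with the same four dimensions shown in the Coding Result table. Below is a minimal sketch of how such a payload could be parsed and sanity-checked before being stored; the allowed value sets are assumptions inferred from the examples above, not a confirmed schema:

```python
import json

# Allowed values per dimension — inferred from the codes visible in this
# section, NOT a confirmed codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "indifference", "approval", "resignation"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept only if it is a dict with an "id" field and every
    dimension holds one of the allowed values.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example: one valid record passes, an out-of-vocabulary value is dropped.
raw = '[{"id":"ytc_x","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"fear"}]'
print(len(validate_codes(raw)))  # 1
```

Filtering rather than raising keeps a single malformed entry from discarding an otherwise usable batch; rejected IDs can then be re-queued for coding.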