Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "ebikes are actually a practical solution to the poor designs of modern cities mo…" (`ytc_Ugws6gDCI…`)
- "I think when AI takes initiative to do something of its own without being told t…" (`ytc_UgzXjTgkS…`)
- "@phat-kidyes most ways you make money have nothing to your skills and even jobs…" (`ytr_Ugwn-Qg6P…`)
- "Honestly humanity sucks… we're constantly killing each other anyway. If AI kills…" (`ytc_Ugxu4b0il…`)
- "Gee, how convenient. \"It's not my fault that I leaked conficential data! I'm sim…" (`rdc_cjp249h`)
- "@41-Haiku The \"leading experts\" would be who? I take it you do not mean the prog…" (`ytr_Ugxf4oPmh…`)
- "Of course NOW MAGA wants to regulate corporations. If you don't want AI in your …" (`ytc_UgwiYRrB5…`)
- "That's the point ai bros miss: it was always possible, but in our era is as easy…" (`ytr_UgysFuPc1…`)
Comment

> I love history, and history tells me that humans always make mistakes, even the greatest people make wrong decisions. So unless we establish a symbiotic relationship between humans and AI, and have a better goal, a madman in his basement could create a Pandora's box that ends humanity! Why is there only a little over ten thousand years of clearly defined human civilization history on Earth? Is it because each certain point in time triggers some kind of rewrite of Earth?

| Platform | Topic | Timestamp |
|---|---|---|
| youtube | AI Governance | 2025-11-27T12:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy56i3yCvIS1H-iQeF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz1tWrtiZh4KkuVGtZ4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzEOJkDfdoFKcdAyGx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwfqBlH55FsMXp08YN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwZWrR4HhsShzB4iFd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxZ0lKO7zFNfw7ArY54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgybKGti21A2jgR423R4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx5AFYoHo6He3wwdlZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwuAEPdg1iTmQYX45p4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzQyI4PKDnhc11i59N4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
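The lookup-by-comment-ID flow above can be sketched in Python. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the raw response shown; the helper function and the truncated sample row are illustrative, not part of the tool's actual implementation.

```python
import json

# One row reproduced verbatim from the raw LLM response above.
raw_response = """[
  {"id": "ytc_UgzEOJkDfdoFKcdAyGx4AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]"""

def index_by_comment_id(raw: str) -> dict:
    """Parse the model output (a JSON array of coded comments)
    and index each coded row by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

coded = index_by_comment_id(raw_response)
row = coded["ytc_UgzEOJkDfdoFKcdAyGx4AaABAg"]
print(row["policy"])  # → regulate
```

With the response indexed this way, the "Coding Result" table for any comment is just a dictionary lookup on its ID.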