Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Did they not need to ask your permission to make a chat bot of you?…
ytc_UgwC8OTPM…
Everything can never truly be automated, whatever new tech is introduced will al…
ytc_UgzyEfqfN…
No teachers to guide, only to cheer the students. The fact that it was a teacher…
ytc_UgzhUE9Tc…
It's not as if something like this is revolutionary. What we see are are "out of…
ytc_UgzDAKPmt…
The really scary thing is not the (imaginary) potential that digital software sy…
ytc_UgyFAdbHb…
Nobody is trying to take away AI art.
The point being made is that:
You don't o…
ytr_UgxWzpvDa…
AI does all Jobs ->
People are poor or have no money at All ->
Who tf gonna Bu…
ytc_UgwtQP07a…
"See boys, people think they want freedom.. but what they really want is order..…
ytc_UgxTXPPcG…
Comment
We have more options for destroying humanity than ever before. Amazing, isn't it? We can now choose between nuclear war, climate catastrophe, pandemics, unlabeled genetic engineering in plants, and now even AI that places itself above us. Actually, we wouldn't even need any of that to wipe ourselves out, we can manage that without AI, but it would certainly be much more fun with AI. What would our ancestors think about that? What would the ancient Greek philosophers say? I think they would say—turn back before it's too late—or they would laugh themselves to death, even though they're already dead. That could cause additional problems in the time continuum, unthinkable. ;-)
youtube
2025-12-06T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
 {"id":"ytc_UgzCX0-xmR8UNch4v214AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgxZGa02IcH-J2PvWEV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgyizSSelWkX-3DBUSp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UgzBk5ZDYxK6--WUYqJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgygQOYm7T8WsPMCLRd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgwPvEzL0TAT55aGqxt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgzyUu5A2zQpXs549OB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgzMwBazFTJ6Vp0Er6F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgwpXv43-tJNqn7a5oB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
 {"id":"ytc_Ugz-iSiR2jH-v0zUv1l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
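A raw response like the one above needs validation before the codings are trusted, since the model can emit malformed JSON (note the stray `)` in place of the closing `]`) or out-of-schema category values. Below is a minimal validation sketch; the allowed value sets are inferred only from the values visible in this sample response, not from the project's actual codebook, so treat them as placeholder assumptions.

```python
import json

# Allowed values per dimension — inferred from the sample response above;
# the real codebook may define additional categories.
SCHEMA = {
    "responsibility": {"none", "developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "unclear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only in-schema records."""
    records = json.loads(raw)  # raises ValueError on malformed JSON
    valid = []
    for rec in records:
        # Every coded dimension must hold a known category value.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = '[{"id":"ytc_UgzCX0-xmR8UNch4v214AaABAg","responsibility":"none",' \
      '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]'
print(len(parse_codings(raw)))  # → 1
```

Records that fail validation are dropped rather than repaired here; a production pipeline would more likely log them for re-prompting so no comment ID goes uncoded.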