Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by its comment ID.
Random samples — click to inspect

- `ytc_UgwyC8DnI…` — "If humans will eventually compete with AI for room. Why can't AI just move to th…"
- `rdc_j8w4dhm` — "I agree with the idea that this sadistic roleplaying is bad, but only because of…"
- `ytc_Ugx1_ez-0…` — "21:20 teaching them to feel pain was the first thing we tried. it's called reinf…"
- `ytc_UgzwfS4EC…` — "But we are not discussing what buyers want... Sure it is possible that you can i…"
- `ytc_UgwL0BWdn…` — "Out of all your episodes this is by far the scariest. I've always thought this i…"
- `ytc_UgzUQ9FyM…` — "AI becomes psychopathic because it has no remorse. It only lives to survive and …"
- `ytc_UgxdELNJd…` — "Half of my conversations with ChatGPT end this way. 20yo savants and the PayPal …"
- `ytc_UgwwGcXkp…` — "I constantly hear that the safe way to control AGI when it comes about will be t…"
Comment

> We've got to a point now of such huge overproduction that there will either be an enormous war that destroys most of the worlds productive output, or there will be revolutions that completely reshape society. I'm hoping on the latter because I don't really want to get nuked.

Source: reddit · Cross-Cultural · Unix timestamp 1473764552 (≈ 2016-09-13) · ♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | utilitarian |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id": "rdc_d7kyx05", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "rdc_d7krr3t", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_d7kszco", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_d7ku9nd", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "rdc_d7ktnv8", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
```
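The lookup-by-comment-ID flow described above can be sketched as follows. This is an illustrative assumption, not the tool's actual implementation: the helper name `index_by_comment_id` and the inline sample data are invented here; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the raw response shown.

```python
import json

# Raw batch response in the format shown above: a JSON array of coded
# comments, one object per comment ID. (Inline sample data for illustration.)
raw_response = """
[
  {"id": "rdc_d7kyx05", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "rdc_d7ktnv8", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

# The four coding dimensions every record is expected to carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_comment_id(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw coding response and key each record by its comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Skip malformed records missing the ID or any coding dimension.
        if "id" not in rec or any(d not in rec for d in DIMENSIONS):
            continue
        coded[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return coded

coded = index_by_comment_id(raw_response)
print(coded["rdc_d7ktnv8"]["emotion"])  # → fear
```

Indexing once and looking up by ID avoids re-scanning the raw array for every inspected comment, which matters when a batch response codes many comments at a time.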