Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "About the doomerism of part 1. The thing is that the best counter to flooding ma…" (ytc_UgwpS4ISv…)
- "The problem isn't so much that we distrust politicians and what we see, no less …" (ytc_Ugy-W1CN6…)
- "Can someone guide?... what basics are needed to get the minimum understanding of…" (ytc_UgwqMt463…)
- "A robot that has the limited wants that humans have is a robot bot worth investi…" (ytc_UgwwRIFkS…)
- "a way to try to know its ai has been known for atleast a decade now to use a dar…" (ytc_UgwnGU2xg…)
- "Clankers or AI as you call it could only be thinking about itself. Put yourself …" (ytc_UgxOaWGc3…)
- "heya, i use ai and want to give a statement. i think ai art is okay IF its prope…" (ytc_UgylsBvhd…)
- "Engine is fine no damage whatsoever and it has 65k on it. I'm aware they do junk…" (rdc_o7ww7v8)
Comment (youtube, 2026-01-10T17:0…, ♥ 1):

> Thinking you can put guardrails on AI is like an early primate thinking they can put guardrails on eventual humans. If early primates wanted to remain kings of the food chain (at least toward the top) they'd have to prevent humans from ever existing. Difference is, unlike some early primates, we, humans, have an actual choice, and the arrogance and naivety in this example shows the choice that will lead to the end of humans on earth.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwPQoOzO8nLtvEro2J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwNSLMRBpztF1DgvWh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwRVZZk_YfX3Vh_5ht4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyrM6efe1aah1qMH154AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyoL6mt32OFtuKAhPJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxUnNRH6r6GVH1W2PV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwdJtpcNMGzgswgKYN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx1tHwmmD3uPXxgRXN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxkEXKLcVWaGuUcxsd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxixEnkEXerrqlIihp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
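The raw response above is a JSON array with one object per coded comment, carrying the same four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of turning one such response into a lookup table keyed by comment ID might look like the following; the function name and the `"unclear"` default for missing keys are illustrative assumptions, not part of the actual pipeline:

```python
import json

# Illustrative raw LLM response: one entry, mirroring the array format above.
raw_response = '''
[
 {"id": "ytc_UgxUnNRH6r6GVH1W2PV4AaABAg", "responsibility": "ai_itself",
  "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
'''

# The four coding dimensions reported for each comment.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding_response(text: str) -> dict:
    """Map each comment ID to its {dimension: value} coding.

    Assumes `text` is a JSON array of objects, each with an "id" key.
    Missing dimensions default to "unclear" (an assumption for this sketch).
    """
    coded = {}
    for row in json.loads(text):
        coded[row["id"]] = {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
    return coded

coded = parse_coding_response(raw_response)
print(coded["ytc_UgxUnNRH6r6GVH1W2PV4AaABAg"]["emotion"])  # fear
```

Keying by comment ID makes the "look up by comment ID" inspection above a single dictionary access once the raw responses are parsed.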