Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgwFy6Mvv…`: "I don't know why they always want to make robots look human. I prefer they make …"
- `ytc_UgwJ7jRLb…`: "You both sounded either reserved or maybe speechless at the concept of there bei…"
- `ytc_UgxrfQQ4U…`: "I completely disagree about this guy downplaying others concerns because theyre …"
- `ytc_Ugwy-Xxrc…`: "Learn how to leverage AI to make yourself more valuable. Those people will alway…"
- `ytc_UgwVKjUWZ…`: "As a forever DM I absolutely despise AI art because I just cant find good backgr…"
- `ytc_UgzKljqWt…`: "Great video but two things we overhype emotion etc mainly just chemicals resulti…"
- `ytc_UgyZHvXVr…`: "Sydney was replaced with copilot I believe she is still there dormant cause I be…"
- `ytc_Ugy5vMQkT…`: "I implore you to look up the AI Box Experiment. You could say it's flawed, but t…"
Comment

> The user also has to be intelligent. You can't ask a human nor a LLM "What's wrong with me?" and expect a correct answer as it is severely missing context. Yet, I have noticed that LLM still tries to give you an answer when it should in fact ask a long series of clarifying questions to generate a correct answer.
>
> People are real idiots and think that Gooling "where are my keys" is going to give them an answer and when it fails to find their house keys off the web index, they call Google "stupid".

Source: youtube · 2026-01-21T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwgiDHcpIe7EOE44Fd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy4YV7zvuN8S0qlMv14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyYSu47-_ZvBUHwOJp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz5pRBBIVYKd-qbv_t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz3pY50R0a7Vd7-O4J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzYTkx7pv7ic0ulhOl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzOWE7-D7rNSpckpD94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgzgvL2b6LE0_0j5mdV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy9VQ0TzbYBcAFodot4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgySY6I0fV96WWo88tp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
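The lookup-by-ID behavior shown above can be sketched in a few lines: parse the model's JSON array, index the records by comment ID, and check each record against the dimension vocabularies. This is a minimal sketch, not the tool's actual implementation; the allowed value sets below are assumed from the labels visible in this one output, and the real codebook may define more categories. The two records in `raw` are copied from the response above.

```python
import json

# Two records copied from the raw model output above.
raw = """
[
  {"id":"ytc_UgzYTkx7pv7ic0ulhOl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy9VQ0TzbYBcAFodot4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
"""

# Assumed vocabularies, inferred only from values visible in this output.
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself"},
    "reasoning": {"mixed", "deontological", "consequentialist"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"indifference", "fear", "approval", "outrage", "resignation"},
}

def index_by_id(raw_json: str) -> dict:
    """Parse the model's JSON array and index records by comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw_json)}

def validate(rec: dict) -> list[str]:
    """Return the dimensions whose value falls outside the assumed vocabulary."""
    return [dim for dim, ok in ALLOWED.items() if rec.get(dim) not in ok]

codes = index_by_id(raw)
rec = codes["ytc_UgzYTkx7pv7ic0ulhOl4AaABAg"]
print(rec["responsibility"], rec["emotion"])  # user outrage
print(validate(rec))                          # [] — all four dimensions in-vocabulary
```

Indexing by ID rather than scanning the array makes repeated lookups O(1), which matters when auditing many coded comments against one batched response.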