Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> There's a reason why these models are called Large Language Models and not Large Reasoning Models or Large Thinking Models. These models are only trained to be good In languages, not reasoning or thinking. They make good salesbots, but they can't think to save their lives.

Source: youtube · AI Responsibility · 2023-06-11T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzF5EcgYJ9F4oTAF7l4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyXqjdeunxkSq02cLZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz3DAITwKJDcAk-6GV4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_Ugwt80nqRCC8O4IIYMx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx7int5xfY9YJyPtmp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzA3ubUQBC4fPmDMJ54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxju3omli0ZJzfx2u94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyZyx6HuAQU2ekNNM14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz80jrcP-H2uDzvDEZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzXGWq9drkiPM0BRpx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]
```
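A raw response like the one above is a JSON array of per-comment records, one object per coded comment, keyed by the four dimensions shown in the Coding Result table. The sketch below shows how such a batch could be parsed and sanity-checked before use. The allowed value sets are inferred from the sample output in this section only; the actual codebook may define additional categories, and `parse_coded_batch` is a hypothetical helper name, not part of the tool.

```python
import json

# Allowed values inferred from the sample batch above; the real codebook
# may include more categories (an assumption, not the tool's specification).
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed", "unclear"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and reject records with unexpected values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"mixed"}]')
batch = parse_coded_batch(raw)
print(batch[0]["responsibility"])  # developer
```

Validating against a closed value set catches the most common failure mode of LLM-based coding: the model emitting a label outside the codebook, which would otherwise silently pollute downstream tallies.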