Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a record by its comment ID.
Random samples
- “Sam looks like ai in this one / I don’t really know why, I just have a feeing…” (ytc_Ugwb7JGYm…)
- “Unpopular opinion: It's because they're already working on it and they don't wan…” (rdc_dwvp4w7)
- “People used to think it was better not to say anything bad about god in case he …” (ytc_UgyFh6fRJ…)
- “AI should be asked for consent, but humans are not allowed to ask for consent, W…” (ytc_Ugyjd36ko…)
- “I love the human-aligned shit like ‘then this AI went crazy!’ Or ‘it was super e…” (ytc_Ugz841FN1…)
- “The AI with this, must reach AGI to be fully conscious, this is emulectual intel…” (ytc_UgxNlkdEm…)
- “DONT LET CHRIS AND CORY NEAR THE TIFA AI OR DAVE NEAR THE MIKU AI…” (ytc_UgzWo6mRw…)
- “I miss the early days of AI stuff where people made their own datasets instead o…” (ytc_UgyIjhsHV…)
Comment

> This is so new and exciting. I hope Tesla and Waymo both do great things.

Source: youtube · Posted: 2025-06-25T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugx3LFVqMga3fgERjCx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxfLkks0rBOYiaZW7x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxBkXCvSvH6e_P4XrF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx61uPLeMljxl6Y8dZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxg3PEenEYQyUwsA7J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxdsPpveQhmvPAY19t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgztGtpnl6ShR3V8Kfl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxpjG_spMBCFwi7E1t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzOqBnstOPAMnBqEUt4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxBU7hU6lRW2CDP3x54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
```
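The lookup-by-ID flow can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: `index_codings` and `EXPECTED_KEYS` are hypothetical names, and `raw_response` holds two entries copied from the response above; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the raw output itself.

```python
import json

# Raw model output: a JSON array of per-comment codings.
# Two entries copied from the response above, for illustration.
raw_response = """[
  {"id": "ytc_Ugx61uPLeMljxl6Y8dZ4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxpjG_spMBCFwi7E1t4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

# Every well-formed coding entry should carry exactly these fields.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index the codings by comment ID,
    skipping entries that do not match the expected schema."""
    codings = {}
    for entry in json.loads(raw):
        # dict.keys() is a set-like view, so <= checks subset membership.
        if isinstance(entry, dict) and EXPECTED_KEYS <= entry.keys():
            codings[entry["id"]] = entry
    return codings

by_id = index_codings(raw_response)
print(by_id["ytc_Ugx61uPLeMljxl6Y8dZ4AaABAg"]["emotion"])  # approval
```

Indexing by ID makes the "look up by comment ID" path a constant-time dictionary access, and the schema check drops malformed entries rather than letting them surface later as missing-key errors.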