## Raw LLM Responses
Inspect the exact model output for any coded comment.
### Comment
On your point about cost, in the programming world we have seen the numbers as far as the current technology is concerned. The best case scenario is Anyone getting paid more money than a CEO should be scared. The worst case scenario is firing someone who works for 6 figures a year for an AI that'll require that you pay 6 figures in 3 months to get worse results slower and then you'll have to rehire someone to fix the absolute state of slop that it created. These models would have to get 1000 times better or so to start replacing people en masse. So are they really going to get 1000 times the investment? 1000 times the GPUs? 1000 times the fresh content? You know it's bad when there's a whole company dedicated to creating data for AIs to work with.
Source: youtube · Video: Viral AI Reaction · Posted: 2025-04-01T10:1…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
### Raw LLM Response
```json
[
  {"id":"ytc_Ugx7cufSxV_GfzagbiF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx_LV0GBfkOA4MBVvx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"resignation"},
  {"id":"ytc_UgztnZQm8FUtOmrnfsF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzpVimlUuK0oe9qk2B4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzzRfFAvGLNZ26OcWZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgysazV1l4h9jkChE9N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyfFvK2wrrL_9JtjiJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwyUlNHB3xstBmgZJ54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwEdpiyAnslMVwlRql4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz41xi31L8WsxSMhqt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
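
As a minimal sketch of how a response like the one above can be inspected programmatically, the following parses the JSON array and indexes the codings by comment ID so that any coded comment can be looked up directly. The response string and the variable names here are illustrative; the ID used for the lookup is taken from the array above.

```python
import json

# Raw LLM response: a JSON array of per-comment codings, one object per
# comment, each carrying the four coded dimensions plus the emotion label.
raw_response = """
[
  {"id": "ytc_Ugz41xi31L8WsxSMhqt4AaABAg",
   "responsibility": "company",
   "reasoning": "consequentialist",
   "policy": "liability",
   "emotion": "fear"}
]
"""

# Parse the array and build an index keyed on comment ID.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up one coded comment by its ID.
coding = codings["ytc_Ugz41xi31L8WsxSMhqt4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # company fear
```

In practice the same index supports validation as well, e.g. checking that every ID the model returned matches a comment that was actually sent in the batch.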