Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_UgzMofUI5…: "In this particular case, his AI generated piece has garnered such a back story t…"
- ytc_Ugy5N-54k…: "I support your solutions, Senator! For those who missed it: 1) move to a 32 hou…"
- ytc_UgynqU1AF…: "If the only criterion , by which we allow ai in the work field is profit and dol…"
- ytr_Ugx-Ms4up…: "And the sad part is, that he could have gained the skills. All this defending of…"
- ytc_UgwgdJBPC…: "What makes me hopeful is that actual artists are people with a true love of thei…"
- ytc_UgygKOFHd…: "The way that Krystal talks about AI and it's utility seems DEEPLY out of touch t…"
- ytc_Ugz7FQoRs…: "Well I'm fucked. Been in customer service for 20+ years. Any jobs out there that…"
- ytr_UgwVZhvVL…: "FINISH HIGH SCHOOL.. easy.. Then youll be smarter. By grade 11, "artificial int…"
Comment
Every movie we've watched about the future with robots (and probably aliens too) will likely play out in some form at some point. Change is inevitable. I think that the way they're developing AI and letting it run and change itself is complete madness. It's like building a 400-ft. high windy, slippery, hilly bridge without any lights or guardrails on it. Every interview I see with these AI creators/builders, they seem totally depressed, deflated and, resigned to our horrible future fate. Like Musk and the guest said, they don't even want to think about it. So, YOLO! Enjoy life!
youtube · Cross-Cultural · 2025-10-01T06:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugx8qBFXeaau7nhN_Q54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyarTIbNoaK0ujv8694AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwzcyeKiobtWrPmIwd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz75BGtlaht7pirqn94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxdawH6NduXmktoLM94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwQFi3mOqFg3qWG8914AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwjxMTQAFNwAJkwOMF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwhN9AOTvrOmVSqPH14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyMcQsQ5H5s1rXjktV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy74SSgFc0rswU0ljB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]