Raw LLM Responses
Inspect the exact model output for any coded comment. Entries can be looked up by comment ID; the samples below were drawn at random.
Random samples
- "If we just keep pushing orphans into sam altmans protein smoothie, chatgpt will …" (ytc_Ugzy8GiA4…)
- "You idiots. Do you even know ow what AI is.this shows how dumb you people are…" (ytr_UgyTHQtfD…)
- "My new personal favourite - \"AI - Asbestos Internally\" Because it looks function…" (ytc_Ugy4mt3JX…)
- "As a senior in highschool, do you still recommend Radiology? As bad as it may so…" (ytc_UgxpNLWBa…)
- "Your comment is actually two totally separate issues. You're acting as if people…" (ytr_UgwRDjUQk…)
- "I didn’t even break the filter I just put two of my OCs in a room and they did i…" (ytc_UgxQ4SzEq…)
- "I have never used AI, phones are already making us stupid enough as it is. I had…" (ytc_UgyhP6s_E…)
- "No, junior jobs will not go away. The threshold for juniors will increase in con…" (ytc_UgxJtLwrd…)
Comment
Humans can even agree on the future we want so how in the heck will AI be able to agree and create that future. The only plausible solution to make everyone happy would to give everyone the ability to create their own reality and the cosmic joke of that is that with Humans even when we get what we want we're not happy because most of what we chase after is what we think other people say we should have in order to be happy.
youtube · AI Governance · 2025-12-27T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwMez3ywYObuHmBVTR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzt-63bsk_3bKJa_Eh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugweg5nuG-TlmDXZlC14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzXbptbjgzWwvYU95R4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwVJgP0kZPH80w-m_N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz1b_7LwmgsXOzzV3F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzvB4Mc0fb4-z52rGJ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzbblPjAfBwV6uZPqx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwwz2ZR-ozubCsIcRt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzLbrEbYXlFOfXRrJN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
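A minimal sketch of how a downstream consumer might parse one of these raw responses, sanity-check each record against the coding dimensions, and index the results by comment ID for lookup. The four dimension names come from the JSON above; the sets of allowed values are an assumption extrapolated from the values that appear in this response, not a confirmed codebook.

```python
import json

# Allowed values per dimension. These are only the values observed in the
# raw response above plus the "unclear" fallback; the real codebook may
# include more codes.
SCHEMA = {
    "responsibility": {"developer", "company", "government", "user",
                       "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"approval", "fear", "outrage", "resignation", "mixed", "unclear"},
}

def parse_coding(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response and return {comment_id: record},
    raising if a record is missing an id or uses an unknown code."""
    index = {}
    for rec in json.loads(raw):
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {rec.get(dim)!r}")
        index[rec["id"]] = rec

    return index

# Hypothetical single-record response in the same shape as the output above.
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
coded = parse_coding(raw)
print(coded["ytc_example"]["emotion"])  # -> resignation
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one parse per response, constant-time lookups afterwards.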