Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I feel like we are ten years away from going completely analog with everything again. Pen and paper. Cash. Books. People will get to a point where computers and artificial intelligence is so engrained in life it’s just too much to sustain mental health, privacy, emotions and thinking and the whole thing will pop, and it will be the cool thing to be ‘unreachable’ with no phone, and society will reject technology to a certain degree.
youtube · AI Governance · 2025-10-18T03:2… · ♥ 89
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy7wbM4Zq_9jHV609d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyYqiMfkIH_xZGslIt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz9IxIirpBhfxLgmgp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzkqAQWZFl0wpZkIux4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyQINuhqkN5Rhnjcah4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgzuiCrP89wKMdDD8ol4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz1JxtoK1ihz5ckmb94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyKKAbBYahk3Ro-r5V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwZf7SVXR8Zc6NFkdR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyEyMY9jNhT8nybMl14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"resignation"}
]
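The coded dimensions displayed in the table above are pulled out of this raw JSON array: the model returns one object per comment, and the UI looks up the object whose `id` matches the comment being inspected. A minimal sketch of that lookup in Python (the key names `id`, `responsibility`, `reasoning`, `policy`, and `emotion` come from the response itself; the `index_by_id` helper name and the skip-on-missing-key behavior are assumptions, not the tool's actual implementation):

```python
import json

# Abridged raw batch response, copied from two entries of the array above.
raw = '''[
  {"id": "ytc_Ugy7wbM4Zq_9jHV609d4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgzuiCrP89wKMdDD8ol4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]'''

# Every row is expected to carry these five keys.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(payload: str) -> dict:
    """Parse a raw model response and index the coded rows by comment id,
    silently skipping any row that lacks one of the expected keys."""
    rows = json.loads(payload)
    return {r["id"]: r for r in rows if REQUIRED_KEYS <= r.keys()}

coded = index_by_id(raw)
print(coded["ytc_UgzuiCrP89wKMdDD8ol4AaABAg"]["emotion"])  # -> resignation
```

Malformed rows are dropped rather than raised on here so that one bad object in a batch does not discard the other nine codings; a stricter pipeline might log or flag them instead.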