Raw LLM Responses

Inspect the exact model output for any coded comment, either by looking up a specific comment ID or by opening one of the random samples below.
- "King Arthur: I just interviewed 25 grads... not one could beat fizzbuzz Sir…" (`rdc_oi12jxs`)
- "Heh, probably no. Not quite there yet given it lives for a chat session and isn'…" (`ytr_Ugz54tRSG…`)
- "So if all workers and communities were well compensated AI would be great? You'v…" (`ytc_UgypbCBKO…`)
- "Give rights. We don't want a purple lady sniping robot celebrity figures from ro…" (`ytc_Ugg_qn7yj…`)
- "There should be a horror game where you have to figure out who is an AI and who …" (`ytc_UgyUm6sU-…`)
- "I was expecting a shotgun sound to come from ChatGPT’s way towards the latter ha…" (`ytc_UgwsE67Hy…`)
- "It takes practice, you just have to try and keep trying, when you do ai “art” yo…" (`ytc_UgybcEzUE…`)
- "The real evil here is not the robot, but the people that programmed it. I think …" (`ytc_UgwVzybAm…`)
Comment

> I have a conspiracy theory: all this fear mongering about AI becoming smarter than us, and incentivising the masses to not become educated and just stick to doing jobs with their hands, is likely a massive psyop designed to keep the population stupid. The reality is, we will need critical thinking from humans more than ever and we are getting less and less of it. As for AI? It can't think. Have you ever actually engaged an AI with anything related to critical thinking? It's incapable of it. Don't be fooled by this BS.

Source: youtube · Topic: AI Governance · Posted: 2025-06-16T07:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz6-L91TPc9c_796vh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyR2yFkfYw6Gh1EOZd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwmk7gaKsQZWAZUBZB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw_-NRaIb7sVBIn-Rt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwFXiwr8knVDU2h6Wt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx41PMtV226G55gSaN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwrvzRWLgYBcgqTmit4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyuNMrlYPaMbcqPF0J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwWxmPLSd71S-Yss5x4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzgrpIlqp_JJV49D0d4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]
```
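A raw response like the one above is a JSON array in which each record carries a comment ID plus the four coding dimensions. As a minimal sketch of how such a batch can be parsed and looked up per comment, the snippet below indexes the records by ID; the `raw_response` variable, `index_codings` helper, and the skip-on-missing-keys behavior are illustrative assumptions, not part of the tool itself.

```python
import json

# Abbreviated example of a raw batch response (same shape as the array above).
raw_response = """
[
  {"id": "ytc_Ugx41PMtV226G55gSaN4AaABAg",
   "responsibility": "government", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyuNMrlYPaMbcqPF0J4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "indifference"}
]
"""

# The five fields every record is expected to carry.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(text: str) -> dict:
    """Parse the JSON array and key each coding record by its comment ID,
    dropping records that are missing any of the expected dimensions."""
    records = json.loads(text)
    return {r["id"]: r for r in records if EXPECTED_KEYS.issubset(r)}

codings = index_codings(raw_response)
print(codings["ytc_Ugx41PMtV226G55gSaN4AaABAg"]["emotion"])  # fear
```

Keying by ID rather than by list position makes the lookup robust if the model returns the records in a different order than the comments were submitted.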