Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (click to inspect):

- "just a jailbreak stunt: the creator prompts a few chatbots to blurt out “I’m con…" (ytc_Ugz9Ho1OL…)
- "I wholeheartedly support and agree with what you're saying in this video, althou…" (ytc_UgyggL9dp…)
- "Even as someone who dabbles in AI art once in a while, I do agree that there sho…" (ytc_UgxtlbLQC…)
- "Human are the babies and AI will be the mother and we need to have the mother li…" (ytc_Ugxi9i_yg…)
- "The only ones getting mad are egotistical artists who only now realize that the …" (ytc_Ugx357A3i…)
- "Those two clips very good AI, but I’m gonna have to go number one that’s the mos…" (ytc_UgxhxtpHF…)
- "Yes even though the people at the top of the skills hierarchy can beat what is e…" (ytc_UgxjzjfI5…)
- "Yeah... click the cookie button on every single page first and then tell me abou…" (ytc_Ugz7rVMdu…)
Comment
Advances in humanity have never helped 'more' people; only certain sectors of society and the world. Eg. While some of us are living with computers, and cars and phones there are so many in the world who don't even have basic electricity yet. So I disagree that AI would free people to rest or pursue other things. It will for the richest, but just create mass poverty for the majority rest. I agree that it all depends on who controls it whether it will bring positive or negative effects to humanity.
youtube · AI Governance · 2024-01-01T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyv0lzbg9vli8rRy-J4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzOF9Hty1yZjBdY5ll4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwCT5_GI9JYcftocup4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzsspx8U5i97Dw3Hzh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxSxnMa55SYRVPHfO54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyRBiL8AGNTE12RQAx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugypz9qJFT3lCup-TfR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugwj3QxgADxxFaUZO0t4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz94tdm-i-DUlfdF8p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgzKH1nLgFVtO2Q37WB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
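The raw LLM response is a JSON array with one object per comment in the batch, each carrying the four coded dimensions. A minimal sketch of how such a payload could be parsed and indexed by comment ID (the IDs and values in `raw` below are illustrative placeholders, not entries from the actual batch):

```python
import json

# Illustrative payload in the same shape as the raw LLM response above.
raw = '''
[
  {"id": "ytc_example1", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate",
   "emotion": "resignation"},
  {"id": "ytc_example2", "responsibility": "government",
   "reasoning": "deontological", "policy": "liability",
   "emotion": "fear"}
]
'''

# Index each coding object by its comment ID for direct lookup.
codings = {entry["id"]: entry for entry in json.loads(raw)}

coding = codings["ytc_example1"]
print(coding["policy"])   # regulate
print(coding["emotion"])  # resignation
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one pass over the batch output, then constant-time retrieval per comment.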