Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "How does he know the AI isn't just choosing from likely word combinations, like …" (ytc_UgzO1o6Or…)
- "Someone told me that people being polite to ChatGPT is costing them $10M to proc…" (ytc_UgwsPI6uN…)
- "AI means that I can finally make decent pictures on my own without having to rel…" (ytc_Ugw0MpXOI…)
- "This is just wrong because of the simple fact that the AI doesn't have a body. T…" (rdc_j8venqd)
- "-> I feel the terms 'Rights' is applicable only when something is not artificial…" (ytc_UggU-9Nq9…)
- "Thank u for this posting. I'm sending this out to my friends who are not artists…" (ytc_UgzNopWe3…)
- ""Your scientists were so preoccupied with whether or not they could, they didn't…" (ytc_Ugy4bkznk…)
- "Here's what I heard. All of the Hollywood sci-fi robot movies like The Terminato…" (ytc_Ugy9myUk7…)
Comment

> I think it's SO stupid to willingly be creating something that could whip out the human race just because people want convenience... The future is so uncertain but no one will care until AI is the new top predator. It's so fucking silly and honestly I wish it didn't happen in my lifetime 🥹😭😭 why do I have to suffer for someone else's gain?

youtube · AI Governance · 2025-06-29T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzCSPBnCc-rGr0Wn-t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz8xia2S1iKmUtFWll4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxbdBTT4xQ3g1Aqxp94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz9vZ_sUK79yy1xYyF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwzXhmEdLAKHfKBJn14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxVd11sBkM1Uk4W6N54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz8kXJ54R54LGC4pDl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwVHOi9R8IukZippyp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugyr_IfBWsMsVHx8IZR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugxstt_ZvTwkO0gydr14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
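The raw response is a flat JSON array of records, each carrying a comment `id` plus the four coding dimensions shown in the table above. A minimal sketch of how such a batch could be parsed and indexed for per-comment lookup (the field names come from the example response; the `index_by_id` helper, the shortened sample data, and everything else here are assumptions, not the tool's actual implementation):

```python
import json

# Two records copied from the example response above, used as sample input.
raw_response = """[
{"id":"ytc_UgxbdBTT4xQ3g1Aqxp94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz8xia2S1iKmUtFWll4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and map each comment ID to its
    four coding dimensions (responsibility, reasoning, policy, emotion)."""
    records = json.loads(raw)
    return {r["id"]: {k: v for k, v in r.items() if k != "id"} for r in records}

coded = index_by_id(raw_response)
print(coded["ytc_UgxbdBTT4xQ3g1Aqxp94AaABAg"]["policy"])  # → ban
```

In practice a real response would also need validation (e.g. checking that each dimension's value is in the codebook's allowed set) before the codes are stored.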