Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "At 1 hr in. Christopher Langan would disagree with a materialist who thinks con…" (ytc_UgxSc_4-o…)
- "No one laughs at their little jokes because this is pretty terrifying. Why would…" (ytc_UgwVuytIP…)
- "You can kind of teach AI to think because I got mine to not want me to delete it…" (ytc_UgzqaO80L…)
- "Bruh tell me about it, I tried to get into Fan Expo Toronto this year, and was t…" (ytc_UgzOZwUsW…)
- "And def, if I feel AI replacing a chef's work or a waitress job, it has to be ch…" (ytr_UgyzbPxNH…)
- "AI was supposed to replace drivers 10 years ago. It's the Current Year and it st…" (ytc_UgyAPP0Fl…)
- "Today's society, businesses, and politics are so uncontrollable do to the lack o…" (ytc_UgwceJdyX…)
- "Why does that second robot kinda look like Doja Cat?? Or is it just me 😅…" (ytc_UgwBiVjdK…)
Comment
Let me tell you how this pans out:
They conclude that humans can't ethically handle this technology, and use that as an excuse to let the technology "handle itself."
The only problem there is ..... The synthetic morality of the AI tech is still being DELIBERATELY CRAFTED by the people who run it.
Platform: youtube · Posted: 2025-08-27T21:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzNU1NkH5T2mB4YXzh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzBbamMSHraXXbDW5J4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyvE94pDdWBmp1QvoV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwqlsW1ZWq-9V6TR_x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwJ9Sjm3Mt6XNGjWYV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwmAudV5OLyqYT2OKV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzTqP4HyY2TkzlsKK14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy6FFIA-XuyExz-LDh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzk5HRNDJfAdHyeW7J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyacL69hKuaWbzvx5l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
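The raw response above is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of how such a batch response could be parsed and indexed for the ID lookup this page offers (the `parse_codings` helper and the abridged three-record sample are illustrative, not part of the actual tool):

```python
import json

# Abridged copy of the raw model output shown above (three of the ten records).
raw_response = """
[
 {"id":"ytc_UgzNU1NkH5T2mB4YXzh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzTqP4HyY2TkzlsKK14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgyacL69hKuaWbzvx5l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
"""

# The four coding dimensions visible in the table above, plus the comment ID.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_codings(text: str) -> dict:
    """Parse a raw LLM batch response and index the records by comment ID."""
    records = json.loads(text)
    by_id = {}
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} is missing keys {missing}")
        by_id[rec["id"]] = rec
    return by_id


codings = parse_codings(raw_response)
print(codings["ytc_UgzTqP4HyY2TkzlsKK14AaABAg"]["policy"])  # regulate
```

Indexing by ID up front makes the "look up by comment ID" operation a constant-time dictionary access, and the key check surfaces malformed model output early instead of failing later during display.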