Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "I hate AI. I hate the people who made it. I don't care if it's rational, I want …" (ytc_Ugzvuv_XQ…)
- "@CarlaRaziel Yes it does! Accelerate evolution in consciousness! I've been lis…" (ytr_UgwwgH2fO…)
- "AI defenders really think making prompts is some incredible, laborious feat that…" (ytc_UgxnHdun6…)
- "Artiest: Someone who uses expressions of creativity to produce artwork. AI artis…" (ytc_Ugw7sZns2…)
- "AI will have access to all human knowledge. But will be more intelligent and lo…" (ytc_UgyRA0XxX…)
- "I have a theory that people that are rude to AI are rude to service workers…" (ytc_UgyNWKaZU…)
- "Lol everything is a possibility. I can see everything getting replaced with AI i…" (ytr_UgynBKDe6…)
- "I hooe all rich news anchor liers fake news were replaced w ai fkn traitors for …" (ytc_Ugz8QZ1WF…)
Comment

> Nonsense. We will happily hand over control. We already have to a very large extent.
> This sci fi assumes the AI will have ambition to take over. Ambition would have to be programmed into it along with the skills needed. AI is not made by nature. It has no desires.

youtube · AI Governance · 2025-08-07T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgyOq2S9H2Q1sw9fcSR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxQHfJWLSHtiGns1MB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx7jKD1tDJ9VPPLnOt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzPQg4R0oY7flU7bDZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwrGaiYqNVEuFb-12l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwGe3mOQhS2l2uFNUt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyAKbdUDIixqUJKtRJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyXgga_zvAkKQNCMHJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwtpEIKAx7sKMqqIpV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyQ1I2Wbyxx8vfdtMt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
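The raw response is a flat JSON array, one object per comment, so the "look up by comment ID" step above reduces to parsing the array and indexing it by `id`. A minimal sketch (the two rows are copied verbatim from the raw response above; the dimension values shown are only those that appear there):

```python
import json

# Raw LLM response: a JSON array of coded comments (subset of the batch above).
raw_response = """
[
 {"id":"ytc_UgyQ1I2Wbyxx8vfdtMt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzPQg4R0oY7flU7bDZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]
"""

rows = json.loads(raw_response)

# Index by comment ID so each lookup is a single dict access.
by_id = {row["id"]: row for row in rows}

coded = by_id["ytc_UgyQ1I2Wbyxx8vfdtMt4AaABAg"]
print(coded["responsibility"], coded["emotion"])  # → developer indifference
```

This reproduces the Coding Result table for that comment: the row found by ID carries the same four dimensions (responsibility, reasoning, policy, emotion) that the table renders.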