Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "machine learning researcher here, i wholeheartedly agree with what you said alth…" (ytc_UgzVfVMDT…)
- "I think that AI will eventually replace humans by simply being better than us at…" (ytc_UgiAhqmX_…)
- "AI can’t make art without a human doing all of the work for it. Art is self-expr…" (ytc_Ugx0ohxqf…)
- "Of course not. Ask them why they think AI is art and they will either get paraly…" (ytc_Ugw9vkbEz…)
- "Me too. But I’m a mechanic so the scripts to change brake pads automatically wil…" (rdc_hsf84sd)
- "I think it is reasonable to say I intensely dislike these people like Sam Altman…" (ytc_Ugz4lIEMC…)
- "we are gonna have to wait ten year for whats this visionary guy discoverd, AI wi…" (ytc_UgzwBOC_r…)
- "Except you know.... They're the ones developing the AI so that the very least th…" (ytr_UgxPHjKq8…)
Comment
It will take one scientists to re-program one or many AI, to act differently then others. It can be dangerous. Little by little our privacy are being watch by AI. I say about 200 years or less this world is going to be All AI… 300 to 600 years from now they will take over and we will be too late to over thrown them… Year 2500 and up are going to see a big change. Our descents are going to wish they were living in the early 2000s like we are and wish they would stop AI from progressing…
youtube
AI Moral Status
2025-01-19T07:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzxvkN3k_9tHGPLf6t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzFfHpGluBP0-Qy6FB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw5HtOdTB11jOdVZXR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzAHQ2HG07fMtbeHH54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugy9hluCbeVJNB7MOlR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxfD97QdVTI13zmNl14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx5Ex3tk4PX0p3h58V4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwC9OB5SG86o8tC6NN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugy9vX_gn4BnQDRotH54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyUxEjo3DKZa081yDd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
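Outputs like the array above are easy to validate before loading them into a coding table. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred only from the samples visible on this page, not from the project's actual codebook, so treat both the value sets and the function name as assumptions.

```python
import json

# Hypothetical allowed values per coding dimension, inferred from the
# samples shown on this page (NOT an authoritative codebook).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse the model's JSON array and check each record's fields."""
    records = json.loads(raw)
    for i, rec in enumerate(records):
        if not rec.get("id"):
            raise ValueError(f"record {i}: missing id")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"record {i}: bad {dim!r}: {rec.get(dim)!r}")
    return records

# Example with a made-up comment ID:
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
records = validate_codings(raw)
print(len(records))  # 1
```

A validator like this catches the common failure mode of LLM coders: a syntactically valid JSON array containing an off-schema label, which would otherwise silently pollute downstream counts.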