Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I don't like the idea of ai in general, but this seems like the smartest use of …
ytc_UgwTGCfZx…
Artists when they can't sell their shit for $100 per pixel: "Ai art ruins everyt…
ytc_UgwXTBrUr…
The best part of all of the responses isn't the obvious logical fallacies, it's …
ytc_Ugx2UV6q6…
Ahhh yes blame ai for man made problems... It's your rich assholes running the s…
ytc_Ugx4krQqR…
@MrHeatAzread the caption it clearly states the car was a SELF DRIVING UBER...T…
ytr_UgzyopgDj…
This is where AI would be ideal. Let AI follow the patient and flag tests or tre…
ytc_UgwlGk3kg…
Don't worry robot .there will be a failed safe button to destroy you if you go c…
ytc_UgwEudPNW…
Honestly, let's speed things up. Let's go AI! Do it! I trust more on AI than on…
ytc_UgzYZ8AnA…
Comment
IGNii7ed
What is your reasoning on that? We can make things stronger, faster, bigger, tougher or more efficient than us so why not smarter? Also if you were right and they can't be smarter than their creator (which I doubt), if the person who created the AI was the smartest person on Earth then that AI is essentially more intelligent than everyone on earth.
youtube
AI Moral Status
2017-02-23T16:1…
♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UggRTChRIG_e5XgCoAEC.8PKYumpGV8_8PKeQaTKbYF","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytr_UgiwGuogXeiLSngCoAEC.8PKXOOD5-Vn8PKaZuevC6Q","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_UggXH575m2uJ53gCoAEC.8PKXBHZyov88PKZpHX-dQj","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugjx2gfLE92JJXgCoAEC.8PKUkxaT3jn8PK_6CiO70N","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgixNOeeOvaCY3gCoAEC.8PKTuxzlLSd8PK_ctOVZMx","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgixNOeeOvaCY3gCoAEC.8PKTuxzlLSd8PKadfoRDRb","responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytr_Ugi12tcY5scji3gCoAEC.8PKTkFK8siC8PKeIyPbVYp","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgjGDitq2edvs3gCoAEC.8PKTjXh-9B18PKZ3gGEvOj","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytr_Ugi0V_EZKk0yKXgCoAEC.8PKTfSEVN4o8PK_7ZVnzub","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugi0V_EZKk0yKXgCoAEC.8PKTfSEVN4o8PKa1brKO-R","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}
]
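The raw response is a JSON array of per-comment codes across the four dimensions shown in the Coding Result table. A minimal sketch of how such a response could be parsed and validated before storing it, assuming the category vocabularies inferred from the samples above (the real codebook may allow more values, and `parse_coding_response` is a hypothetical helper, not part of the tool):

```python
import json

# Category values inferred from the sample output above; the actual
# codebook is an assumption here and may define additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "government", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "unclear", "regulate", "ban", "industry_self"},
    "emotion": {"outrage", "approval", "fear", "indifference", "mixed",
                "resignation"},
}


def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index entries by comment ID.

    Raises ValueError on a missing ID or an out-of-vocabulary value,
    so malformed model output is caught before it reaches the database.
    """
    coded = {}
    for entry in json.loads(raw):
        cid = entry.get("id")
        if not cid:
            raise ValueError(f"entry missing id: {entry!r}")
        for dim, allowed in ALLOWED.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {value!r}")
        coded[cid] = {dim: entry[dim] for dim in ALLOWED}
    return coded


# Example with a hypothetical comment ID:
raw = ('[{"id":"ytr_EXAMPLE","responsibility":"none","reasoning":"virtue",'
       '"policy":"none","emotion":"approval"}]')
print(parse_coding_response(raw)["ytr_EXAMPLE"]["reasoning"])  # virtue
```

Validating against a fixed vocabulary is what makes LLM-based coding auditable: any response that drifts from the schema fails loudly instead of silently polluting the coded dataset.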