Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "I think he got into the place that the robot was grabbing the boxes like he got …" (ytc_UgxbyIGpu…)
- "Because AI isn't smart, it needs about three times the energy required for calcu…" (ytc_UgziZ9q5y…)
- "ai is also extremely dangerous in that it can be used to create illegal content …" (ytc_UgylUjhZd…)
- "I was going to say it should be illegal for AI to give medical advice but in rea…" (ytc_UgyYC6ilI…)
- "Honestly I’m all for AI art in the same sense that it will be in the future (hop…" (ytc_UgyRrJ_qC…)
- "This video failed to address whether the autopilot has lower or higher chance of…" (ytc_UgzRJBNNr…)
- "Honestly, the fact that AI generations can't be copyrighted is the only thing th…" (ytc_Ugw6mRCr8…)
- "Holy F, so illegal immigrants taking jobs is okay. AI taking jobs is bad. Are …" (ytc_Ugy9nZsMd…)
Comment

> @slender5738 why would the programer have the robot complain about it's civil rights? Well they wouldn't these robots are more concous the a lot of humans

youtube · AI Moral Status · 2024-07-17T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugwl0Cw5u30SGjdmgbt4AaABAg.9nw_owR59WO9nxPH0ESPxv","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgxHZYblUOI4PGGgVep4AaABAg.9nwPaSuFSra9nxPfJ1m40Y","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugzn5bYxrvM3gUlhH4B4AaABAg.AEtLihlIMhyAEwu5GdxuPV","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgxkKn7E9fSTPCWN-Dh4AaABAg.A5vJc_nsraEA5xKi94HF8o","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzsTPmYOvD8nouDVr54AaABAg.A5qWnVS8bJDA5xLDytZJb0","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzfpoyJbCSLwvwFXp54AaABAg.A5n4y_6eA0AA5xL_BSaYxf","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugy9vWCDpVRUkKpXMj14AaABAg.A5moONOr6kiA5xM3oHs51C","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugy9vWCDpVRUkKpXMj14AaABAg.A5moONOr6kiA5ylOV8_25o","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyzJxh0IO5-4SF-7Lp4AaABAg.9xNbgOYc0zG9xPLJO5CtTH","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
  {"id":"ytr_Ugz2dCqS66hPkJq_NQ94AaABAg.9wz_R4j_VGVASra1c7uAjH","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
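The "look up by comment ID" step above can be sketched by parsing a batch response like this one and indexing it by the `id` field. This is a minimal sketch, not the tool's actual implementation: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the JSON above, but the sample IDs in the snippet are hypothetical placeholders.

```python
import json

# Hypothetical batch response with the same shape as the raw LLM
# response shown above (the IDs here are placeholders, not real ones).
raw = """
[
  {"id": "ytr_example1", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_example2", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "approval"}
]
"""

# The four coded dimensions, as listed in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(payload: str) -> dict[str, dict[str, str]]:
    """Parse a batch response and index the coded dimensions by comment ID."""
    records = json.loads(payload)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

codes = index_codes(raw)
print(codes["ytr_example2"]["emotion"])  # approval
```

Keying by `id` makes each coded comment retrievable in constant time, which is what an "inspect the exact model output for any coded comment" view needs; missing dimensions fall back to `"unclear"`, mirroring the value the coder itself uses.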