Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Good point on the context, seen it many times in cursor that AI starts to halluc…" (ytc_UgzDR-6UC…)
- "54:44 another howler from Joscha. *Every* non-trivial prediction is an out-of-di…" (ytc_UgyM6_hip…)
- "AI will eventually make the 1% 10x richer than they are now. Life will be terrib…" (ytc_Ugy-JEgQE…)
- "I was once in a discord server for writers and artists. Someone thought it was a…" (ytc_Ugy_Ld2O1…)
- "AI is bullshit that is going nowhere but to stockmarket to pump and dump ponze …" (ytc_Ugz2JniGz…)
- "Not only do we not want AI data centers we also don’t even want AI. It literally…" (ytc_UgyvTSKLM…)
- "Ya'll are really justifying the self driving vehicle's actions?? Yeah, we're s…" (ytc_Ugx6jR1qN…)
- "This video is beautifully put. I have always been more of a science nerd more th…" (ytc_UgxtrrUGG…)
Comment
> I believe that they should have rights, but only if they are able to suffer and dream. which is a weird concept but I feel like if a robot wants to do something and achieve something in their almost never ending life then they should be allowed to do it. what about you?

Platform: youtube · Video: AI Moral Status · Posted: 2017-02-28T21:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UghVriokmiBrdXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggjMob2djzkEHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UggJr8-UN-xM-ngCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UghhNDhzWUUiOngCoAEC","responsibility":"government","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugj9myDUs7y-zngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjfweSgo8G6r3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjXivWrKkGxu3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UghzKagSWsoOAHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Uggj1y11qcrSHHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UggaLH0Jy1BVU3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
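The raw response above is a JSON array with one object per comment ID, covering the four coding dimensions shown in the result table. A minimal sketch of how such a response could be parsed and validated before it reaches the dashboard; the `ALLOWED` value sets are inferred from the samples above and are not a definitive codebook:

```python
import json

# Allowed values per dimension, inferred from the sample output above
# (illustrative only; the real codebook may define additional values).
ALLOWED = {
    "responsibility": {"none", "user", "developer", "government", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"unclear", "none", "regulate", "liability", "ban"},
    "emotion": {"indifference", "approval", "fear", "outrage", "mixed", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Comment IDs in this dataset carry a "ytc_" prefix.
        if not row.get("id", "").startswith("ytc_"):
            continue
        # Every dimension must be present with an allowed value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_Uggj1y11qcrSHHgCoAEC","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"approval"}]')
print(parse_coding_response(raw))
```

Dropping malformed rows rather than raising keeps one bad LLM output from discarding a whole batch; rejected IDs could instead be queued for re-coding.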