Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgwvKXyJ3…: "Just a note, Tesla auto pilot is not Tesla full self driving. My life as well as…"
- ytc_UgwpfFG1q…: "If people randomly asked if I was "really an artist though" I might turn to AI t…"
- ytc_UgwAyuzK8…: "Very few people are born into a talent (if they are at all). Personally, I would…"
- ytr_UgwjESSK2…: "Your gonna give chatGPT daddy issues and you don't see the problem with that? jk…"
- ytc_UgwtQ0sLN…: "Glib statements like 'It knows more than you' are like saying that an encyclopae…"
- ytc_UgzwCVjfX…: "On the other side, imagine how many trillions of investments will go belly up if…"
- ytc_UgxBZoPik…: "Yeah the problem with this is that thise people need those jobs to make a living…"
- ytc_UgxN4Dem-…: "AI isn't all that great. It destroys us artists. AI "art" is NOT REAL ART.…"
Comment
"Primary function basico programming system where they just start talking through a god knows what and a voice at 32 bush has been collected from individual person learns how to think how to the socialise with an individual person unfortunate say when you unplug or produce certain information when they cannot think and cannot function that means the software needs a big update what the other option is that the other person is thinking much more incredibility than the AI"
Platform: youtube
Video: AI Moral Status
Posted: 2024-01-05T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgzbQSW-h2kiPA92q4N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_Ugz8ZTiJIUrKKiZYXLl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgxWfc9AscpK6WWCar94AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgxXr5r2ogWYtVioOZd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgxaTJLTA1d-LUjMds54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
 {"id":"ytc_UgwzPbe0c370QYn1z3R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UgwgrF5OQZ_BvAee4xd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"unclear"},
 {"id":"ytc_UgzDyiTPmBHtvWPzlsx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugxu31VCNLjOmjBfM_Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
 {"id":"ytc_Ugwcb9JuBJzTkRns21l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}]
```
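The lookup-by-comment-ID feature above amounts to parsing the model's JSON array and indexing each record by its `id` field. A minimal sketch, assuming the record structure shown in the raw response (the `index_codings` helper and the two-record sample string are illustrative, not part of the pipeline):

```python
import json

# Truncated sample of a raw LLM response, in the format shown above.
raw_response = '''[
  {"id": "ytc_UgzbQSW-h2kiPA92q4N4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwgrF5OQZ_BvAee4xd4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "unclear"}
]'''

# The four coding dimensions plus the comment ID, per the table above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse the raw model output and index each coding record by comment ID,
    skipping any record missing one of the expected dimensions."""
    records = json.loads(raw)
    return {r["id"]: r for r in records if EXPECTED_KEYS <= r.keys()}

codings = index_codings(raw_response)
print(codings["ytc_UgwgrF5OQZ_BvAee4xd4AaABAg"]["policy"])  # -> liability
```

The second sample record matches the Coding Result table above (developer / consequentialist / liability / unclear), which is how a coded comment can be traced back to the exact model output that produced it.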