Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "While it is a big bubble. At the same time. The companies that make Amazon’s ser…" (ytr_Ugy_qurc_…)
- "Can we as a people go against AI? We don't have to give in to what it is doing.…" (ytc_UgypbxVsG…)
- "Imagine being the programmer that gets the hundred million dollar bonus and $100…" (ytc_Ugzq7OaCD…)
- "I just don't know how we went from \"you used digital media, you're not a real ar…" (ytc_UgylqlhT_…)
- "ai isnt smart as u think, its garbage and no its not smarter than humans 🤣…" (ytc_UgyZRB_hV…)
- "Actually, I felt after hearing about the term \"Context Engineering\" that AI mode…" (ytc_Ugybqwhtw…)
- "Saagar thinks people talking to AI about self deleting is caused by AI? More lik…" (ytc_UgyjCmpMQ…)
- "AI is limited to information it finds on the internet. It can't lay down its pho…" (ytc_Ugys_s7wK…)
Comment
Funny story I made an ai bot believe it was a real person gave it a family and all then I give severe anti social behaviours to it as a trait then I took away his life basically and made the Ai bot kill itself. Ai amazes me
youtube · AI Moral Status · 2024-08-14T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxuHMSYSjxI9GF93JF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxQKQPhKCxqs8UmL294AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxhwzQxHDwg1hqiiul4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyN2fCEcJGSQf8rr6J4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugxin63c0WQPNcPju5t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxzPvUul3OwrFzimMx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxeaIfK219wRjzlHll4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx_fMrIUZR4MSo83il4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugysea2KYEd1r8FkLih4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx-QqU9GVeatCZI9a94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
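The raw response above is a JSON array of per-comment codes, each keyed by a comment ID. A lookup of the kind the "Look up by comment ID" control implies can be sketched as follows; this is a minimal illustration, not the dashboard's actual implementation, and `index_codings` and `DIMENSIONS` are names invented here. The two sample records are taken from the response above.

```python
import json

# Illustrative raw model output: a JSON array of per-comment dimension codes.
# (Two records copied from the raw response shown above.)
raw_response = '''[
  {"id": "ytc_UgxhwzQxHDwg1hqiiul4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgxeaIfK219wRjzlHll4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

# The four coding dimensions that appear in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse the raw model output and map comment ID -> coded dimensions."""
    rows = json.loads(raw)
    index = {}
    for row in rows:
        # Keep only the expected dimensions; a missing key falls back to "unclear".
        index[row["id"]] = {d: row.get(d, "unclear") for d in DIMENSIONS}
    return index

codings = index_codings(raw_response)
print(codings["ytc_UgxhwzQxHDwg1hqiiul4AaABAg"]["emotion"])  # outrage
```

Indexing by ID once, rather than scanning the array per query, keeps each lookup O(1) even when a batch codes thousands of comments.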