Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I find it quite funny how the anime filter thingy on tiktok whitewashes people, …
ytc_Ugx7ZnE4l…
One "argument" that annoys me to no end is that AI is supposedly a "tool". AI is…
ytc_Ugzf_3tEX…
When they finally replace all the jobs with robots & AI, there won't be any mone…
ytc_UgxXlhAkB…
there's too many web developers now because people are getting dogshit 4.0 gpas …
ytr_UgykJNJEQ…
OK I appreciate and understand the concern about the environmental impact these …
ytc_UgxJOwvYL…
we have the agree that ai artist are the ones who haven't touched the paper…
ytc_UgwrdLWA2…
@TicklePickleLover I have some leprechaun gold you can buy. what is called "AI"…
ytr_UgwTlwQcj…
We found the sane person in every comment section. A AI is made to just mimic so…
ytr_Ugy7V_DN_…
Comment
I think Altman's responses are gobbledegook. The few, like him will control the rest of us. The more AI does for us, the more intellectually lazy people will get. Why bother learning anything when AI does it all? We will go back to the ages of Kings controlling all, and the rest of us will be serfs sitting at home, not contributing anything and doing little but existing. If voting survives, people will just vote for the politicians that promise them more universal income. There will be little hope in the future, so people will not have kids, and human society will likely be gone in 100 years or less.
youtube
AI Moral Status
2025-07-30T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwQHf7_L7xSR3GmI594AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy8P5YA9mx98l5gzOR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwh9rzFy30YedTzF5p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwJMX5-wd_SBMFMSeR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxLXhZKRC7i7zTMeCZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxCxw3PVKlPn5Nnmup4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwMQQo8dDKpy2AbY3J4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxbMYndfN5pBvii54Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx2MuL49DSGOrGxmxJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzUOTzHmgoav70TuzJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
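The lookup-by-comment-ID feature shown above can be sketched as a small parse-and-index step over a raw batch response. This is a minimal illustration, assuming only the array shape shown in the "Raw LLM Response" block; the `index_codes` helper and the two-entry sample string are hypothetical, with the entries copied from the array above.

```python
import json

# Raw batch response: a JSON array of per-comment codes, shaped like the
# "Raw LLM Response" block above (two entries shown for brevity).
raw_response = '''[
 {"id":"ytc_UgxbMYndfN5pBvii54Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgwQHf7_L7xSR3GmI594AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]'''

def index_codes(raw: str) -> dict:
    """Parse the model output and index each coded record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_codes(raw_response)
print(codes["ytc_UgxbMYndfN5pBvii54Z4AaABAg"]["policy"])  # regulate
```

Indexing once into a dict makes every subsequent ID lookup O(1), which matters when inspecting individual comments out of a large coded batch.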