Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
"Godfather of AI", bla bla bla. The industry has sunken billions, maybe trillions, into something that is essentially a dead end. The technology has almost plateaued already. Here is what is going to happen:
Source: youtube · AI Moral Status · 2026-01-31T05:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgxE5O_6IPYeLiKzglN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgwpoSnEXR6W1SDyLvl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgxlYGZ7EraHkhkGAAZ4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgygYISJbpBWVYyGRVt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_Ugyi9xQZ9jvC5uwC9qx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgwmQRKeKDjKB7_TVFV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgxjEqsug_i7fLNH8Td4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_Ugzki8yUHirHgts5jQt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_Ugwz3jKaa5FhkrhQg4R4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgwSbwzClGucP85hD5l4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}]
```
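A response like the one above can be consumed programmatically. The sketch below is a minimal, hypothetical helper (not part of this tool) that parses a raw model response and tallies each coding dimension; the dimension names come from the Coding Result table, while the value sets are only those observed in this sample and may be incomplete.

```python
import json
from collections import Counter

# Dimensions taken from the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def tally_codes(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of records)
    and count how often each value appears per dimension."""
    records = json.loads(raw)
    tallies = {dim: Counter() for dim in DIMENSIONS}
    for rec in records:
        for dim in DIMENSIONS:
            # Missing keys fall back to "unclear", the schema's catch-all.
            tallies[dim][rec.get(dim, "unclear")] += 1
    return tallies

# Hypothetical single-record response for illustration.
raw = '[{"id":"ytc_x","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]'
print(tally_codes(raw)["responsibility"])  # Counter({'company': 1})
```

Because the model occasionally emits `"unclear"` for several dimensions at once (as in the sample above), tallying rather than trusting any single record is a safer way to summarize a batch.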