Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Acting like an emotionless robot was really just Zuck playing the long game seei…" (rdc_oh1ia23)
- "chatgpt made studying university 10x smoother. apart from text generations it's…" (ytc_Ugx2vmXt7…)
- "i find the ones that shout loudest abuut their irreplaceability by AI will likel…" (ytr_UgxFyIwWA…)
- "I literally (out of boredom) asked Chat GPT “As an AI yourself, what do you thin…" (ytc_UgwScRJEh…)
- "It's a tool like any other, if a person misuses it they should be fully liable f…" (rdc_ljoetnp)
- "I use ChatGPT whenever I need help with a homework assignment I’m baffled of. I …" (ytc_UgwgZfikG…)
- "Let's not get cocky, AI is at its absolute worst and will only get better, one y…" (ytc_Ugw-dz8Iq…)
- "Very good regulations. Let's just see your own quotes: \"...you've got a cool new…" (ytc_Ugyb341DN…)
Comment
The bottom line is that people need to work people who work earn money they buy houses they buy food they have families which supposed to grow whether it's a white collar job or a blue collar job people need to work because they need to eat they need to be leave and be proud that they earn money technology is good to a certain point this is going to be good AI for the people who are greedy who are going to eliminate thousands of jobs which is going to create a worse situation overall for everyone. There's an old Twilight zone episode from Ron's Sterling that addresses exactly this particular situation about technology doing away with jobs for humans. It's common sense.
youtube · AI Harm Incident · 2025-07-13T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwqdmPSvccejnw-5mp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxT6FkFXYEUO_sWxwR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy7APR8V4AeCWuY2dx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw1p1jT0cIEx0gRzhB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxKQgMYhdzX4uqpiC14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyAIn1gcqOC5iBL_ph4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzTc4_0eKohjTb5eu94AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw2adesEFcP-_-fCnB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzoNcIbYUkadr6z9lh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx9NWhzscmSkau98MN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"}
]
```
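The raw response above is a JSON array of coded comments keyed by comment ID. A minimal sketch of how such a response could be parsed and indexed for the look-up-by-comment-ID view — the field names and IDs come from the sample response, but the parsing code itself is an assumption, not the tool's actual implementation:

```python
import json

# Two rows copied from the sample raw LLM response above.
raw_response = """[
  {"id": "ytc_UgwqdmPSvccejnw-5mp4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyAIn1gcqOC5iBL_ph4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]"""

# Build an ID -> coded-dimensions index so any coded comment can be
# inspected directly by its comment ID.
codes = {row["id"]: row for row in json.loads(raw_response)}

row = codes["ytc_UgyAIn1gcqOC5iBL_ph4AaABAg"]
print(row["responsibility"], row["policy"])  # company liability
```

In practice the same index would be built over the full response, so a coded comment's dimensions (responsibility, reasoning, policy, emotion) can be rendered into a table like the Coding Result above.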