Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I think the most overlooked, nail in the coffin that this is real, is the AI res…" (ytc_UgzR1v6Va…)
- "Who cares if ChatGPT outputs my workplace drama to some author though? Like in w…" (ytc_UgynN05if…)
- "congrats, thumb up, I was blown by your ideas and got refreshed very much. the i…" (ytc_UgxqBRout…)
- "My question is would Jesus let A.I into heaven or cast them into the lake of fir…" (ytc_UgwZ79kSh…)
- "speaking as a robotics researcher (who started at MIT in the 90's) I really like…" (ytc_UgyD_vVgK…)
- "The shared ai cloud mind is scary too cause what if it gets uploaded into the cl…" (ytc_UgywG5ub2…)
- "I use ChatGPT and other ai stuff to gather all materials and rewrite stuff that'…" (ytc_UgxKimT8s…)
- "Guess what? If people refuse to use some of these things, like self-driving cars…" (ytc_Ugxbnra59…)
Comment

> Idk why it’s hard to comprehend certain jobs in healthcare are fucked (radiology,pathology types) but so many doctor positions will remain. You have to blend the human element with AI knowledge. Will just be like the rise of any other tech, a huge help.
>
> If something can be made into data though (all types of objective marker tests: blood, urine, etc. AI). AI will crush humans in that department

youtube · AI Harm Incident · 2026-04-24T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id": "ytc_UgzWrTYtJP_c_JFM65J4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzM-RDyE5o9xxiPFeF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyE58z8x9kQEZEk4Q94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxI99gusXPk2HJvDwB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxMoLHJ5HjbfJ-4fa14AaABAg", "responsibility": "user", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyaP3dWHUbAHO8tD8h4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugy-pye5ZR6tB3a4WXd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgyLxt5KpZUf5dmS9YR4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyT8yvAdiigMGcxE7t4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugxn32SnPr04c6HXtKR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
```
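A response shaped like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is an assumption about how such a pipeline might validate records: the allowed label sets are inferred only from the values visible in this response (the full codebook may define more), and the sample records are copied from it.

```python
import json
from collections import Counter

# A small excerpt of the raw LLM response: a JSON array of per-comment codes.
raw = """[
  {"id": "ytc_UgzWrTYtJP_c_JFM65J4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzM-RDyE5o9xxiPFeF4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyT8yvAdiigMGcxE7t4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]"""

# Label sets inferred from values seen in this response (hypothetical codebook).
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "user"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "liability", "industry_self"},
    "emotion": {"indifference", "outrage", "mixed", "approval", "fear"},
}

def validate(records):
    """Keep only records whose every dimension carries an allowed label."""
    return [
        rec for rec in records
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items())
    ]

records = validate(json.loads(raw))
print(Counter(rec["emotion"] for rec in records))
```

Dropping (rather than repairing) invalid records keeps the stored codes clean; rejected IDs could instead be queued for a retry prompt.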