Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Me talking to Character Ai wanting to be friends for once and not a romance" (ytc_Ugyt0if7U…)
- "I love the thought that you put into this video. And no, I don't think AI should…" (ytc_UgyNgglgT…)
- "ChatGPT : "creating a true representation of "nothing" as an image is impossible…" (ytc_UgyNWJWfc…)
- "Im neurodivergent w/ great pattern recognition and I noticed Chaptgpt is giving …" (ytc_UgxhYAFCy…)
- "Do you think ai is taking what makes being a human so unique away from us?…" (ytr_Ugx-EDuzR…)
- "I know this isn't the primary issue with it but Christ almighty is ai "art" ugly…" (ytc_UgzRAZrW2…)
- "US dollar losing reserve currency status wouldnt be good for me either but it re…" (ytr_UgyP21OJI…)
- "They talk about self driving cars, and they use a Lexus on a country road 😂 Um….…" (ytc_Ugz1rJLBJ…)
Comment

> I think the only need for the use of an AI humanoid thing should be to run into a burning building to rescue pets and people and to extinguish fires in the areas that we cannot get to. Or any other type of disaster that we are helpless in. Instead of risking the lives of our loved ones. We should only need to risk the walking talking metals and plastics that they created. We all need to realized what technology has done to us and start acting like people again.

youtube · AI Harm Incident · 2025-11-04T23:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzgCByhWfkORoH_Jxp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwC0H8_W3Io328c4PF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxhOKulvcgkPbZ_hyJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxZ9ZAd8AnVhLjZ_0N4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyAh8QAzK8gm7S_XD14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgzZlmR5e5HjM3dyI9p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxSfrR33ifJjPy8uoR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy_buqhXCYhppkvUuJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzTpc7V_y3WtV0FPp14AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxiAjYaHLnkvc7CSTZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"disapproval"}
]
```
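A raw response like the one above can be parsed and indexed for the "look up by comment ID" workflow with a few lines of Python. The allowed values per dimension below are only inferred from the samples shown on this page; the actual codebook may define additional categories, and `parse_response` is a hypothetical helper, not part of this tool.

```python
import json

# Allowed values per coding dimension, inferred from the sample
# responses above -- the real codebook may include more categories.
ALLOWED = {
    "responsibility": {"company", "government", "user", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"approval", "disapproval", "outrage", "fear",
                "resignation", "indifference", "mixed"},
}

def parse_response(raw: str) -> dict:
    """Parse one raw LLM response and index the coded records by comment ID.

    Raises ValueError when a record is missing a dimension or uses a
    value outside the (assumed) codebook.
    """
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={value!r}")
        coded[rec["id"]] = rec
    return coded

# Hypothetical single-record response, shaped like the real output above.
raw = '''[
  {"id": "ytc_example1", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability",
   "emotion": "approval"}
]'''

coded = parse_response(raw)
print(coded["ytc_example1"]["policy"])  # -> liability
```

Validating against a fixed value set at parse time catches the common failure mode where the model invents an off-codebook label, so a bad record fails loudly instead of silently entering the coded dataset.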