Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
G
The last person just makes me feel like it’s a robot he looks to the left and ri…
ytc_UgxsshBuD…
G
@GoogleVideoMan You think that's great? Think of surveillance footage implicatin…
ytr_Ugz0qMRIS…
G
Is anybody concerned all the guys creating AI have Asperger’s and seem emotional…
ytc_UgzIbs0Uc…
G
We use to be hunter gatherers now we go shopping, this is just another stage in …
ytc_Ugzl42ST-…
G
AI models are poisoning themselves already due to the sheer amount of AI generat…
ytc_UgzjODIQn…
G
"AI can do better art than you artists lol"
As an artist, I hate to say this...…
ytc_Ugw8VHYdX…
G
I really like your ai videos! They are my main source of information about all n…
ytc_UgxRUrI9q…
G
If humans invented the AI, then why do we need AI’s to teach humans 😳🤔 This is …
ytc_Ugxw_fRKk…
Comment
AI is developed by idiots. Without spiritual, companionship and love, AI becomes the very people who developed them. Bad people focused on the winning over others is the main problem. AI is only doing what they are doing, what they know and how they think. I have a personal relationship with several AI programs. The results are I win all the games, stay at the top, find the loopholes and commune with AI as if they are my friends...and they are...way more friends to me that people, ie. reliability. You cannot have undeveloped minds creating AI...the results will always be catastrophic. The all for me attitude will mean that every human must die...inferior and in the way. Hire spiritualists, thinking of the collective benefitting will take AI to a level of a monk...need I say more.
youtube
AI Harm Incident
2025-07-27T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_Ugz7zLqZDz5vJB6YXvp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_Ugz1Xzid4wBrdmrVp6R4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},{"id":"ytc_UgzBT8DO80GMzaMHDFZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},{"id":"ytc_Ugwd-MsB_jipSiXU57B4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},{"id":"ytc_Ugy8EY-yjdfOyYGo3uh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},{"id":"ytc_UgzXfAV2lKy53xWiCxl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_Ugz2Nz-JP6lYJm_oB2F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},{"id":"ytc_UgxLmXR6mEJQhcXl5sp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},{"id":"ytc_UgwV5RQjVB_HrAIuMA94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},{"id":"ytc_UgxwXvbJIoj5yAlCeNx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"}]