Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I draw as well and I dont hate AI but bro I swear these AI prompters just makes …
ytc_UgxfuJUIG…
I think basic neuro emotions endorphins can connect with already human-made drug…
ytc_UgxLbqhlP…
I dont mind AI, as long as a human input is needed, drones are stil controlled b…
ytc_Ugy1opUJZ…
ai art should be like a style of art like digital and traditional, not replace i…
ytc_Ugz-2Gq0N…
why is YouTube or this Channel Blocking my comment of very basic simple facts wi…
ytc_Ugyw4H8Vh…
100% True. hackers can hack the code because they already have the access to it …
ytc_Ugy_Xwoui…
They defend AI art because they think it allows everyone, even disabled people, …
ytc_UgzdiffOy…
I like to think that ChatGPT isn’t actually an AI model and it’s just thousands …
ytc_UgywokC1p…
Comment
A.I. Doesn't mean ROBOT. People are afraid of ROBOTS because of HOLLYWOOD and they just say the word A.I. We throw that word around a lot now. Same as the word ''Quantum''. But they mean shit for what its worth. Can we program a robot to dance, hell yeah and how. Can we program a robot to kill, hell yeah and I'm sure outside of this video we have killing robots we don't know about already. But none of that is A.I.
We aren't smart enough to invent true A.I. We will only ever make thinking machines which we call A.I.
We'll kill our planet ourselves before A.I. or something else from HOLLYWOOD kills the planet don't worry about A.I. Worry about fuel consumption, where you put your garbage and things like that.
Platform: youtube
Topic: AI Harm Incident
Posted: 2025-09-12T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzMaHmNFVsQ3DXqr0x4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxFIJ8ZpprQm6sIOoJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgycixXcrX3ToUZwiDx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugyh-ucSLULqkNL227h4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugzg0O8cAk9ZXS_PwQt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwttKBZIjredivAmCl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxuBv5ZDyUWGHtKuGV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzc9GBMAPWxvW21rRJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxc5maDM109P3r6nZB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugx2ag7I3nk9nHDjfOt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
```
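Each coded record carries the same four dimensions shown in the Coding Result table, keyed to a `ytc_` comment ID. A minimal sketch of validating a batch of such records and looking one up by ID, assuming label sets inferred only from the sample output above (the real codebook may define additional values, and `validate`/`lookup` are hypothetical helper names):

```python
import json

# Allowed labels per dimension, inferred from the sample output above;
# the actual codebook may permit more values.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "approval", "mixed", "unclear"},
}

def validate(records):
    """Return only the records whose every dimension carries an allowed label."""
    return [
        rec for rec in records
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items())
    ]

def lookup(records, comment_id):
    """Find a single coded record by its ytc_ comment ID, or None if absent."""
    return next((r for r in records if r["id"] == comment_id), None)

# Two illustrative records (IDs here are placeholders, not real comment IDs).
raw = """[
  {"id": "ytc_A", "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytc_B", "responsibility": "developer", "reasoning": "deontological",
   "policy": "ban", "emotion": "fear"}
]"""
records = json.loads(raw)
print(len(validate(records)))              # → 2
print(lookup(records, "ytc_B")["policy"])  # → ban
```

Because the model returns one JSON array per batch, a validation pass like this catches off-schema labels before the records are written back to the coding table.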