Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "I think we as humans should install a chip or something to every AI we create. I…" (ytc_UgjGaC_Pe…)
- "honestly shocked that you can't see how easily delivery jobs will be automated. …" (ytc_Ugj6ln9bC…)
- "The more I hear about this story, the less the problem is the facial recognition…" (ytc_UgwX3IohJ…)
- "Another absolutely hilarious thing abt the ai bros is that so many companies hav…" (ytc_UgwJ0bdVI…)
- "It already is. I have no problem with AI art until a corporation starts to use i…" (ytr_UgwY64jFh…)
- "I've never heard that argument before now, but even now it makes me mad. I went …" (ytr_UgxVlTC7_…)
- "it aint AC, Artificial Consciousness, it`s AI, artificial intelligence, having t…" (ytr_UgwIL4Iwr…)
- "People can turn my art into ai generated stuff. But when the results are dull an…" (ytc_UgxFDxsAk…)
Comment
If I can’t get a job, I can’t buy the product. And I’ll say this, not everyone is going to be in the robot repair business. So if they really want to replace humans, go for it, and they can lose sales and pay maintenance. And any business I hear replacing humans with ai will be boycotted.
Source: youtube · Topic: AI Harm Incident · Posted: 2024-09-18T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzyHUnRWbaBxSEJ-Sx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyDWJ5UK7fRKq8074d4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw95eTiPdNefzfkfCt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxXHsyx26zduw-J1-Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugy_E-4OIF3fF-Mx8yJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwnHSbIYw1nD66znZB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxlXCEyrnhtStgRYPl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwRbvChEzhEvxUtF_p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz2ia_OeyeLPCyor6N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw3m8WYO3LQOWoUiBB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
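The raw response above is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of how such a payload could be parsed and indexed for lookup by comment ID is below; the `ALLOWED` vocabularies are inferred from the values visible in this sample, not from an official codebook, and the two-entry payload is abbreviated from the response above.

```python
import json

# Abbreviated raw model output (two entries from the response shown above).
raw = """
[
  {"id":"ytc_UgzyHUnRWbaBxSEJ-Sx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyDWJ5UK7fRKq8074d4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"outrage"}
]
"""

# Per-dimension vocabularies, inferred from the sample responses (an assumption,
# not the project's canonical coding scheme).
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "fear", "mixed", "resignation", "approval"},
}

def index_codings(payload: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID,
    dropping rows whose values fall outside the allowed vocabularies."""
    by_id = {}
    for row in json.loads(payload):
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[row["id"]] = row
    return by_id

codings = index_codings(raw)
print(codings["ytc_UgyDWJ5UK7fRKq8074d4AaABAg"]["emotion"])  # outrage
```

Indexing by ID this way mirrors the "Look up by comment ID" feature of the page: the displayed Coding Result table for a comment is just the row retrieved from this mapping.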