Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "There are valid safety questions about AGI, but Russell’s fears emanate from a v…" (`ytc_UgyVCICW1…`)
- "No way I had an ad about an ai generative program during this video 😭✋…" (`ytc_Ugy5-Ll1q…`)
- "Yeah. Imo the absolute MOST AI "art" is good for is inspo for people like me who…" (`ytc_UgzP30XDI…`)
- "I heard that this doesn’t work anymore and that you have to put it through some …" (`ytc_UgwTHACWY…`)
- "It sounds to me like they want to use our voices and products to upload into the…" (`ytc_UgwD_wv8-…`)
- "Machine learning and neural networks and complex systems are intended to fool hu…" (`rdc_du4rapj`)
- "@williamtheconqueror69 If you are so bitter about this "enabling to stupid", the…" (`ytr_UgxIPvleo…`)
- "Most people do not realise the full impact of drones, yes tanks and artillery ar…" (`ytc_UgxnFkVsB…`)
Comment
Another consideration when it comes to AI and automation is the fact that companies now make consumers solve all of their problems themselves. Whether it’s figuring out where your delivery order is or trying to talk to a human about some basic piece of customer service, it always has to go through some sort of artificial intelligence or automated call center where it’s extremely difficult to have any sort of human interaction, and you essentially have to coordinate and orchestrate the process yourself. For some this can be quicker if you have some level of technological competence, but for those who aren’t competent, this drastically slows them down, and even for those who are competent, it is extremely cumbersome in the edge cases. In a way this is really the despecialization of labor, because companies are putting all of the responsibility on the consumers themselves to figure out what has to be done and what all the requirements are to receive basic service.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Harm Incident |
| Posted | 2025-06-04T00:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwMVBjEf-Algg02rIt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyV9-RY_dbgXs8TkKx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyXVCW-XJe3bSTxlqJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwPgf8l11DYcFX1r8d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyMy8LrTlF5I3oaJNJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzOYt3oTrNORoiPJpN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzuqY2VP03azI2ATEF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyEM11NPaSJD9b-ygx4AaABAg","responsibility":"elite","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyLKFuQfLnPUiYTPUN4AaABAg","responsibility":"society","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw1_OSVzm4qB32dF5F4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```