Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The lack of eye movement in the first one makes me think of knockoff movies with…" (ytc_UgwVFXjWv…)
- "By the time you come to the conclusion that ai needs turning off it will be too …" (ytc_UgzGZweLj…)
- "Ok. I think that the Self Driving car should be programmed not to get its self i…" (ytc_Ugis_iWcr…)
- "As though corporations need MORE profits. Greedy capitalists are going to get us…" (ytc_UgzU4JZuo…)
- "I want to see AI take over all of the scuba divemaster electrician's jobs 🤣🤣…" (ytc_UgzskPSSS…)
- "People: Ai is bad because it could never do what humans can. Same people: lets …" (ytc_UgyDhl4sT…)
- "I do get the vibe when artists talk about this that somehow their jobs are the o…" (ytc_UgwYfpaH1…)
- "This old gentleman seems smart and quite nice, but he is dead wrong about some t…" (ytc_UgzFWCS71…)
Comment
For me, right now is a matter of power, as Elon musk said, if he doesn't do it someone else will, this stupid excuse is sadly the same everyone else have. An with thechnology is nothing new, competition will drive its development, not its regulation until some tragedy, but the consequences of AI, I think are much more subtle right now, so regulation is not really incentivized
youtube · AI Harm Incident · 2025-09-11T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyssvt_SsowG0Jyup94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy3HOkZ5ptP_KNsnFV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy7kIenXrL9WgdFsEt4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwVeECo2XCjgv6Y-fZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy9dYXjtUhOT5sfDfx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgzOETw8f9WPniYu3FV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwkxxcAkmc_LuO3Pnd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzZHF9kT0cn6pdtbZR4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugw_OABAnkYuo_S-Cgt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzDFiKQ_2-OI4z3SGJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
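The "look up by comment ID" step can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: it assumes the raw LLM response is a JSON array like the one above (one object per coded comment) and indexes it by `id`. The two rows in `raw_response` are copied from the batch above for brevity.

```python
import json

# Raw LLM batch response, assumed to be a JSON array of per-comment codings
# (two rows copied from the batch above as a sample).
raw_response = '''
[
  {"id":"ytc_Ugy9dYXjtUhOT5sfDfx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgzOETw8f9WPniYu3FV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
'''

# Index the batch by comment ID so any coded comment can be inspected directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Looking up the comment shown above recovers its Coding Result row.
coding = codings["ytc_Ugy9dYXjtUhOT5sfDfx4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # company resignation
```

A dict keyed by `id` makes each lookup O(1), which matters when inspecting individual comments out of a large coded batch.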