Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "AI doesn't trigger it already has been triggered. AI is part of the process but …" (`ytc_UgyIOV4B_…`)
- "RTS games give a good visualization of how automated warfare will look in the fu…" (`ytc_UgwND01hu…`)
- "Yes, it would have been great to be able to see the impact of the data centers in the United States wh…" (`ytr_Ugxiz9P9c…`)
- "Yeah, no. Im a network engineer, and i see ai being wrong far more than being co…" (`ytc_UgyXbS41G…`)
- "A man asked AI for health advice and it cooked every brain cell" "I think bromi…" (`ytc_Ugw7ixg8B…`)
- "id take a fetish art over ai pictures any day. even if that fetish art is the gr…" (`ytc_Ugwoj7Ho0…`)
- "Chat GPT gets the answers to online quizzes wrong about half the time. And I am …" (`ytc_UgyCySHdG…`)
- "The first one is AI, it has no music in the background unlike in everyday life. …" (`ytc_Ugw5CI_ZQ…`)
Comment
Why are we not talking about the complete lack of social safety nets AND the fact that you can't hold a computer accountable for accidents. What happens when autonomous vehicles have a brake failure and plow through a crowd of people? A person is held accountable for their equipment failing. Driving is all about decision making, and you can't hold computers responsible in a reasonable enough way. We should be automating hard jobs, but we don't have anything set up to take care of the people. And a caravan function would by definition reduce available jobs still
youtube · AI Jobs · 2025-05-28T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
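The four dimensions in the table appear to come from a small closed codebook. Below is a minimal validation sketch for one coded record; the allowed values are only those observed on this page, so the real codebook may define more (treat `ALLOWED` as an assumption, not the canonical schema):

```python
# Allowed values per dimension, as observed in this page's output only
# (assumption: the actual codebook may contain additional values).
ALLOWED = {
    "responsibility": {"user", "none", "company", "distributed",
                       "government", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "virtue",
                  "contractualist"},
    "policy": {"liability", "ban", "regulate", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with one coded record ([] if clean)."""
    problems = []
    if "id" not in record:
        problems.append("missing id")
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in codebook")
    return problems
```

Running `validate` over a whole batch before accepting it catches both missing fields and values the model invented outside the codebook.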
Raw LLM Response
```json
[
  {"id":"ytc_UgwZUxgZ-6kD-_WVTJx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugyt74tKQBp7cZDWDRp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxdxkqADVptmtoeSux4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxrvIHp0AbwtfFRjvN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx_fECPhp-TSyaIHWB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyS2_tBHis3bkDVcvx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyrIif-3FpQUrB8-714AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyQ2PFTSlLlwGu-m8t4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxMlkOo7yXJSxtazDF4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzVRH5ceq2iZEzlaA14AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
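The page's look-up-by-comment-ID feature can be reproduced offline by parsing the raw batch response and indexing records by ID. A minimal sketch, assuming the response is valid JSON in the array-of-objects shape shown above (the two IDs here are shortened placeholders, not real comment IDs):

```python
import json

# A shortened copy of the batch format shown above (hypothetical IDs).
raw_response = """[
  {"id": "ytc_AAA", "responsibility": "user", "reasoning": "deontological",
   "policy": "liability", "emotion": "fear"},
  {"id": "ytc_BBB", "responsibility": "government", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"}
]"""

# Index by comment ID so any coded comment can be looked up directly.
by_id = {record["id"]: record for record in json.loads(raw_response)}

print(by_id["ytc_AAA"]["emotion"])  # fear
```

Building the index once makes each subsequent lookup O(1), which matters when the same batch is queried repeatedly from the inspection UI.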