Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgzWp6UMS…: "I think AI will either do good in really niche industries and research or run sc…"
- ytr_Ugzdi8aP_…: "in what benefit does generative AI even give in the art industry? why compete wi…"
- rdc_l5az0sb: "We've embraced it at my company. We both deploy our own chat interface and AI mo…"
- ytr_UgzCRNufc…: "No, it would not be good decision. You all cheer for having management replaced …"
- ytc_UgxlunewT…: "Didn't think that would've been avoided in a regular car. A person in dark cloth…"
- ytc_Ugz6nw63J…: "AI thinks they can rule us huh boyz get the nukes and slippers and tha belts…"
- ytc_UgxPIOza9…: "I'll spare you four hours if you came for a deep debate on ai risk. One person s…"
- ytc_UgwJIxQAz…: "Steven No one is screaming enough like you, I see other videos but your nailing …"
Comment
These driverless trucks are not sentient beings. They are automated special purpose silicon based life forms with zero actual intelligence or humanity. They don't have personalities. They don't have feelings. They don't care if they run someone over and kill them. They follow a path with software which tells them how to avoid hitting things, violating traffic signs and signals, and to follow a route. They couldn't care less if a terrorist programmed them to run people over or deliver a bomb. Just imagine the damage done by a fleet of trucks taken over by terrorists or criminals seeking ransom or worse. Computerized vehicles can and have made mistakes, and when they do, who do we blame, a nameless, faceless giant corporation or private equity company. When the computers become dedicated sentient beings who actually value life, then we can talk about having autonomous self driving vehicles.
youtube · AI Jobs · 2025-05-31T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwmgL5emcvEFBvC0QB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyRe0G8P7yFNkJsBtd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyirvOvTRVPkQQw3_t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzUCrBcrMt_6MUmuip4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyw3JokkxouKXevDsF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx_OFo6ibQnOYdzMhZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzFWyol0Ud9WnWSROF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwOSSVOy5XEGKzmBRd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgygyAt2KcXchjRbR7J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz48na8nvmFyTOTzkR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
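A response like the one above can be parsed and sanity-checked before the labels are stored. The sketch below is a minimal example, assuming the allowed values per dimension are exactly those seen in this dump; the real codebook may define additional labels, and the function name `parse_coding` is hypothetical.

```python
import json

# Allowed labels per dimension, inferred from the values visible in this
# dump (an assumption -- the actual codebook may permit more).
ALLOWED = {
    "responsibility": {"ai_itself", "user", "company", "government",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "industry_self", "none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "resignation",
                "indifference", "mixed"},
}

def parse_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and keep only rows whose
    labels all fall inside the allowed sets."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items())
    ]
```

Rows with an out-of-vocabulary label (a common LLM failure mode) are silently dropped here; a production pipeline would more likely log them for re-coding.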