Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- Google Deep Dream was an early AI, whose entire purpose was to try to identify f… (`ytr_Ugyds9jNB…`)
- Love the channel! Having written over 500 songs for 21% of the worlds nations, I… (`ytc_UgzgZpWNW…`)
- nothing pisses me off as a disabled artist more than people saying that AI is fo… (`ytc_UgwD5Drex…`)
- Sending my support from Mexico where the same greedy companies are taking advant… (`ytc_UgxFnYD8B…`)
- "Making AI aligned with American goals and interests." Good grief! Who wants a w… (`ytc_UgyRhtIij…`)
- I have something to say about whether resources will be distributed well to peop… (`ytc_UgxDcYL0U…`)
- One of today's most prolific philosophers having a conversation with an early ve… (`ytc_Ugx-v1WxH…`)
- Yep we're doomed. Ai will take over it's has already. people can't live without … (`ytc_Ugxh-IlA7…`)
Comment
As much as I hate the idea of this kind of automation, it's also useless to try to push against it. If one company doesn't do it, another one will. You can't really ban this kind of innovation. What are folks expecting anyway? That companies are forced to use human drivers even if they don't need them? Though they could have more stricter fines against accidents caused by automated trucks for better deterrence from companies going YOLO.
youtube · AI Jobs · 2025-05-30T04:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxOcLRVY7YAfv6UhtV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxzcaqoYVBliswWW-94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgyaaRAn8L4U7KJdE6N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyDekBTwPcxVCChEQZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw_dAPvVuGA0hpRgCh4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxG9jz9_vSLxeMGrpV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzmx-aUf6zXwSZTSTN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugxele4rskjmLR9yZ8l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxf11Og-2swOnFTj-d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxgAe2pt1ywUbHaTTV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
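The inspection step above can be sketched in a few lines: parse the raw model output as a JSON array and pull out the record for one comment ID. This is a minimal illustration assuming the output is valid JSON in the shape shown; `lookup` and `raw_response` are illustrative names, not part of the actual pipeline.

```python
import json

# Raw LLM response: a JSON array of coded records, one per comment,
# with the dimensions shown in the coding result above
# (responsibility, reasoning, policy, emotion). IDs here are made up.
raw_response = """
[
  {"id": "ytc_example_one", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability",
   "emotion": "resignation"},
  {"id": "ytc_example_two", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "ban",
   "emotion": "outrage"}
]
"""

def lookup(raw: str, comment_id: str):
    """Parse the raw model output and return the record for one comment ID."""
    records = json.loads(raw)
    # Linear scan is fine at this scale; build a dict keyed by id for bulk use.
    return next((r for r in records if r["id"] == comment_id), None)

record = lookup(raw_response, "ytc_example_one")
print(record["policy"])  # liability
```

A real pipeline would also guard against malformed model output (e.g. catch `json.JSONDecodeError` and re-prompt), since nothing forces the model to emit valid JSON.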