Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by its comment ID.
Comment

> There is couples of problem is refuel and thief. We seen some people are bad can just open the cargo and sell items if AI drive right? There no way to stop them right? Another problem is hijacking and hijack control to kill people. So security will lacking. As well in case of construction, accident in the hallway and rail road. Then what if they there construction stop the AI driver and get hijacked? What if someone sabotage and kill people with the AI truck we seen some people can hijack the car and use as an weapon.

youtube · AI Jobs · 2025-09-30T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw8PYgPrzAQXw2E_nF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyvYYoRaPwIPq5xYLd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgywRtertWOWKkUzg_t4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzA_Wg_UZSKnGd3r814AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwSFlgO81nkZsfNpCx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyeGPCzU84LsLOBd9t4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx2RsvSLCe0XqueKh94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwGL0l0_Dzg_u51kut4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzNtXZUFjoSyDe2r5x4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwrkvMgB7G_E2jxikt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
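Before a raw response like the one above is stored, it helps to verify that every row parses and uses only labels from the codebook. A minimal sketch follows; the allowed label sets are inferred solely from the values visible in this batch (the real codebook may contain more), and the `validate` helper is an assumption:

```python
import json

# Allowed labels per dimension, inferred from this sample batch only;
# the actual codebook may define additional values.
ALLOWED = {
    "responsibility": {"none", "unclear", "company", "ai_itself",
                       "government", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"none", "unclear", "regulate", "liability", "industry_self"},
    "emotion": {"approval", "indifference", "outrage", "fear", "resignation"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or off-codebook rows."""
    rows = json.loads(raw)
    for row in rows:
        # IDs in this dataset start with ytc_ (comments) or ytr_ (replies).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: {dim}={row.get(dim)!r}")
    return rows

raw = ('[{"id":"ytc_UgwSFlgO81nkZsfNpCx4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
print(len(validate(raw)))  # 1
```

A row that invents a label outside `ALLOWED` (or drops a dimension) raises a `ValueError` naming the offending comment ID, which makes bad batches easy to triage.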