Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytr_UgzSNAWuz… — "Thank you for your comment, @cjsy9486! Elon Musk was indeed onto something with …"
- ytc_UgwqxOUq0… — "I don’t get what the issue is. The person literally has AI in their profile and …"
- ytc_UgzfkFbjG… — "In Sweden, there is a shortage of several thousand truck and bus drivers, absolu…"
- rdc_n0i9xvd — "...you're making software that is designed to give the user whatever it asks for…"
- ytc_UgyA0G8vk… — "Am on a 3 year Bachelor course in programming here in Iceland. Right now I just …"
- ytr_Ugy0udtUg… — "THATSSSS. And also it doesn't need 100% jobs automation or AGI to to render the …"
- ytc_UgwzxQjfj… — "For some reason, I expected this to be horrible but it's actually decent advice.…"
- ytc_UgwC_Xh_P… — "you say us using rockets to get from here to there is just so stupid, but we did…"
Comment
It's really not the robots I'm afraid of so much as the AI software and its ability to sabotage internally. I work for municipal government, so our IT team is top-notch. They ran a firewall simulation last year using AI and without approval it started accessing files it wasn't designed to. It took them 4 hours to force stop the software as it kept rebuffing their attempts to gain control. Since then, the employee handbook strictly prohibits the use of AI software of any kind on city servers/equipment.
youtube · AI Responsibility · 2025-05-08T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzEORRfoIKqaBNz4GJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugya9drMh2RyVx5gNBx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw_Tj-HvjzbGYTioFZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxul9SQY_wHE-ZoK1t4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz06z0XuhwtRD1EQYR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwwhQ6KR4sodjmFqQB4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz91X25f1c3Ms4Nmht4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxO_mq8LCeN9GpAzDp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxJlUXoC7JZV0S1af94AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzAZLuW7G68pYvZP7d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
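A raw batch like this can be validated before it is stored, catching malformed records or off-codebook labels early. Below is a minimal sketch; the allowed values per dimension are assumed from the examples on this page (the real codebook may contain additional categories), and `validate_batch` is a hypothetical helper, not part of this tool:

```python
import json

# Allowed values per coding dimension — assumed from the examples above;
# the actual codebook may define more categories than appear here.
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "company", "government", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the schema."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {value!r}")
    return records

# Example with a single (made-up) record:
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
coded = validate_batch(raw)
print(coded[0]["responsibility"])  # → ai_itself
```

Rejecting the whole batch on the first bad record keeps the stored codes clean; a more forgiving variant could instead collect errors and re-prompt the model for just the failing IDs.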