Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- “Not really, you haven’t use ChaT if you think so. If you build a house with only…” (ytc_UgzZwEwhg…)
- “Its fake. The robot moves way to quick and fluid and it doesnt actually look lik…” (ytc_UgzmCg8Bh…)
- “@Abbysal_creatureAll these arts, comes from original idea generated by AI. And …” (ytr_UgwZ980xB…)
- “This is some combination of hyped nonsense, foolish engineering, and an intervie…” (ytc_UgxYL0EJg…)
- “we all know what it's coming but fear makes people ignore it, avoid it, which it…” (ytc_UgwT8te-6…)
- “These would be awesome for drivers / They can sleep while the truck drives / For l…” (ytc_UgwdMYwhx…)
- “An interesting follow-up would be to see what an AI that is trained on one perso…” (rdc_f515nme)
- “2:18 "it's not necessarily on your best interests" what do you mean? It's litera…” (ytc_UgxQwmUOt…)
Comment
Sure, there is automation involved, but the mistaken or careless kills are human error. I suppose an exception is a missile that misfires, but I haven't heard a claim that that is a significant cause of civilian kills. It could be, but that is not was is reported. Not that the military would want that to get out if it were true. But I doubt it is true. And thank you for not calling me fucking stupid, moron, etc. . If we really support this site, we shouldn't dump our garbage here.
Source: youtube, posted 2012-11-24T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzLgHGb32Jea5OFKqF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz7amYdQiGwJljo7894AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx9bcUTIcCBDEsZBRJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz7BqMhhGpcp_rNO8x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy9Ggwv4XWRx4nlf0d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzoWwuDpN-21OfQ0U94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzxLhw3d4Cfjm14qZd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxYTAGDrDX2UaHFL414AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw_MFN3II2ieBv1g3l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxiAhLWl3p99-PEriV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
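A raw response like the one above can be turned into a lookup table keyed by comment ID. The following is a minimal sketch, not the tool's actual code: the allowed value sets are inferred only from the codes visible on this page (the real codebook may define more categories), and `parse_codings` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension. NOTE: these sets are an assumption,
# reconstructed from the codes that appear in this page's sample output.
ALLOWED = {
    "responsibility": {"user", "none", "ai_itself", "government"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "outrage", "indifference", "approval", "resignation"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding objects) into a
    dict keyed by comment ID, validating each dimension on the way in."""
    codings = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {row.get(dim)!r}")
        codings[cid] = {dim: row[dim] for dim in ALLOWED}
    return codings

# Example usage with a hypothetical comment ID:
raw = ('[{"id":"ytc_abc","responsibility":"user","reasoning":"deontological",'
       '"policy":"none","emotion":"resignation"}]')
codings = parse_codings(raw)
print(codings["ytc_abc"]["emotion"])  # resignation
```

Validating against a fixed codebook at parse time catches the common failure mode of the model inventing an off-codebook label, rather than letting it propagate silently into the coded results.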