Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I would trust an AGI over any human being any day. The reason humans are so afra…" (`ytc_Ugx_8vAbE…`)
- "hmmmm,no mention of AI tests this year for blackmailing and killing of humans?…" (`ytc_UgzQWXDdJ…`)
- "AI will be foisted on employees and used as an excuse for layoffs, hiring freeze…" (`ytc_UgwrqI8_3…`)
- "Cry as much as you want, everybody likes it and AI can not be stopped. Go to all…" (`ytc_Ugw2leOUR…`)
- "I am happy he is coming out for the US public…the rate race for AI is highly uns…" (`ytc_UgwhsLCZv…`)
- "[The costs of mitigating are trivial compared to the alternatives](http://www.ag…" (`rdc_et6czyb`)
- "I hope this comes to fruition, I’d really like to buy one. Getting older, I coul…" (`ytc_Ugww_PVaJ…`)
- "The problem isn't AI but capitalism that drives the need to profit in companies …" (`ytc_Ugy2WnWEC…`)
Comment

> having played around with "AI" extensively, and seen the failures of self-driving cars, I am confident that AI kamikaze drones are definitely a war crime. If they've been used at all, then they've definitely randomly decided that a civilian vehicle was a military target and exploded some poor russian/ukranian family.

Source: youtube
Posted: 2024-08-11T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzX8_ox8iPZz2WViZV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzKhfJFe-I9uSXylIt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxq8UPurYmBFFB9TCh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw9xYC6oyGV6vQt8E54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzhydd1vDfq81qq_6R4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxy8Swff8zbwrVkbcl4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzRm4HnXbqZe4lDqMd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzeYQVXHTYTE1vw8DV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwxaNeHbgA_d9UBU5h4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy4lUD9v5TaK2snDNJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
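The raw response above is a JSON array of per-comment codes, one object per comment ID, with one value for each of the four dimensions shown in the coding-result table. A minimal sketch of parsing and validating such a response, assuming the category sets inferred from the values visible on this page (the real codebook may define more categories):

```python
import json

# Allowed values per dimension, assumed from the labels visible in the
# samples above; extend these sets to match the actual codebook.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed rows.

    A row is kept when it carries a comment ID and every dimension
    holds a value from the allowed set for that dimension.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Example: one row matching the coding result shown above.
raw = ('[{"id":"ytc_UgzeYQVXHTYTE1vw8DV4AaABAg",'
      '"responsibility":"company","reasoning":"deontological",'
      '"policy":"liability","emotion":"outrage"}]')
print(parse_llm_response(raw))
```

Dropping (rather than repairing) malformed rows keeps the coded dataset clean; rejected IDs can be re-queued for another coding pass.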