Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytr_UgyZKdHOo…: Even then, "a bad worker blames the tools" or something like that. If anyone ca…
- ytc_UgzO8_HcF…: I'm just an art consumer personally, but with the ai stuff coming out, it's been…
- ytr_UgztTmuTz…: REST IS TRASH, I'TRIED A LOT AIs, BUT THESE 2 STAND OUT, ESPECIALLY OPENAi, GEMI…
- ytc_UgzAoq-F8…: Cogito ergo sum, i think therefore i am. It would be unjust to treat sentient l…
- ytc_UgzTePxJC…: It is good for riders to have two choices to pick for car service. Safety comes…
- rdc_fcrw9le: hopefully it will be used to fast track and optimize diagnostic medicine rathe…
- ytc_UgyG9w4m3…: You really tried to break the bounds there. ChatGPT uses phrases that people use…
- ytc_Ugylrs0hm…: Through marketing, we have been, once again fooled. There is a big difference be…
Comment
Boston Dynamics robots are not for us. They're being invented to defend the rich after everything is automated. Those robots will exist to exterminate all of us when we're no longer needed. The robots can kill without conscience. They'll put down the riots that will happen when unemployment skyrockets, when famine is widespread, when labor organizes. All it takes is face recognition and a mounted gun. And those are the first 2 things they added to those patrolling robot dogs.
youtube · AI Moral Status · 2025-10-04T08:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
  {"id":"ytc_Ugzxm8Sv_Ciz6aKUXL14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx3CFSH63OlSqHzEfJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyC9YsDUbmXWO_4yBl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyY2eXGShou0ZzIlcR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzpc5IYTVIvf3G6bK14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzGgjjA24-O9L0KO9N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwqrvfldDEIYq1fbWB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugz8Z5Bk6beOLqSgCL14AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxal6WWeGlOfcs6HRd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwOuLFZe6b3hVxIImt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
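The raw response above is a JSON array with one object per coded comment, keyed by the four dimensions shown in the Coding Result table. A minimal sketch of how such output might be parsed and validated, assuming the allowed value sets inferred from the responses visible on this page (the real codebook may include values not observed here):

```python
import json

# Value sets inferred from the raw responses shown above; this is an
# assumption, not the tool's confirmed codebook.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and flag any out-of-vocabulary values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
    return rows

raw = ('[{"id":"ytc_UgzGgjjA24-O9L0KO9N4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"outrage"}]')
codes = parse_codes(raw)
print(codes[0]["policy"])  # ban
```

Validating against a fixed vocabulary before storing the codes catches the common failure mode where the model invents a label outside the schema.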