Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "If the guy got 19 reminders to put his hands on the wheel, that's his fault. Aut…" (ytc_Ugy6-D02l…)
- "Well, there is that too. AI just piles on more to worry about. Like we need a bi…" (ytr_UgxorcJT8…)
- "Government will use AI as an excuse to take all your freedoms away. And you dese…" (ytc_UgzYISckt…)
- "In my opinion, i'd give robot rights to robots that function beyond simple repet…" (ytc_Ugiho-tsc…)
- "I mean as long as you acknowledge the fact that: "I didn't make this myself, I j…" (ytc_UgwSAx-wJ…)
- "@q.t.gamingfamilyif you got a local open source one what don't have built in mo…" (ytr_Ugw2Joaz8…)
- "Fighting a robot is already cheating rheres a reason people wear wraps and glove…" (ytc_Ugwysxw_m…)
- ""AI Slop" doesn't mean "I don't like this because it was made by AI." It means "…" (ytc_UgyvPxyK2…)
Comment

> This is just what we DON'T need. There's to much tech in our lives already. People are to dependent on machines already, to the point they can't even drive anymore! Most people don't really think for themselves either, so next we have machines telling them what to think. Yes at first people (total strangers) will be programming the machines, then the AI will be. It's just bad

Source: youtube · AI Moral Status · 2022-12-30T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
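The table above reports one value per coding dimension. A minimal sketch of validating such a coding against the scheme, using only the category values that actually appear in this sample (the real scheme may define more categories; `SCHEME` and `validate` are hypothetical names, not part of the tool):

```python
# Allowed values per dimension, as observed in this sample only;
# the full coding scheme may include additional categories.
SCHEME = {
    "responsibility": {"distributed", "ai_itself", "developer", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "ban", "none"},
    "emotion": {"outrage", "fear", "indifference", "mixed"},
}

def validate(entry: dict) -> list:
    """Return the dimensions whose value is missing or not in the scheme."""
    return [dim for dim, allowed in SCHEME.items()
            if entry.get(dim) not in allowed]

# The coding shown in the table above passes validation.
coding = {"responsibility": "distributed", "reasoning": "deontological",
          "policy": "regulate", "emotion": "outrage"}
print(validate(coding))  # prints []
```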
Raw LLM Response

```json
[{"id":"ytc_UgyTi0xulDs7Et12jsp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwnGcgHXUO3cKPMc9R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx7WX2pqtaFHfD4qMl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzyh4UqIERoGct97D14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyE2UnqMlwciHlZ6kh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxGImMSB2xmePRNm1N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgySseCeTCuKwhGMgFx4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxUgZn2tGHF8OLH0hl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwUloQ_RV6LGxzsiZx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyiyA7fkpVhIOK02Fp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}]
```
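The raw response is a JSON array with one object per coded comment. A minimal sketch of parsing it and indexing by comment ID, roughly what an ID lookup over this output could look like (the two entries below are copied verbatim from the response above; `by_id` is a hypothetical name):

```python
import json

# Raw model output: a JSON array, one coding object per comment,
# with the field names shown in the response above.
raw = '''[
{"id":"ytc_UgySseCeTCuKwhGMgFx4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyiyA7fkpVhIOK02Fp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]'''

codes = json.loads(raw)

# Index by comment ID so any coded comment can be looked up directly.
by_id = {entry["id"]: entry for entry in codes}

entry = by_id["ytc_UgySseCeTCuKwhGMgFx4AaABAg"]
print(entry["responsibility"], entry["policy"])  # prints: distributed regulate
```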