Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Autopilot is an ADAS system the same as on all other cars and the driver is resp… (ytc_Ugyckxf_u…)
- A.I. is not one singular thing. There are lots and lots of different AI models b… (ytc_UgxtLC521…)
- the thing is AI is supposed to be a HELPER not somthing that does it for you. li… (ytc_UgyduutWI…)
- Hope you fry in hell there is no reason for robot people.take your program and s… (ytc_UgyQnLMvt…)
- This is why robots shouldn't take over jobs done by humans. What's next? A robot… (ytc_UgxfuFMOM…)
- The problem here is a researcher propagating the media's darling term "AI" for t… (ytc_Ugxw5hm9m…)
- if students want to learn what they learn in school, they dont need to learn it … (ytc_UgwiTe6cm…)
- I've always been polite and friendly to AI. That's pretty logical. It learns fro… (ytc_Ugz_Pn2Q4…)
Comment
AI is "thinking"? Agents are looking for "a reason to survive" when prompted? It "hides" its own power?
Huge props and kudos for the stellar first part with the very good explanations, but you cannot make those statements and expect us that work with the tech to take the rest of the talk seriously.
youtube · AI Moral Status · 2026-03-02T00:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_Ugy_Zi6e446z8ZwDbMd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwymEOgeiqlXTiobhx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwlKKdM9__HyX5L9O54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwgc6XchNCeUkOtR0R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzA_iH6Cc417sW133x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzscWfQR7ZfIPuD2zF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyrxo3Yl8kUbsYG4Bt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyR-ev6jgBcapI0sfZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwuSqk0bViyGQoH9j54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxaqvLZtyDjo5EmdRR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}]
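The response above is a JSON array of per-comment coding objects keyed by comment ID, with four dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch response could be indexed and looked up, assuming that schema and a fall-back of `"unclear"` when the model omits an ID (consistent with the all-`unclear` row in the table above, though the tool's actual behavior is not shown here). The function names are illustrative, not part of the tool:

```python
import json

# Coding dimensions and fall-back value, inferred from the sample output above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")
FALLBACK = "unclear"

def index_codings(raw: str) -> dict:
    """Parse a batch LLM response (a JSON array of coding objects)
    into a lookup table keyed by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

def coding_for(table: dict, comment_id: str) -> dict:
    """Return the coded dimensions for one comment, falling back to
    'unclear' for any comment ID the model omitted from its response."""
    row = table.get(comment_id, {})
    return {dim: row.get(dim, FALLBACK) for dim in DIMENSIONS}

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"virtue","policy":"none","emotion":"outrage"}]')
table = index_codings(raw)
print(coding_for(table, "ytc_example")["responsibility"])  # company
print(coding_for(table, "ytc_missing")["emotion"])         # unclear
```

Note that a malformed response (e.g. a stray `)` where the closing `]` belongs, as in the original output above) would raise `json.JSONDecodeError` in `index_codings`, so a real pipeline would want to catch that and mark the whole batch unclear.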