Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “I think people think that AI is going to take jobs on mass but that's not how it…” (ytc_Ugz--JZr0…)
- “I’m dubious. I’m pretty up to date with AI progress, and there’s no current conc…” (ytc_Ugw03r_Uq…)
- “And yet everyone is using A.I. for everything they can think of, not looking at …” (ytc_UgyRV2U_n…)
- “No REAL people want this AI S*** which is threatening the giant bubble they have…” (ytc_Ugx631_9T…)
- “As a Software Engineer, I will tell you this: AI doesn't think. It's just a bunc…” (ytc_UgyzAkfq4…)
- “As a software engineer working with automation I can comfortably say that any kn…” (ytc_UgwljaOiH…)
- “Only 26 minutes in, so pardon me (and ignore this) if the questions are answered…” (ytc_Ugwr2PqsG…)
- “> have AI code me whatever software I need / You have a seriously over-estimat…” (rdc_oh5972e)
Comment

> There is no way to even think about destroying anything unless it’s been programmed by somebody especially robots but really in my opinion the person programmed that robot it’s not literally human. It’s like wanting to throw a brick but using somebody or something to do the dirty work and hide be hind it but the real destroyers is in who ever programmed it

youtube · AI Moral Status · 2022-04-07T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwHwTX7vJvGa2xTWpN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw6LTEr25_dEL2m6Z54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxEnZaS796eP6s4lmh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgxgSowFh6-uG_xS_Bt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxR5bqu3MfgVwvYI_N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwtj3W2OUuvYdOpCpt4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"fear"},
{"id":"ytc_UgyR_nGHA-aEJcoPHI54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxQCyoUZJ26eRm-z_x4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyH50DsSJewYNuXquZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxl0b96pX6OyZTLykB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"}
]
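Each record in the raw response carries the same four dimensions shown in the Coding Result table. A minimal sketch of how such a batch could be parsed and indexed for the comment-ID lookup above: the function name and the allowed value sets are assumptions inferred only from the values visible in this sample, not the full codebook.

```python
import json

# Allowed codes per dimension, inferred from the sample response above;
# the actual codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "none"},
    "policy": {"ban", "liability", "none"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    and index the valid records by comment ID."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        # Drop records whose codes fall outside the known value sets,
        # so malformed model output never reaches the lookup.
        if not all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            continue
        by_id[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return by_id
```

With the batch indexed this way, the "Look up by comment ID" view is a plain dictionary access, e.g. `by_id["ytc_UgwHwTX7vJvGa2xTWpN4AaABAg"]["emotion"]`.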