Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- Why would you tell them HOW to get to AGI? Who needs AI? Until humanity evolves … (ytc_UgwDBn6JO…)
- Robot: 'Human cant create smarter robot then human brain" / Human: "Why" / Robot: "I… (ytc_Ugweb0APd…)
- I didn't ok this...but i guess im paying for it through my electric bill. I dont… (ytc_UgxlqJQ9a…)
- 1:03:57 Will Machines Have Feelings? Geoffrey's AI interoception postulates it… (ytc_Ugw4eRshe…)
- I can see a scenario where AI determines it needs power and it redirects all ele… (ytc_UgxNWIOit…)
- @kitataki2296 oh no I had a good listen to the whole thing. And something needs t… (ytr_UgyWzdbcn…)
- Cosmic robotics for making space station in the mars and also for instaalling an… (ytc_UgwCtFFqR…)
- More AI is great videos while we’re on the verge of massive stock bubble burstin… (ytc_UgzPpNqUR…)
Comment (youtube, 2026-03-10T17:2…):

> But can you really say that ChatGDP is morally responsible for its passivity in the Trolley problem? Isn't it the makers' of the AI who are morally responsible, since they are the ones who are in control of the AI's actions or inactions?
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugygk3yyG4UBavktzBN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugx8hGqvTXH4SCdNeyB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugza2BgArsDvnRk0F354AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx1a6URFwicFVDdBax4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyuAi7s3i5M1_ho2gp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyBbo692bv6UhOJPHl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyeexFILZ_JGgtijER4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzFeyv9pwE0NmpfcwN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxWpywBpAR57q23Ukl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwNl-55Uuk4x6J7qvd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"}
]
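The batch response above can be parsed and sanity-checked in a few lines. Below is a minimal sketch: the allowed value sets are inferred only from this one sample response and are assumptions, not a documented schema.

```python
import json

# Allowed values per dimension, inferred from the sample batch above.
# These sets are an assumption; the real codebook may define more values.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"liability", "regulate", "none"},
    "emotion": {"mixed", "outrage", "approval", "fear"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM batch response (a JSON array of records) and
    index the records by comment ID, rejecting unexpected values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec[dim]!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Example using the first record from the response above.
raw = ('[{"id":"ytc_Ugygk3yyG4UBavktzBN4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"mixed"}]')
codes = parse_codes(raw)
print(codes["ytc_Ugygk3yyG4UBavktzBN4AaABAg"]["policy"])  # prints: liability
```

Indexing by ID this way also makes the "look up by comment ID" view above a dictionary access rather than a scan over the array.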