Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "AI could take over the world in less than a second, and the people making it are…" (ytc_Ugx80adyy…)
- "This is why we need AI bubble to burst sooner rather than later, as the longer i…" (ytr_Ugx3RUL4o…)
- "Ai is just a program fetched by humans. I think we can control creating and usin…" (ytc_Ugx4P2_Rq…)
- "I wouldn't a trust a sentient robot chef it can do evil stuff to my food. I woul…" (ytc_UgwLZTh1-…)
- "This dude has been worried even at the start of his career. AI Owners : People …" (ytc_UgzwacKN2…)
- "If he’s lacking skill and charisma, then why do so many people watch him? I thin…" (ytr_Ugw8tsxCp…)
- "Lol reminds me of that 1 scene in the movie Bicentennial Man. Who else saw the m…" (ytc_Ugw2pcxd_…)
- "ok i agree with all your points and im not an AI advocate. But i do think the di…" (ytc_UgxnBrXDs…)
Comment
> I work professionally as a programmer, that is my job. And I want my car to be so little automated as possible. I don't want it to break when I am about to hit something even, for that day when I actually don't want it to. I hate that today we so often have to fight our devices that tries to prohibit what the user wants. I would say Mac is the worst offender but Windows is getting there as well.
> I fear the day where driving will be illegal and only self driving ai cars will be allowed, it will probably come to that though..

youtube · AI Harm Incident · 2022-11-20T20:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxWEUMaUEDYyGIjoQ94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxdK8TU5Di3ES4ElLZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxAmzlCFmrZmxMHRI94AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxQ3BRizlPG8DSnvyx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzefO_F95ohs0NlSdp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"disapproval"},
  {"id":"ytc_UgzG4obmJAZGFm4HiGR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwFHp8SUlQu3Uj7rex4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwB5QsAllfNxW11pJN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgxM8a7Nd1Qit5aSspR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxhzhse5PJvVG9QIF54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
```
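A raw batch like the one above has to be parsed and checked before the codes land in the dataset. The sketch below shows one way to do that in Python: it indexes the records by comment ID and rejects any value outside the codebook. The allowed value sets are inferred only from the sample responses shown here; the real codebook may contain additional categories, and `parse_batch` is a hypothetical helper name, not part of the pipeline.

```python
import json

# Allowed values per dimension, inferred from the sample batch above;
# the actual codebook may define more categories (assumption).
ALLOWED = {
    "responsibility": {"company", "developer", "government", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none", "unclear"},
    "emotion": {"approval", "disapproval", "outrage", "resignation", "indifference", "mixed", "unclear"},
}


def parse_batch(raw: str) -> dict[str, dict[str, str]]:
    """Parse one raw LLM response and index the codes by comment ID.

    Raises ValueError on malformed records or out-of-codebook values,
    so a bad batch fails loudly instead of silently polluting the data.
    """
    records = json.loads(raw)
    coded: dict[str, dict[str, str]] = {}
    for rec in records:
        cid = rec.get("id")
        # Sample IDs all start with ytc_ (comment) or ytr_ (reply).
        if not cid or not cid.startswith(("ytc_", "ytr_")):
            raise ValueError(f"missing or unexpected comment id: {cid!r}")
        codes = {}
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: {dim}={value!r} not in codebook")
            codes[dim] = value
        coded[cid] = codes
    return coded
```

Failing loudly on an unknown value is deliberate: LLM coders occasionally invent labels, and it is easier to re-prompt one batch than to clean stray categories out of the results later.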