Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "The narration and generated video were really fun to watch, but I think the peop…" (ytc_Ugww8kt-I…)
- "10 years ago, no one thought AI would come for blue collar jobs. I guess unless …" (ytc_UgzGWPZOD…)
- "I am seeing far to many comments of non-artists feeling petty about having to do…" (ytr_Ugwfv60Ws…)
- "Newsflash…robotics are already doing the work, just in countries that can pay ch…" (ytc_UgyMrq2f_…)
- "Some day AI will code better than me (not yet!), but that's no reason to stop it…" (rdc_mic8csq)
- "Or maybe because if there were no artists ai would not know what art is since hu…" (ytr_Ugw2EImsl…)
- "Unclear what AI has to do with Gaza. People are always pushing Palestinian propa…" (ytc_UgyL_jw0w…)
- "Should not blame the AI technology. US political systems must adapt and embrac…" (ytc_UgzEQYLVQ…)
Comment
This is such a non-issue!
If you want to have reasonably well-functioning self-driving car algorithms, you'll need to have all vehicles be self-driving and constantly communicating with each other on a certain road. All non-self-driving vehicles need to be on separate roads.
And in that scenario, the AI of all vehicles will act as one entity which will act to minimize harm with all vehicles involved reacting. And minimizing harm means just that. In real-life situations there are no "all things being equal" scenarios where the harm has more than one minimum.
And even if there were, the AI would then choose randomly between the two, eliminating all "moral" dilemmas.
Having the self-driving car AI take into account your life history is absolutely bonkers!
Platform: youtube · Topic: AI Harm Incident · Posted: 2016-02-05T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugg6IX-uG5XQOngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Uggqx26B0vYlNngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgjwZCpf6uJ5EngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgjQFdEz8fzO-ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UggF86o_OEFCZHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugh-bk-TAV7aFXgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgileDub0CwddngCoAEC","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgjpqrVAg7rgYngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UghQCXhv7515e3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgisOSWSkQ0bTXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
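Each raw response is a JSON array with one record per comment, covering the four coded dimensions shown in the result table. A minimal sketch of how such a batch might be parsed and validated in Python — note that the allowed value sets below are inferred from the samples on this page, not from the project's actual codebook:

```python
import json

# Allowed values per dimension, inferred from the visible samples.
# The real codebook for this tool may include additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "contractualist", "mixed"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"indifference", "outrage", "approval", "mixed", "resignation"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response into {comment_id: codes}, rejecting unknown values."""
    coded = {}
    for rec in json.loads(raw):
        codes = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
        coded[rec["id"]] = codes
    return coded

# Hypothetical single-record batch for illustration.
raw = ('[{"id":"ytc_x","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
batch = parse_batch(raw)
print(batch["ytc_x"]["responsibility"])  # → ai_itself
```

Validating against a fixed value set catches the most common LLM-coding failure mode, where the model invents a label outside the codebook; such records can then be flagged for re-coding rather than silently stored.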