Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- “Bernie’s suggestions for corporate restructuring is not going to happen. Just lo…” (`ytc_Ugyosi2lF…`)
- “AI will never replace human artists I don't get what the fuss is about. Or at th…” (`ytc_Ugz7UBA7u…`)
- “In 20 years we’ll hear a repeat of the “COBOL Story”, where the now-retiring poo…” (`rdc_oaf07vz`)
- “i don't believe robots/synths are human but they are definitely people, if a rob…” (`ytr_Uggiw8mjx…`)
- “IT’s always funny when people warn people about something they say “that’s what …” (`ytc_Ugw8Hwo7E…`)
- “All jobs that can be done online in danger , lawyer , doctors , teachers manager…” (`ytc_UgxLt73wo…`)
- “@MultiJogh I see a lot of tech guys ridiculing the guy as well as trying to expl…” (`ytr_Ugyzh29lt…`)
- “1 of the few times I agree with a judge. Analogy Making money is legal but we ha…” (`ytc_UgyJ5a1-l…`)
Comment
"Who should be making the decisions anyhow? Programmers (software developers), companies, Governments?" How about scientist. And more then likely a vehicle will look at the situation on hand crunch some numbers and decide what's best for everyone in the situation, and for the example of being boxed in. Remember this if any of you paid attention in drivers ed. Remember the rule of 1 car length per 10 mph. These self driving cars will be designed to follow the laws of the road. This scenario is VERY unlikely. Though very creative thumbs up.
Platform: youtube · Incident type: AI Harm Incident · Posted: 2015-12-19T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgicJ8o6vgL9vHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjgjA3QBACveXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgiIRvaFLRy4BXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugi6wxkU3JS5u3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgjSjaD1amn_NHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ughy05zsMvO4YHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgjFM6BROUj5UHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UghkEkbZMbCpeXgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UggNzTObvFdx33gCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugg5W6YbwRYNMHgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"}
]
```
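The lookup-by-ID feature above amounts to parsing a batch response like this one and indexing the per-comment codes by their `id` field. A minimal sketch of that in Python, using the standard `json` module (the `index_codes` helper and the two sample entries are illustrative, not the tool's actual implementation):

```python
import json

# Two entries copied from the raw response above; a real batch has one
# entry per comment submitted for coding.
raw_response = """
[
  {"id": "ytc_UgicJ8o6vgL9vHgCoAEC", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugi6wxkU3JS5u3gCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"}
]
"""

# The four coding dimensions shown in the "Coding Result" table.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def index_codes(raw: str) -> dict:
    """Parse a raw batch response and index codes by comment ID,
    skipping malformed entries that lack an ID or any dimension."""
    out = {}
    for entry in json.loads(raw):
        if "id" in entry and DIMENSIONS <= entry.keys():
            out[entry["id"]] = {k: entry[k] for k in DIMENSIONS}
    return out

codes = index_codes(raw_response)
print(codes["ytc_Ugi6wxkU3JS5u3gCoAEC"]["policy"])  # industry_self
```

Validating that every entry carries all four dimensions before indexing is worthwhile here, since LLM batch output occasionally drops or misnames a field.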