Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Government always finds a way to make it a problem for us recording them, but th… (`ytc_Ugxb2QB3x…`)
- While I'm not arguing with you about there being way too many laws, and those la… (`rdc_dzebqle`)
- 11:48 No, It cannot replace a human podcaster especially one who is already very… (`ytc_Ugz_JhYB0…`)
- i am soooooooooooooooooooooooo going to write a 200 word essay listen robot righ… (`ytc_UgihOVP7c…`)
- CEOs told me their goal: once 90% is automated and people riot, the world will g… (`ytc_UgxAL3s-T…`)
- It is not real life, because if you replace people with AI, they will not have $… (`ytc_UgzZesioO…`)
- Just make UBI to 50000 dollars a year per person, and have everyone except the f… (`ytc_UgwkG-wBh…`)
- I am graduating soon and planning to work primarily in editing. But even with ed… (`ytc_UgyxLWeMQ…`)
Comment (youtube · AI Harm Incident · 2016-01-22T22:2… · ♥ 1)

> Its a real dumb ethical argument
> If you were in a self driving car, the programming would obviously be set to stop in time for vehicle or object in front of your self driving vehicle to stop in time for an absolute sudden stop. If the stop were to change lanes into your lane, the self driving vehicle would make impact with the object or vehicle.
> End of ethical argument
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugg6IX-uG5XQOngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Uggqx26B0vYlNngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgjwZCpf6uJ5EngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgjQFdEz8fzO-ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UggF86o_OEFCZHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugh-bk-TAV7aFXgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgileDub0CwddngCoAEC","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgjpqrVAg7rgYngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UghQCXhv7515e3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgisOSWSkQ0bTXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
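Looking up a coded comment by ID amounts to parsing a batch response like the one above and indexing its records. A minimal sketch, assuming the five fields shown in the JSON (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `parse_codes` helper and the idea of failing fast on missing keys are illustrative, not part of the tool itself:

```python
import json

# An abridged batch response in the same shape as the raw LLM output above.
raw = """[
{"id":"ytc_Ugg6IX-uG5XQOngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Uggqx26B0vYlNngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]"""

# Fields every record is expected to carry (taken from the dump above).
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(raw_json: str) -> dict:
    """Parse a batch response and index the records by comment ID.

    Raises ValueError when a record lacks one of the expected keys,
    so malformed model output is caught before it is stored.
    """
    by_id = {}
    for rec in json.loads(raw_json):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id', '?')} missing {missing}")
        by_id[rec["id"]] = rec
    return by_id

codes = parse_codes(raw)
print(codes["ytc_Uggqx26B0vYlNngCoAEC"]["emotion"])  # outrage
```

With the records indexed this way, "look up by comment ID" is a single dictionary access, and a truncated or half-generated model response fails loudly instead of silently dropping dimensions.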