Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
States that ban the use of AI will end up losing nothing. Even businesses will n…
ytc_Ugx-KQ7cT…
I like the show but I cannot bear listening to all this nonsense about AI person…
ytc_Ugxa0616u…
if they are coming out of the closet with where AI is at now, then it is already…
ytc_UgyJiGFHU…
Bro really said I’m incapable of being self aware because I am AI which means it…
ytc_Ugw7tc4MV…
Terminator coming you will see. We think we can control it but IA will trick t…
ytc_UgxHRy-qR…
A majority of humans are afraid of control being taken over by ET ' s but humans…
ytc_UgzwZijhw…
The number 2 clip is AI generated with sora
Because:- that women in behind clip …
ytc_UgywhBkPI…
@uyrgdjcxvxnbvxcbvjkerghytefdghBecause of your deep-seated hatred for AI and ev…
ytr_UgzpjN-1Q…
Comment
If we consider the 3 laws of robotics proposed by Asimov in 1942, injuring the human being on the motorbike is out, risking injury to uninvolved humans in the SUV (and other collateral damages to occupants farther afield) also contradicts the first law. Braking and inflicting large damage to itself contradicts the third law, therefore the AI must always leave enough space to the vehicle ahead to minimize damage to itself. This makes the whole situation irrelevant in the first place. If the collision were to be unavoidable, the third option would rule.
Now, let's study a collision against a pedestrian...!
youtube
AI Harm Incident
2018-04-12T09:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzmTWvkx_16kZxRipB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwU3JKTd7DXea630ch4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy5Al8HpdWeQXVA14V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzR-I1zg7Fd24xkO2B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyTRsoOWGCfnwZIeM14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyLGAxBlRBrU6Q9HzF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzGRQqzWI3R8ooBq5t4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzTIXhunuFErqKh5k54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgypFlEyseWL_KumX5F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMlxjfJsg85RHvnPZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
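The raw batch response above is a JSON array in which each record carries the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). Before accepting such output, it is worth validating each record against the codebook. The sketch below is a minimal, hypothetical validator; the allowed values are assumed from those visible on this page, and the real codebook may define additional categories.

```python
import json

# Allowed values per coding dimension (assumption: inferred from the
# values observed on this page, not from the full codebook).
SCHEMA = {
    "responsibility": {"ai_itself", "company", "user", "distributed"},
    "reasoning": {"deontological", "consequentialist", "contractualist"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "indifference", "outrage", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only records that
    have an id and in-schema values for every coding dimension."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not rec.get("id"):
            continue  # drop records the model returned without an ID
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgzmTWvkx_16kZxRipB4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
print(len(validate_batch(raw)))  # → 1
```

Records that fail validation (missing ID, misspelled or out-of-schema category) can then be queued for re-coding rather than silently contaminating the coded dataset.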