Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Comment
unfortunately what does video does not mention is that the self-driving car is supposed to put itself in situations that reduces the risk before anything would happen. the self driving car should already be programmed to have at least a two second buffer between itself and the vehicle in front giving itself enough time to stop or slow down to reduce the risk to the self driving car and its passengers.
youtube · AI Harm Incident · 2016-02-08T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugg2BtWozk8CNngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgjFkjDPjqE2CngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjEW6MP3uLTC3gCoAEC","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgjbNTENqsljHngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugj-Tm4fiodnsXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggOUDgUdR33tXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgjGwm-c396lkXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Uggj8ubOGU2UeXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugg-1WzQ124krXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugif6gsoLWXGuXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
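The raw response is a JSON array of per-comment codes, one object per comment ID, which is how a coded result above is traced back to the exact model output. A minimal sketch of parsing that payload, validating each dimension, and looking a record up by comment ID. The allowed value sets below are only those observed in this sample batch; the project's full code book may define more.

```python
import json

# Two rows copied from the sample batch above; a real payload would be the full array.
raw = '''[
 {"id":"ytc_Ugg2BtWozk8CNngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
 {"id":"ytc_UggOUDgUdR33tXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]'''

# Value sets observed in the sample batch (assumption: the real code book may be larger).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"indifference", "outrage", "approval", "unclear"},
}

def index_codes(payload: str) -> dict:
    """Parse the model output, validate each dimension, and index rows by comment ID."""
    by_id = {}
    for row in json.loads(payload):
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row[dim]!r}")
        by_id[row["id"]] = row
    return by_id

codes = index_codes(raw)
print(codes["ytc_UggOUDgUdR33tXgCoAEC"]["reasoning"])  # consequentialist
```

Indexing by ID makes the "look up by comment ID" path a dictionary access, and validating against a fixed value set catches the occasional off-vocabulary label a model emits before it silently enters the coded dataset.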