Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Am I only one who wonder why we have to create sentient AI. We are playing with …" (ytc_UgyVNcUuU…)
- "You are comparing LLMs with Visual Machine and that noonsense. Moreover, Waymo i…" (ytc_UgxcfAX3P…)
- "This is a big problem, but AI Safety is much more critical. 38% of AI researcher…" (ytc_Ugwp3fhtQ…)
- "The government will use AI to control us…..like the DICTATORSHIP in China does t…" (ytc_UgxaClk-c…)
- "@knobwobbleno you dont need to. there is reality and the tool just need to show …" (ytr_UgwaNINYC…)
- "I m sorry but at least I don t get ghosted by chatgpt when I want to vent about …" (ytc_UgzKyV9dr…)
- "I absolutely hate AI and what it's doing to our civilization. I hope all of the …" (ytc_UgxN0uJPZ…)
- "i already hate these custore service and random spam calls now i will hate more …" (ytc_Ugzd0lsc_…)
Comment

> @masterplayz5220 I think you're missing the point. I'm simply saying you're putting the decision of whether you live or die in the computers control. So the computer decides without your consent or control or will it would save the pedestrian and put you head on with an oncoming car. Now in this instance the cars weren't going fast enough to kill the driver, but if you were going fast enough in the same senario you would have been dead. Now I don't know about you but if my choice was to go head on with an oncoming car because someone jumped out in front of me I'm going to try to avoid the person but not if i means I have to kill myself and my family that is in the car with me. But with this self driving car now you and your family die. Now do you still think this is a great idea?

Source: youtube · AI Harm Incident · 2025-10-23T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
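
Every coded record carries the same four dimensions shown above. A minimal Python sketch of that structure, using only the label values visible on this page (the real codebook may define more labels; the class and set names are illustrative, not part of the tool):

```python
from dataclasses import dataclass

# Label sets observed in the records shown on this page; the full codebook
# may define additional values (reconstructed for illustration only).
RESPONSIBILITY = {"none", "ai_itself"}
REASONING = {"unclear", "consequentialist", "deontological"}
POLICY = {"none", "unclear", "ban", "regulate"}
EMOTION = {"approval", "indifference", "outrage", "fear"}


@dataclass
class CodingResult:
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def is_valid(self) -> bool:
        # True only if every dimension uses a label from the observed sets.
        return (
            self.responsibility in RESPONSIBILITY
            and self.reasoning in REASONING
            and self.policy in POLICY
            and self.emotion in EMOTION
        )
```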
Raw LLM Response
[{"id":"ytr_UgwL5iVASFZ-jvE6_fp4AaABAg.9o64GCwQ_gp9o67CfIYzVg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgzvpEMLjxFBCpv3F2N4AaABAg.9o5dkz_G8QW9o5ouXaiUW2","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugw8oU5uC8r-pv7CE2Z4AaABAg.AUfiKy4bDKCAUzXUeU1XKL","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugx_sNwG2c9X-0sOmY14AaABAg.AQdeXAB_vozAS2yNqKYnuL","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxaDQwGDmYZPvXnGDF4AaABAg.APltwIoNyF3AQTi3SHFK7G","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgygUlMPCGO7GGQ7L-F4AaABAg.ANPOrxMEBvoAUKk2g4qiRr","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytr_UgxpoWIBjXDiyoSsNht4AaABAg.ANLDRPCwGq6AOb-gvVk-nT","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugy14gdg9yBJ7UrnVmZ4AaABAg.ANJpUzuTD1iAQxgkY8KXkb","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytr_UgxJ0GdPYWRQkNbcpZZ4AaABAg.AK8JXU4XXSkAKYktXnYlAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwoIXEQUnvksXxjjjx4AaABAg.AIXBJaUBt6fAJw-VuPoFix","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]