# Raw LLM Responses

Inspect the exact model output for any coded comment.
## Look up by comment ID
## Random samples

- "Much like humans, the AI will go after themselves before they go after those who…" (ytc_UgzDNidg1…)
- "It says shift not suicide, just confused do any of the messages to ChatGPT say a…" (ytc_Ugyv7RCET…)
- "AI art generators take inspiration and style from existing artists, scanning the…" (ytc_UgxUZFQu9…)
- "Brah my childhood is becoming the future...."Dog robot named Astro", you mean th…" (ytc_UgxqgWPBi…)
- "Solution: Dissolve AI. People keep their jobs, Armed drones don't get an opinion…" (ytc_UgwkqaosK…)
- "My questions is, when one AI start competing with another AI for power and contr…" (ytc_Ugx8EDh61…)
- "ITS NOT THE MOST ADVANCED DRIVING TECHNOLOGY.. WAYMO is FAR MORE ADVANCED THAN T…" (ytc_UgwqxGvgv…)
- "@NovaraMedis Hi, you mentioned at point 1:02:17 that our UK chancellor doesn'…" (ytc_UgzoyN9nR…)
## Comment

> Wherefore can't the car slow down to minimize damage to the most amount of entities. Physically, if something fell off a truck, slowing down will reduce the amount of damage the host car takes while increasing the survivability odds of the passengers in the host vehicle and the ones on either side. The car behind it, assuming it's utilizing self driving technology, should be a reasonable distance away to react to the sudden slow down of the host car (that tech actually is employed in modern cruise control).
>
> Therefore, ethically and logically, staying in the same lane AND decelerating would most likely be the best option and failure to consider that somewhat breaks the thought experiment.
>
> Did I miss anything?

Source: youtube · Category: AI Harm Incident · Posted: 2015-12-08T20:3… · ♥ 1
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response

```json
[
  {"id":"ytc_UgiyupRVtlWBhngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggyEI8_YHbKA3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugi3Gjq5meodMngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghoyPd4-QvbcXgCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugis53FXvmFe9XgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgjQeVmvXPf4K3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugi_78FWydk3dngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjMGlt6fG9gKXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjGMferoLg5VXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiJzHt8WvuEHXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
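Each raw response is a JSON array of per-comment codings across four dimensions. A minimal sketch of how such a batch could be parsed and sanity-checked downstream, in Python — note that the allowed category values below are inferred only from the examples on this page, and the actual codebook may define more:

```python
import json

# Allowed values per dimension, inferred from the samples shown above
# (assumption: the real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"none", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"indifference", "outrage", "approval", "resignation", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Comment IDs on this page all carry the "ytc_" prefix.
        if not str(row.get("id", "")).startswith("ytc_"):
            continue
        # Drop rows where any dimension is missing or out of vocabulary.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_UgiyupRVtlWBhngCoAEC","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
print(len(parse_codings(raw)))  # → 1
```

Rejected rows are silently dropped here; a real pipeline would more likely log them and queue the affected comment IDs for re-coding.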