Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Comment

> No car should be allowed to move out of its line. If some heavy objects fell on your line, first car should keep safe distance from car in front, so it could stop car in this distance and press brakes. Because moving to different side does not mean it would save car passengers, because car behind on other line can just heat your car in the middle and make two cars from one, killing all passengers. And it also will result in chain reaction as your car will become heavy object on road for other cars to avoid. So, all self driving cars should keep safe distance depending at speed and road condition and stay on they line.

Source: youtube · AI Harm Incident · 2023-09-13T19:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwqc1_q2DdUOgJryI54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz1yrdGDed9s8wbpop4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwEJwX76AinvoS1s5d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwRqHjlDaElPKWED_14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxXr4QY4lnbmQR0nAF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxO_Ujk5rSvOjWRKBB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxdMf8xwWtitYQSG9Z4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxA0UzFRipN-4avxKF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugxr6E9-mHJqZTtbMkB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgzEQs0TPw7Wr4XUt_V4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"liability","emotion":"fear"}
]
```
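Retrieving a coded comment from a raw response like the one above can be sketched in a few lines. This is a minimal illustration, assuming the raw response is always a well-formed JSON array of records that each carry an `id` field (as in the sample); the function name `index_by_comment_id` and the two-record `raw_response` string are hypothetical, and a production version would also need to handle malformed model output.

```python
import json

# A shortened raw model response: a JSON array of coding records,
# one per comment, keyed by comment ID (structure mirrors the sample above).
raw_response = '''[
  {"id":"ytc_Ugwqc1_q2DdUOgJryI54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz1yrdGDed9s8wbpop4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw LLM coding response and index its records by comment ID."""
    records = json.loads(response_text)
    # Dict keyed by ID gives O(1) lookup of any coded comment.
    return {rec["id"]: rec for rec in records}

coded = index_by_comment_id(raw_response)
rec = coded["ytc_Ugz1yrdGDed9s8wbpop4AaABAg"]
print(rec["policy"], rec["emotion"])  # → regulate fear
```

The dictionary approach matches the page's "look up by comment ID" workflow: each coded dimension (responsibility, reasoning, policy, emotion) is then a plain key access on the retrieved record.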