Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below.
- "Yeah I don’t think so…I’ll still be mowing my lawns and replacing washers on my …" (`ytc_UgwY6amWe…`)
- "Blackrock used AI tech to set it apart from other financial firm way back in 198…" (`ytc_Ugw1OLN1w…`)
- "Probably need new rules, unconnected AI trained to talk to connected AI. This co…" (`ytc_UgxxfIu-k…`)
- "Come on guys. Im not saying that companies dont want to profit from this, but wh…" (`ytc_UgxVRv6B9…`)
- "If u really think AI is only about chatGPT then you clearly haven't done your re…" (`ytr_UgzHAy4JD…`)
- "Yep, I was over the AI nonsense long ago, it's not useful, it's not intuitive it…" (`ytc_UgwrQZUXT…`)
- "I've yet to see any actual intelligence from any of the models. I'm a programmer…" (`rdc_nt6pc03`)
- "Mass unemployment is only an issue if we keep harping on working jobs. AI can be…" (`ytc_UgzxCc_TE…`)
Comment

> These scenarios are extreme solutions why not just apply breaks gradually and slow down the vehicle, why the need to crash left or right into someone?
> with or without self-driving options the main problem is and always will be people. without self-driving, drivers always drive too close to the cars in front of them and hardly anyone follows the rules of highway codes, with self-driving option again people will and have found a way to break the rules, bypass settings etc

Source: youtube · AI Harm Incident · 2024-01-08T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgzyoQHfkvKymBmesal4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxjDv4Z1CBjO3WJHB94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyTShSLnJ9cwL-Lbw54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw2KRypsW4jIcRnBj14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw5i-H8AgNVVhk3L5Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyLQD8OSQSPK_gIU7Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxcoaaNEw2BPDxcR8B4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugy1OKJlc-JkmZ4FchN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzCOQgM8jWOa4MaGkB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxcp38aM6btPES_LwF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]
```
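A response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed label sets are inferred only from values visible in this dump (the actual codebook may define more categories), and `parse_coding_response` is an illustrative helper name, not part of any tool shown here.

```python
import json

# Label sets inferred from values seen in this raw response;
# the real codebook (not shown here) may allow additional values.
ALLOWED = {
    "responsibility": {"none", "unclear", "developer", "user",
                       "company", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "unclear", "regulate", "liability", "industry_self"},
    "emotion": {"indifference", "approval", "resignation", "outrage"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of records)
    and reject records with a missing id or an unknown label."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id"):
            raise ValueError("record missing comment id")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Example: one record copied from the response above.
raw = ('[{"id":"ytc_Ugw2KRypsW4jIcRnBj14AaABAg","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
coded = parse_coding_response(raw)
print(coded[0]["emotion"])  # resignation
```

Validating eagerly like this surfaces off-schema labels (a common LLM failure mode in structured coding tasks) at ingest time rather than at analysis time.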