Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click any to inspect):

- "Well. I’m pretty sure AI can’t teach blind kids to read braille. I think my job …" (ytc_Ugzxjqv5G…)
- "He’s a damn fool to give much power to AI to write a code and if AI writes a cod…" (ytc_Ugwe7S-4O…)
- "Here’s an idea: AI might eventually insert itself into the www and mess up all a…" (rdc_kve69b4)
- "That’s why if I get a Tesla, I wouldn’t spend another $10,000 CAD to get a full …" (ytc_UgyePPdVD…)
- "... or HAL in 2001 - or the smart bomb in Dark Star... you can actually talk to …" (ytr_UgxUuMTpe…)
- "@ricklubbers1526 yeah you guys don't understand the AI program is just an interf…" (ytr_UgzgSjDcp…)
- "From Superintelligent Agents Pose Catastrophic Risks: Can Scientist AI Offer a S…" (ytc_UgwG6EVp0…)
- "I don't get it: I've had a past of terrible mental health, even tried to off mul…" (ytc_UgwIKiXTn…)
Comment

> as the systems engineer myself I don't think we'll ever have self-driving cars simply because of the liability issue. someone has to be responsible when an accident occurs and if it's an autopilot well then no one is responsible and no one can be held accountable and the behavior can continue to occur.
>
> I also think artificial intelligence is wildly overrated I've seen how it works and I'm not very impressed and I know how it works and I'm not very impressed.

youtube · AI Harm Incident · 2022-09-09T16:2… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgyMOGAAN8V6nxQQ4294AaABAg.9fe8U1o_TQv9feGSzTVXnA","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyMOGAAN8V6nxQQ4294AaABAg.9fe8U1o_TQv9feO8Io_GD-","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgyMOGAAN8V6nxQQ4294AaABAg.9fe8U1o_TQv9feRU835SXZ","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxiIRBnky9EHqVXCyt4AaABAg.9fdqmlcyset9ffacJaVNpv","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytr_UgzaaQAN2LOIMrtIFk94AaABAg.9fdaFSfnk5v9fi8xnVArlo","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_UgzaaQAN2LOIMrtIFk94AaABAg.9fdaFSfnk5v9fijwRpnu4p","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_UgzaaQAN2LOIMrtIFk94AaABAg.9fdaFSfnk5v9flE6Yh_J1V","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytr_UgysPhuxG-ysnjpUz3R4AaABAg.9fda7-4Hr4F9ffRjeVvJRl","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytr_UgwStSaNS7HtXz3OMHt4AaABAg.9fd_4764hVs9g7sZRtarSn","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgzFF3KTuHnW0XX1o594AaABAg.9fdUgQVEE2P9fdVZS-5VPB","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
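The lookup-by-comment-ID step above can be sketched in a few lines: the raw LLM response is a JSON array of per-comment codings, so indexing it by `id` gives constant-time lookup. This is a minimal sketch using two rows excerpted from the response above, not the tool's actual implementation.

```python
import json

# Excerpt of the raw LLM response shown above: a JSON array of codings.
raw_response = """[
{"id":"ytr_UgzaaQAN2LOIMrtIFk94AaABAg.9fdaFSfnk5v9flE6Yh_J1V","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytr_UgysPhuxG-ysnjpUz3R4AaABAg.9fda7-4Hr4F9ffRjeVvJRl","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]"""

# Index the codings by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's coding by its ID.
coding = codings["ytr_UgzaaQAN2LOIMrtIFk94AaABAg.9fdaFSfnk5v9flE6Yh_J1V"]
print(coding["policy"], coding["emotion"])  # ban fear
```

The same dictionary would back both the "Look up by comment ID" box and the per-comment "Coding Result" table.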