Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
As a Tesla fan, I have watched 100s of FSD videos; believe me, it's not perfect, and it will take another 5 years to perfect. It needs a lot of real-world AI to operate seamlessly. The software and especially the hardware need an upgrade; Hardware 4 will be more powerful. But Tesla needs to care more about people's safety. I don't want to hear about the numerous saves that Tesla's FSD made; if it saved people 1 time, it also tried to take people's lives 10 times, so it's not actually safe. NHTSA should take more action on this technology. I'm optimistic that Tesla will find a way, but right now it needs to stop.
youtube
AI Harm Incident
2023-08-10T03:0…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugw3BnczAc2CQYwwpiV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy1f_fO1aChn0PFCrN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyK9IXSU9hlKuAGZCZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxytBM0Yi-Yg2zQIwt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzI29JHIusPpn_bV0V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzC9rE-BIijRe6QHbh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyyOn5G--DuOPLxSAN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwJT2YUAYKQW3lPF_h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz7tAEqiZvbHnHdUnN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzmSUeWf7lYvbBGM_x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
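The raw response above is a JSON array with one object per comment, keyed by the same four coding dimensions shown in the result table. A minimal sketch of how such a batch response could be validated before ingestion — note the allowed value sets below are inferred only from the codings visible on this page; the actual codebook may define additional categories:

```python
import json

# Allowed values per dimension, inferred from the responses shown here
# (assumption: the real codebook may include more categories).
VOCAB = {
    "responsibility": {"company", "user", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "approval", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject malformed codings."""
    records = json.loads(raw)
    for rec in records:
        # Every record needs an id plus one value per coding dimension.
        missing = {"id", *VOCAB} - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing fields {missing}")
        # Each dimension must use a value from its controlled vocabulary.
        for dim, allowed in VOCAB.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec[dim]!r}")
    return records

raw = ('[{"id":"ytc_x","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(len(validate_batch(raw)))  # 1
```

Rejecting out-of-vocabulary values at parse time is what keeps a stray LLM output (a misspelled category, a dropped field) from silently contaminating the coded dataset.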