Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I frequently instruct AI to eliminate the sycophantism.
I am currently liking Lu…
ytc_UgwdRVdZL…
It would be a mistake to see this as some act of good faith on the part of FB. M…
rdc_hj3l3kv
Yesterday I spent all night on character ai chatting with a Genshin character be…
ytc_Ugxh08uC5…
I just had an idea of using Nightshade or Glaze on photos (like a selfie) so the…
ytc_UgwQEonBa…
@mateomartinez8863 not true, stop this fashion of the change, the difference is …
ytr_UgxoiC3s5…
1:12:00 So, a specific suggestion on how governments could help with AI and drug…
ytc_Ugzi2UONW…
@bobbarker5884 It could be real, given Sesame's Maya - but it is also odd that h…
ytr_UgweFwbkB…
AI cannot replace and control human. I use AI for grammars but instead there are…
ytc_UgxSsB8lV…
Comment
+Xaro Xhoan Daxos
You still forget that in fully autonomous cars you are just a passanger not driver.
Your argument would be equivalent of blaming the passanger if a taxi cab driver where to be in an accident instead of the cab driver.
Atleast if something fails and you still have a driver position you can try to avoid an accident.
Like in the example of gas pedal stop malfunctions you can still just turn off the car. Which would prevent such a situation.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Harm Incident |
| Posted | 2016-03-26T02:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_Ugx9OLAA3Z4FOwfk20l4AaABAg.AS_vOhIRKvSASaINwoceoZ","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugy1BotzE-zR5CfQlnV4AaABAg.AC2LEyLGc8iAC2PXx2mX5X","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgjjhVmopdBPnngCoAEC.8BsAm4xHtuS8BtL1-4lmhu","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytr_UghK9JWzzYfksHgCoAEC.8BrrReLGHpH8Bs2iLAUyj7","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytr_UggrhJ60UdmN_3gCoAEC.8BrrEAPregI8BtAGWVJG1k","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgigfbsD3xVn6XgCoAEC.8BrcF8D9mNF8BsC33aqVbL","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgibUnXWq06xDXgCoAEC.8BrbvRz8MRl8Bru_8BqsTy","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytr_Ugwh3v-8GeoSdSmne4B4AaABAg.A3T1yHwit-FAPcJByxRXmy","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgzF-wZJ569On403PCR4AaABAg.A3QWkoAMNS1APcJO2RR4NT","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugz1b6hJ0hdS_gdTq4F4AaABAg.AHFbF0naRvIAIoHqlCUyiT","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
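The raw response is a JSON array with one object per comment ID, one field per coding dimension. A minimal sketch of parsing and validating such a payload — the allowed value sets below are inferred only from the values visible in this output, not from a documented schema, and `validate_coded_batch` is a hypothetical helper name:

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# Assumption: the real coding scheme may include values not seen here.
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "liability", "regulate", "unclear"},
    "emotion": {"indifference", "outrage", "fear"},
}

def validate_coded_batch(raw):
    """Parse a raw LLM response and keep only rows whose codes are valid."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue  # every coded row must carry a comment ID
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

sample = (
    '[{"id":"ytr_example","responsibility":"company",'
    '"reasoning":"deontological","policy":"liability",'
    '"emotion":"indifference"}]'
)
print(len(validate_coded_batch(sample)))  # one valid row
```

Filtering rather than raising keeps a long batch usable when the model emits an out-of-scheme value for a single row; the dropped IDs can then be re-queried individually.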