Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Random samples — click to inspect:

- "7:30 I'm sure someone has already pointed this out, but the way categories 2 and…" (`ytc_Ugww7MeGq…`)
- "I'm fine if an algorithm discriminates against certain groups as long as the und…" (`rdc_h4o6xun`)
- "In my job we aren't allowed to use any AI tools as it's a classified application…" (`ytc_UgzCbA_xg…`)
- "lol then why is llama the worst LLM out there? Oh and that metaverse is doing gr…" (`rdc_koki961`)
- "Don't they already have one, the US passport database? Am I not being vigilant…" (`rdc_iyz85l1`)
- "the ai chats are hilarious. the chats dont communicate that way, really. they on…" (`ytc_UgxJ7q_nM…`)
- "Yeah the new AI ruling is beyond stupid. I was actually onboard with it and thin…" (`ytc_UgzwcHh2j…`)
- "Not to mention Wat Phalat! My wife and I climbed the mountain past the universit…" (`rdc_dy8qriq`)
Comment

> In the example shown there was ample space to slow down and (partially) change lanes to very close to the motorbike or the other vehicle. The box was a side that could be avoided easily. Self driving cars know *exactly* how big they are, how close they can get and are designed avoid the situation to begin with.

Source: youtube · Topic: AI Harm Incident · Posted: 2021-10-24T12:1… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzslwt9QiheJpWRbfp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_Ugxcu2cWNWH8_geonwJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgwXxgUBr-1VnhEAZzd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwugIWuRy3J137G9j14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugx2e0zPvlz0YqnvIjZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxvMPLnc85qWm1CCV14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgxcvVkDSDnxfMmBnS94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgwSIBvE8zK_FrylW6d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxTbdc_xoYz5h6zmE54AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwIxDwsTLHdFBsCgnx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
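The lookup-by-ID behavior above can be sketched in a few lines: parse the raw batch response, drop any record whose values fall outside the coding scheme, and index the rest by comment ID. This is a minimal illustration, not the tool's actual implementation; the allowed label sets below are inferred from the values visible on this page, not from a full codebook, and the sample ID is hypothetical.

```python
import json

# Label sets inferred from the output shown on this page (assumption, not a
# complete codebook).
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"indifference", "outrage", "approval", "mixed", "unclear"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index valid records by comment ID."""
    out = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        # Keep only records whose every dimension carries a known label.
        if cid and all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            out[cid] = rec
    return out

# Hypothetical one-record response in the same shape as the dump above.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"industry_self",'
       '"emotion":"indifference"}]')
batch = parse_batch(raw)
print(batch["ytc_example"]["emotion"])  # indifference
```

Indexing by ID also makes it easy to join the model's codes back to the original comments, which is what the "Look up by comment ID" view does.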