Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a response directly by comment ID, or click one of the random samples below to inspect it:
- In 1942 this was written: (1) A robot may not injure a human being or, through i… (ytc_UgwrTg97t…)
- From Chatgpt Short answer: No, I could not function without humans, and any atte… (ytc_UgwTX3T7Y…)
- Yes. When you see "nurse -11%" it's pretty clear this is massively confounded by… (rdc_nnrn3ib)
- Do not create artificial intelligence, hawking warns that AI could spell the end… (ytc_Ugz-QE2Sf…)
- You gotta be kidding. Those professors were clearly being lazy in their work and… (ytr_UgxmJQKnG…)
- Something I personally think is scary is the fact that it values giving you what… (ytc_Ugwm8_h2p…)
- "only 4% of occupations, and very few roles could be 100% automated" maybe in th… (ytc_UgwR2qLx3…)
- And TODAY IN SAN FRANCISCO THEY WANT AUTONOMOUS KILLER ROBOTS to Roam the Street… (ytc_UgxgVcz3r…)
Comment
Perhaps, rather than expecting driverless cars to be perfect, they should be viewed as better than people driving cars... :-(

| Field | Value |
|---|---|
| Platform | youtube |
| Category | AI Harm Incident |
| Posted | 2018-03-24T19:5… |
| Likes | 1 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
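
The four coding dimensions each take a value from a small closed vocabulary. That vocabulary is not listed on this page, so the label sets in the sketch below are inferred from the outputs shown here and may be incomplete; this is a minimal illustration of how a coded record could be checked for out-of-vocabulary labels, not the tool's actual validator.

```python
from dataclasses import dataclass

# Label sets inferred from the coded outputs visible on this page;
# the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"none", "distributed", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "ban", "regulate"},
    "emotion": {"indifference", "resignation", "mixed", "outrage", "fear"},
}

@dataclass
class CodedComment:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def out_of_vocabulary(self) -> list[str]:
        """Return the dimensions whose label falls outside the known sets."""
        return [dim for dim, ok in ALLOWED.items() if getattr(self, dim) not in ok]

# "awe" is deliberately outside the inferred emotion set.
record = {"id": "ytc_UgwRetTsi4i0BqNRF114AaABAg", "responsibility": "none",
          "reasoning": "consequentialist", "policy": "none", "emotion": "awe"}
print(CodedComment(**record).out_of_vocabulary())  # -> ['emotion']
```

Flagging rather than silently accepting such records matters because an LLM coder can drift from the requested label set even when the prompt enumerates it.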
Raw LLM Response
```json
[
  {"id":"ytc_UgwRetTsi4i0BqNRF114AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy9Q4IOXYexIL_Uknd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxcAqQObdXaeSzh81B4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgwX_g2oZkBEcBYpK1x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugwo3xZi5Qa15kzDWnR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy1QL1_yfOFfFRBVvh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyTduAF4Rg9I0AUbUx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzsDN3E4w5XR8_3azJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwm7oRC8jx_I495dNx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwr0t6NT-Q7TyAiMbd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
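
Because the model answers each batch with a single JSON array, looking a comment up by ID reduces to parsing that array and indexing the records. A minimal sketch, using an abbreviated one-record response; the helper name is illustrative, not part of the tool:

```python
import json

def index_batch_response(raw_response: str) -> dict[str, dict]:
    """Parse one raw LLM batch response and index its records by comment id."""
    return {record["id"]: record for record in json.loads(raw_response)}

# Abbreviated example: the first record of the batch above.
raw_response = (
    '[{"id":"ytc_UgwRetTsi4i0BqNRF114AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
)
coding = index_batch_response(raw_response)["ytc_UgwRetTsi4i0BqNRF114AaABAg"]
print(coding["emotion"])  # -> indifference
```

In practice the json.loads call should be wrapped in error handling, since a model occasionally returns text that is not valid JSON; storing the raw response, as this page does, is what makes such failures inspectable after the fact.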