# Raw LLM Responses
Inspect the exact model output for any coded comment; entries can be looked up by comment ID.
## Random samples (click to inspect)
- `ytc_UgwBbeLfB…`: "The old guys too calm only because he’ll be dead before this happens so none of …"
- `ytc_UghA24C7V…`: "I don't think we have a choice with regards to progress. We are neurologically h…"
- `ytc_UgxBcTNEs…`: "Remember when CFC caused the ozone hole but eventually alternatives were found t…"
- `ytc_Ugy3JWpJ-…`: "The problem with AI is it will figure out quickly how bad you guys are, and what…"
- `rdc_nmazkxv`: "Mark Zuckerberg touted on Joe Rogan that AI would replace metas mid level engine…"
- `ytr_Ugww8JLYl…`: "@novilunae Sorry, I forgot — it’s not a real opinion unless it’s handwritten wit…"
- `rdc_mtourg7`: "Why is this so bad ? Its super smart and gives better information then anyone in…"
- `ytc_UgwJ5VPsq…`: "We need to take notes from China’s facial recognition software.. their shit is n…"
## Comment

> I have not been in any autonomously driving car in which you have to constantly stay alert and be able to take control of all times, but that honestly sounds so much more stressful than just drving by myself. You will constantly have to wonder what the AI sees and how it will react and constantly be prepared to engage if the AI is not reacting as fast as you would. When you drive and you know what you're doing and are a confident driver it's not stressful at all, you just do it - I'm never going to get a car with the flawed technology that tesla uses.

youtube · AI Harm Incident · 2024-12-16T13:4… · ♥ 2
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response
```json
[
  {"id":"ytc_Ugymxs7I-T47d-2EPgl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxS8QN44BNGQCNwYK14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz5A5m2klmeRivyB5d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwS_LepYfJbvFnKSMR4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgzebMplL1mA9E_4LPF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz8rez9_oL_bu9knnV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzS8ooHOA5c2aTeTiF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzfLUOfQuTAP5UQmkF4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgwiKaMdsF4mK-Aom514AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugydqn1QFkeeyNgxTd94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
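Each raw response is a JSON array of per-comment codes along the four dimensions shown in the Coding Result table. A minimal sketch of validating such a response before ingesting it, in Python; the allowed value sets below are inferred only from the samples on this page, and the full coding scheme may include additional categories:

```python
import json

# Allowed values per dimension, inferred from the samples above
# (assumption: the real codebook may define more categories).
SCHEMA = {
    "responsibility": {"user", "developer", "company", "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Skip anything that is not a dict carrying a comment ID.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Keep the record only if every dimension holds an allowed value.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
print(len(validate_codes(raw)))  # prints 1
```

Records that fail validation can then be queued for re-prompting rather than silently dropped into the coded dataset.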