Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below:
- `ytc_UgwgAE3HK…`: "I don’t blame mom at all, however… this was bigger than a chat bot. He was lonel…"
- `ytc_UgzA1uW5K…`: "But how can you prove that it didn't fail the test on purpose to reduce your sus…"
- `rdc_jj4ckfm`: "I would have to disagree, if a Hollywood company used ai to make a script, and s…"
- `ytc_UgyGmdY0g…`: "I feel the AI bs creeping up as well. Had a commissioner who wanted a com from m…"
- `ytc_UgwkQQlvc…`: "The intention is 2 fold. 1) find a way to convince the board for these companies…"
- `ytc_UgwGigphw…`: "The funny thing is, they all seem to be using \"A.I.\" to write their insipid comm…"
- `ytc_UggpzkAUQ…`: "Robots should never gain rights. They only do what we program. If we program the…"
- `ytc_Ugw1jfikH…`: "‼️There is a Misperception as follows: The notion that CHATGPT has intelligence…"
Comment (youtube · AI Harm Incident · 2022-12-06T05:4…)

> Yeah, which one DO you trust? Tesla dropped radar because they just can't solve the sensor fusion problem: you can have the best radar in the world, and it still doesn't matter if the AI decides to throw the result out. I respect Tesla for going with practical solutions, although it's not good for advertisement. If you are forced to pick one single sensor from vision, radar, and Lidar, vision is for sure the pick, but I still love the idea of having radar/night vision on my car when driving myself. I guess Tesla either has to get so good at computer vision (with multiple cameras, maybe?) or solve the sensor fusion problem at some point.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
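The table above is just a rendering of one coded record. As a minimal sketch (the lowercase key names are an assumption based on this dump, and the "Coded at" row, which comes from the coding timestamp rather than the record, is omitted):

```python
# Render one coding record as the Dimension/Value Markdown table shown above.
# Key names (lowercase dimension names) are assumed from this dump.

def coding_table(record: dict) -> str:
    """Return a Markdown table for a coded comment."""
    rows = ["| Dimension | Value |", "|---|---|"]
    for key in ("responsibility", "reasoning", "policy", "emotion"):
        rows.append(f"| {key.capitalize()} | {record.get(key, 'none')} |")
    return "\n".join(rows)

print(coding_table({
    "responsibility": "company",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "indifference",
}))
```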
Raw LLM Response
```json
[{"id":"ytc_UgxIYtMKgczcqi80WP94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgfGFA3gwkcGpjisZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwl_hPP_MFr2jRpgfV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwTon5T35b9e_tRRTF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzKp3T8pvKbxYZ3EMF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwQg-kbAoVjtGZWPP54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzzH2uuodXCgwmz0kF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx3FfYVAZsd1eOF0u54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzWQxsLxtdvM1qnlx14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxmGYRbmFAqcnQm6jd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
```
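Since the raw response is a JSON array of per-comment codings, "look up by comment ID" reduces to parsing the array and indexing it on the `id` field. A minimal sketch, assuming the response parses as valid JSON (the one-element `raw` string here is illustrative, not the full batch):

```python
import json

# A one-record excerpt of a raw batch response (illustrative).
raw = ('[{"id":"ytc_UgxmGYRbmFAqcnQm6jd4AaABAg",'
       '"responsibility":"company","reasoning":"consequentialist",'
       '"policy":"none","emotion":"indifference"}]')

# Index the batch by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

record = codings["ytc_UgxmGYRbmFAqcnQm6jd4AaABAg"]
print(record["responsibility"])  # -> company
```

Building the dict once per batch keeps repeated lookups cheap; a malformed response would raise `json.JSONDecodeError`, which is worth catching and surfacing alongside the raw text.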