# Raw LLM Responses

Inspect the exact model output for any coded comment: look it up by its comment ID, or pick one of the random samples below.

## Random samples
- "Ok, so what if we design Ai to be adorably clueless and emotionally needy? Like …" (ytc_Ugx7TFmtK…)
- "I know someone who is fully paralyzed except their eyes, they love music, yet wh…" (ytc_UgyBKrOwT…)
- ">HR, managers, and recruiters act untouchable / Which is going to be interesti…" (rdc_o4htw7j)
- "AI is never going to take your job away, but it certainly makes your job way eas…" (ytc_Ugxonbsvh…)
- "maybe thats why we shouldn't train AI with the internet, theres a bunch of vile …" (ytc_UgwhX-VkA…)
- "Now I’m a big proponent of the idea that AI isn’t the real problem—capitalism is…" (ytr_UgzjEKtJ7…)
- "So you are saying self driving cars drive like many bad drivers on the road? Tha…" (ytc_Ugx6oXvF4…)
- "I refuse to use ai, and if my stance has any impact on ai dying, I'm happy…" (ytc_UgxdGiP4q…)
## Comment

> Not Tesla's fault driver was not paying attention. You are told over and over you must pay attention when you buy a Tesla. Auto Pilot is just lane keeping not self driving!!! Just like auto pilot in air craft auto pilot just holds course and altitude it doest avoid other aircraft or take off/land the plane.

| Platform | Topic | Timestamp | Likes |
|---|---|---|---|
| youtube | AI Harm Incident | 2026-02-04T17:1… | 3 |
## Coding result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
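The coding result above assigns one value per dimension. As a minimal sketch of a consistency check, assuming the value vocabularies seen in this dump (e.g. `user`, `company`, `ai_itself`, `unclear` for responsibility) are the full label set — which is an inference from the sample output, not a documented schema:

```python
# Sketch: validate one coded record against the dimension values
# observed in this dump. The vocabularies below are inferred from
# the sample responses and may be incomplete.
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed"},
    "policy": {"none", "liability", "regulate", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "approval", "mixed"},
}

def validate(record: dict) -> list:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    if "id" not in record:
        problems.append("missing id")
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in {sorted(allowed)}")
    return problems

print(validate({"id": "ytc_example", "responsibility": "user",
                "reasoning": "deontological", "policy": "none",
                "emotion": "indifference"}))  # → []
```

A check like this catches off-vocabulary labels before they silently skew downstream counts.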
## Raw LLM response

```json
[
  {"id":"ytc_UgzG6qV7OsGfYouFiIJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy1O6ZvQBizd5oFkPl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwH_5oQPoNoGyzaJSx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyVVMDLsrSANhM_QSZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxKHEyBfgBz4kKsAl94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz8bDzDiZdaSWUIEex4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwQuL3kY_eyaEMD4FN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwKwyjy0Lv9K6Zt9Sx4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugy900VElDf46i9y99V4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwN32k9hLtOGwy6Ep14AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"outrage"}
]
```
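Because the raw response is a JSON array of records each carrying an `id`, "look up by comment ID" reduces to parsing the batch and indexing it once. A minimal sketch, assuming only the record shape shown above (the variable names are illustrative, not part of any tool):

```python
import json

# Two records copied from the batch above; in practice this would be
# the full raw response string returned by the model.
RAW_RESPONSE = """[
  {"id":"ytc_UgzG6qV7OsGfYouFiIJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwH_5oQPoNoGyzaJSx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]"""

# Parse the batch, then index it by comment ID for O(1) lookup.
records = json.loads(RAW_RESPONSE)
by_id = {record["id"]: record for record in records}

coded = by_id["ytc_UgwH_5oQPoNoGyzaJSx4AaABAg"]
print(coded["responsibility"], coded["emotion"])  # → company outrage
```

Building the dictionary once and reusing it is what lets the dashboard resolve any coded comment instantly instead of rescanning the array per query.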