Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

| Comment (truncated) | ID |
|---|---|
| God I hate this timeline... I've seen enough shit in my life, including war and … | ytc_UgzwQE2TK… |
| @ElephantsInRoomsyour videos have enough quality for at least a couple 100k. Jus… | ytr_UgwmbuTwT… |
| How does he even get money with those AI broken pictures?, like, somebody from D… | ytc_Ugyr1xl5d… |
| Well, if this AI is anything like youtube closed caption, we have nothing to fea… | ytc_UgwPuBhSM… |
| If the robot confused him for a box, how could it crush him? it would've stopped… | ytc_Ugy6UFDLu… |
| I don’t think viewing organ scans in literal 3D space is really useful at all. T… | rdc_kchtmla |
| Self-driving truck shows up at factory. How does it know which dock to go to?… | ytc_UggrhQzDi… |
| This guy and people in the comments are being annoying. Self driving cars are no… | ytc_UgzSnpx33… |
Comment

> never the less, doesn't AI know that it needs to maintain a safe distance. And why is the human not taking over. Is tesla letting its driverless cars run free at night. Is a cause for concern. Other drivers have a right to know if the car next to them is in driverless mode so they can drive defensively. There should be some law to warn other drivers just like there is a sign for a student driver.

| Field | Value |
|---|---|
| Platform | youtube |
| Event | AI Harm Incident |
| Posted | 2022-09-26T23:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
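Each coded comment carries four dimensions. A minimal validation sketch in Python, assuming the dimension vocabularies are exactly the values visible on this page (the actual code book may define additional categories):

```python
# Dimension vocabularies assembled from values visible on this page;
# the real code book may include more categories than these.
CODEBOOK = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"concern", "indifference", "fear", "approval",
                "resignation", "outrage", "unclear"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with one coded record (empty if valid)."""
    problems = []
    for dim, allowed in CODEBOOK.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unknown {dim} value: {value!r}")
    return problems

# The record shown in the Coding Result table above passes validation.
assert validate({"responsibility": "company", "reasoning": "deontological",
                 "policy": "regulate", "emotion": "unclear"}) == []
```

Running a check like this over every parsed record is a cheap way to catch the LLM drifting outside the coding scheme.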
Raw LLM Response
[
{"id":"ytc_UgzVJfG-AVRs_IjtUR14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"concern"},
{"id":"ytc_UgyFcFo17Iz77RZ7kxN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyuHHOYbje74v-LEQ94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy44Uu_6CM87kC2D9B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxOMp39mj7AX6QaKVR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxDwB-gzJ2lsZ4jLJJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxkLA7IekFRrfXui2Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxigWtmNHS395_MJaN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyQESbNb8A-IGnM0zx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgyD4joqCj_i47H5kvV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
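The raw response above is a JSON array of per-comment records. A minimal sketch of how such a response could be parsed and indexed for comment-ID lookup (the function name is illustrative, not from the pipeline; the sample data reuses two records from the response above):

```python
import json

# Raw LLM response: a JSON array of per-comment coding records
# (two records copied from the response shown above).
raw_response = """
[
  {"id": "ytc_UgzVJfG-AVRs_IjtUR14AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "concern"},
  {"id": "ytc_UgyFcFo17Iz77RZ7kxN4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw coding response and index its records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgzVJfG-AVRs_IjtUR14AaABAg"]["policy"])  # regulate
```

With an index like this, the lookup behind "Inspect the exact model output for any coded comment" reduces to a single dictionary access.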