Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "If AI learns to fight its technique is not gonna be that sloppy and it's not jus…" (ytc_Ugy5I04qa…)
- "I just asked chatGPT to do that for you: The Reddit post provides a comprehens…" (rdc_jf7lbk2)
- "keep doing that shit. i'm so angry that all those AI companies just take. I'm wo…" (ytc_Ugwg5UYK8…)
- "3:18 the GPT referring to itself as a human. “The pace of AI development may str…" (ytc_Ugw04a8Dh…)
- "The script is wrong. What AI is missing is not accountability. What AI is missin…" (ytc_Ugw-E_hPr…)
- "AI is little more than a toy when it comes down to it. People who call themselve…" (ytc_UgwD5e8QJ…)
- "Its pushing people to be a robot programmed by the communist government. Its sic…" (ytc_UgxYwRb8J…)
- "I don't know why these people keep lying all the time (AGW=BS). I don't listen t…" (ytc_UgwviW9D6…)
Comment
> idk why, but when I hear elon said that it is it's more safe than human or greater safety driving while also implement it only using cameras and only putting one radar as backup?, makes me think it's like implementing GPWS on an aircraft but instead of using proximity sensor (Perhaps mixed with Inertial Sensor and other stuff?) (mixed with GPS on EGPWS), you use camera only for the GPWS, or implementing trim system to be automatically adjusting while only based on one AOA Sensor

Source: youtube · AI Harm Incident · 2024-12-18T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyRnbOeHcCraG7CJLt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"indifference"},
{"id":"ytc_UgxbyUv6IkzNIsXjeKZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxJ1fhEWSERXHPfDLp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzW7kTwlB6aOjVxuN94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyH3U8SZm2UbqCob314AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugwv9DWJci226HKmKHt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy1auMKXOYtS4AE9FN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy-FKaFihGCJVk_BiV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyX4NbPLOVGph38yVl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
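The raw response is a JSON array with one record per comment, each carrying the four coding dimensions from the table above. A minimal sketch of how such a response could be parsed and validated is shown below; the allowed code values are assumptions inferred from the values visible in this dump, and the real codebook may contain more.

```python
import json

# Allowed codes per dimension. Assumption: inferred from the values
# visible in the samples above, not the full codebook.
CODEBOOK = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "approval", "resignation",
                "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate every record."""
    records = json.loads(raw)
    for rec in records:
        # Every record must carry an id plus all four dimensions.
        missing = {"id", *CODEBOOK} - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing fields {missing}")
        # Each dimension value must come from the known code set.
        for dim, allowed in CODEBOOK.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: invalid {dim} code {rec[dim]!r}")
    return records

# Hypothetical one-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"mixed"}]')
coded = parse_coding_response(raw)
print(coded[0]["responsibility"])  # developer
```

Validating against a fixed code set catches the common failure mode where the model invents an off-codebook label, which would otherwise silently corrupt downstream counts.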