Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- I was catfishing gigachad asking if he had “sum girls to hook up with, bro?” 😎😎 … (ytc_UgzcvJOxH…)
- Can’t we just ask ai to increase our cognitive load while helping us help oursel… (ytc_UgyzTUBib…)
- 14:09 Beethoven was DEAF and still made some of the most iconic symphonies to th… (ytc_Ugw_MLqpE…)
- Generally I use an LLM for proof reading documents that I’ve already written, es… (ytc_Ugyjjrdk_…)
- Hi Alex, So philosopher and physicist, Tom Campbell, refers to consciousness as … (ytc_Ugxvgzi1t…)
- Like every other girl that has a public platform? Deepfakes have been around for… (ytr_UgybFh2YH…)
- Great video! And great illustration of how human 'fine tuning' of AI results in… (ytc_UgyAEa_5j…)
- I’ve done this exercise. AI always supports Israel because it’s too factual and … (ytc_Ugwso1cbu…)
Comment
Wouldn't a self driving car always be programmed to stay beyond the minimum safe distance from the vehicle in front of it? And if the VAST majority of cars are self driving cars, wouldn't they all be doing the same thing? Thus making the presented scenario HIGHLY unlikely? And even if the scenario did present itself, couldn't the programmers make the choice RANDOM and thus more similar to the human result for this particular highly unlikely event?
youtube · AI Harm Incident · 2015-12-14T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgjaTrm3xjVlsXgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgiVJWa_Y6bmRXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghRt6TFpVDC0HgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgjWo4JZkIB25ngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggQIWXW0Sjhu3gCoAEC", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugh4kvsWRmbvVXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghHt-JHGMzZ1XgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgiJ_L1RWjzFSHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugiygo6Qdg1iq3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggxWJ27f-_UIngCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"}
]
```
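Since the raw response is a JSON array of coding records keyed by comment ID, looking up the coding for any comment is a matter of parsing the array and building an ID index. A minimal sketch (the two records are copied from the response above; variable names are illustrative):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment,
# with the four coded dimensions plus the comment ID.
raw_response = '''
[
  {"id": "ytc_UgjaTrm3xjVlsXgCoAEC", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UggQIWXW0Sjhu3gCoAEC", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
'''

records = json.loads(raw_response)

# Index by comment ID so any coded comment can be looked up directly.
by_id = {rec["id"]: rec for rec in records}

coding = by_id["ytc_UggQIWXW0Sjhu3gCoAEC"]
print(coding["responsibility"], coding["emotion"])  # company outrage
```

The same index generalizes to a full batch of responses: concatenate the parsed arrays before building `by_id`, and a missing ID (a comment the model skipped) surfaces as a `KeyError` rather than a silent blank.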