Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "My favorite part of this ai is that when it makes a statement that violates the …" (`ytc_Ugw17DR0i…`)
- "Theyre going after ai companies, but theyre not going after studios disney for b…" (`ytc_UgxU3zGrI…`)
- "I majored in history, so I'm curious to know whether there are any historical pa…" (`ytc_UgwBWekUy…`)
- "yeah, sure, let humans keep deadly jobs and let \"AI\" invade prestigious fields. …" (`ytc_UgwB1mk13…`)
- "But if AI is meant to do something for us what will AI do if there is no one to …" (`ytc_Ugx0tyN8R…`)
- "Who knows enough about it to regulate it ? Only the developers know where it is …" (`ytc_UgwCMsVQR…`)
- "If the software you write is an enables a product instead of being the actual pr…" (`rdc_mjt96sd`)
- "One day we’re going to see on the news, a.i robot kills its owner and it’s on th…" (`ytc_Ugz_ZSqnM…`)
Comment
I just think of it this way: If someone jumps out in front of me in that situation, I'm not going to write an ethics essay on what I should do in that situation, I am going to freak the fuck out, slam on my breaks, and probably swerve. The car is the only one with the luxury to make a moral decision, because it's the only one with the capacity to observe all of the variables. However you slice it, automated drivers are better than human drivers.
youtube · AI Harm Incident · 2014-05-25T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UghX3ONnXBqe8XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggBb4ZNQvsUxngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgiB44HQOpV9QXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggyaM-IUg_XuXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgiIlLAil_Zq03gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjU67dU5p4ZfXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UghsmoHmvmSdbXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ughie0wJ93N44HgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugj0eYYZq3WVp3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugg5q94oNYv7l3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
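The raw response above is a JSON array of per-comment codings, one record per comment ID, with four fixed dimensions. A minimal sketch of how such a response could be parsed and validated is below; the `CODEBOOK` sets contain only the values that appear in this sample (the real codebook presumably defines more), and the function name `parse_codings` is hypothetical.

```python
import json

# Allowed values per coding dimension, inferred only from the sample
# response above (assumption: the actual codebook may list more codes).
CODEBOOK = {
    "responsibility": {"none", "ai_itself", "user"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban", "industry_self"},
    "emotion": {"approval", "mixed", "indifference", "outrage", "fear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed coding records."""
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return []  # malformed output: caller can log the response and retry
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Every dimension must be present and use a known code.
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(rec)
    return valid
```

Validating against a closed code set catches the common failure mode where the model invents an off-codebook label; dropping (rather than coercing) such records keeps the downstream counts clean.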