Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "This looks too realistic to be CGI, that is a Tesla cybertruck which is bulletpr…" (ytc_Ugz9oDCHn…)
- "I have never laughed harder than when someone asked for the founding fathers and…" (rdc_ks25onb)
- "How is this AI will replace plumbers and gas fitters (like myself) ? Or carpente…" (ytc_Ugy_oQMGl…)
- "People thinking AI won't take their job (plumbers, electricians, construction) m…" (ytc_UgzOyG2wJ…)
- "This video misses one very big point: An AI can have multiple bodies. All of you…" (ytc_Ugj9Z8KXX…)
- "Not only that but how much energy and water ai takes. It’s affecting our environ…" (ytc_UgyOKAKzQ…)
- "dont know how to feel when all the most intellectual public speakers are telling…" (ytc_UgzPodA1T…)
- "I feel that the professor is struggling with his own understanding of what he wa…" (ytc_Ugw-8frdM…)
Comment
"What if a truck has an unstable load that falls on the road in front of you? Should the self-driving car prioritize your own life or the common good?"
"A self driving car would probably keep itself a good distance away from such a truck as to avoid this scenario."
"Ok, but what if a child was crossing the road? Would it save you or the child?"
"It would probably have a better reaction time than I would, so regardless the chances of any fatalities are minimal."
"What if a tree fell on the road? What if a meteor was about to destroy the highway? What if your mom is so fat that the road caves in?"
People keep attempting to put self-driving/operating "whatevers" into increasingly ridiculous and improbable situations. Regardless of the situation, a self-driving car will almost always save more lives than it would ever endanger. It's the same thing with vaccines; it will save millions if not billions of people from needlessly dying, but one person in 10 million will receive fatal complications due to an allergy or something else.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Harm Incident |
| Posted | 2015-12-08T18:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgizunohajILCHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjjWOUDi8MzcHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjQvTuYsrqOtngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgghF14lWrWg93gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghSobsLJzKwTngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghTsPIeRMcNT3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiGpAhmNNMkf3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgjlPAxVCSrTmHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugju10Xr0tXdF3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugjc3KGPZNZyqngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}]
```
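A raw response like the one above can be turned into the per-comment coding record shown in the table by parsing the JSON array and indexing it by comment ID. The sketch below is a minimal, hypothetical example of that step, not the project's actual pipeline code: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response itself, while the validation rule (every row must carry exactly those keys) and the function name `parse_codes` are assumptions for illustration. The sample rows are copied from the real response; the full batch has ten.

```python
import json

# Two rows copied verbatim from the raw LLM response above.
RAW_RESPONSE = """[
  {"id": "ytc_UgizunohajILCHgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiGpAhmNNMkf3gCoAEC", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

# Dimensions every coded row is expected to carry (assumed schema).
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_codes(raw: str) -> dict:
    """Parse a raw coding response into a comment-ID -> codes mapping,
    rejecting rows that are missing (or adding) any dimension."""
    coded = {}
    for row in json.loads(raw):
        if set(row) != EXPECTED_KEYS:
            raise ValueError(f"malformed row: {row!r}")
        coded[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return coded


codes = parse_codes(RAW_RESPONSE)
print(codes["ytc_UgiGpAhmNNMkf3gCoAEC"]["emotion"])  # fear
```

Indexing by ID is what makes the "Look up by comment ID" view possible: once the batch is parsed into a dictionary, retrieving any coded comment is a single key lookup.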