Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID or by browsing the random samples below.
- `ytc_UgyU0fL3q…` "This is so stupid if you ask me. A drawing program literally is just the traditi…"
- `ytc_Ugz0SVcsw…` "LOL I was telling someone the reason my AI is so good to me is because I be spea…"
- `ytc_UgzmB4RyY…` "From Oculus to Ominous: Palmer Luckey’s Journey from Virtual Reality to Very Rea…"
- `ytc_UgwUAgnCt…` "People asked the same question — “What’s the point of drawing?” — when the camer…"
- `ytc_UgwykB5gN…` "Ai hype = we will all be out of a job in 5 years. Ai reality = we will all be ou…"
- `ytr_UgzSSgFSy…` "I'm shit at art, and my art looks better than AI art because AI art isn't a real…"
- `ytr_Ugwbqg_RL…` "Nuclear catastrophe is the go-to but even in the most benign way, humans would d…"
- `ytc_Ugz7zBigq…` "One of the practical sides of ai is the deflection of responsibility. Like we sa…"
Comment

> The solution is that we need a world of ALL self driving cars. That way, they can communicate in a split second and do the necessary braking/speed adjustments. Or, it could be a world with MAINLY self driving cars. If one of the cars on the side was self driving, it could communicate with your car to allow it to swerve out safely

Source: youtube · AI Harm Incident · 2017-07-13T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugjnw_pI28jYpXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghwOGDepVXCWngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugj3YY9osWlB4HgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UghifWP6y7_ogXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugg4ldklSPeo8XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjywnlXpJLqFHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgjokRxbpwiSqHgCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghBFWCU-Fp7bngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjOOT8Vua498ngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgirnfINQGNpP3gCoAEC","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]
```
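Since the model returns one JSON array of codes per batch, looking up a single comment's coding means parsing the array and indexing it by `id`. A minimal sketch of that step, assuming the field names shown above; the `index_codes` helper is illustrative, not part of the actual tool:

```python
import json

# Hypothetical raw model output: a JSON array of per-comment codes, using
# the same four dimensions as the table above (responsibility, reasoning,
# policy, emotion). Two records are shown for brevity.
raw_response = """
[
  {"id": "ytc_Ugjnw_pI28jYpXgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjywnlXpJLqFHgCoAEC", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]
"""

def index_codes(response_text: str) -> dict:
    """Parse a raw coding response and key each record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_codes(raw_response)
print(codes["ytc_UgjywnlXpJLqFHgCoAEC"]["responsibility"])  # prints "developer"
```

Keying by ID makes the "look up by comment ID" view a single dictionary access, and any record whose ID never appears in the index flags a comment the model silently dropped from the batch.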