Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_Ugxx9rIpq…`: Y’all dense af if u hate AI and wanna make art for ur own joy and love just go d…
- `ytc_Ugy_by7PC…`: How people cant see issue with replacing thinking with ai... you will end up wit…
- `ytc_Ugy8mzUAi…`: I thought Mark Z was a robot but Elon M must be Skynet’s human incarnation……
- `ytc_UgyYrrDMN…`: I would argue it's different than when an artist gets inspired because when an a…
- `ytc_UgzAu_emz…`: This has been a disappointing interview overall with little details on context, …
- `rdc_gd8zncv`: Then what on earth is the point? People's fear of AI is almost always out of ig…
- `ytr_UgzFxKhNG…`: @pr0j3ct2501 We might get paid at the end for the stolen art of the AI at the en…
- `ytc_UgxfW6Hpq…`: Tuition for the 2025-26 school year is $49,300. So your parents need to have a…
Comment
Two possible 'solutions': A. Don't let the masses know how the car is working and let them be ignorant. Disclosing the information will only worsen it all as there is no universal correct procedure, which is clearly demonstrated in this video. OR B. Randomize the reaction to forgo the dilemma of responsibility altogether. For all accidents involving self-driving vehicles, this would average out all the solutions that would be either 'good' or 'bad' to any given individual. This way, it parallels the 'randomness' of the universe, and gives us the typical excuse when freak things happen.
youtube · AI Harm Incident · 2015-12-20T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgicJ8o6vgL9vHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjgjA3QBACveXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgiIRvaFLRy4BXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugi6wxkU3JS5u3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgjSjaD1amn_NHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ughy05zsMvO4YHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgjFM6BROUj5UHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UghkEkbZMbCpeXgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UggNzTObvFdx33gCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugg5W6YbwRYNMHgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"}
]
```
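The lookup-by-ID flow this page supports can be sketched as follows: parse the raw model output as a JSON array of coding records and index it by comment ID. This is a minimal sketch, not the tool's actual implementation; `code_lookup` is a hypothetical helper name, and `raw_response` is abbreviated to two of the records shown above.

```python
import json

# Abbreviated raw LLM response: a JSON array of per-comment coding records
# (IDs and dimension values copied from the response above).
raw_response = """
[
  {"id": "ytc_UghkEkbZMbCpeXgCoAEC", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_UggNzTObvFdx33gCoAEC", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"}
]
"""

def code_lookup(response_text, comment_id):
    """Parse the model output and return the coding record for one comment ID."""
    records = json.loads(response_text)
    by_id = {rec["id"]: rec for rec in records}
    return by_id.get(comment_id)  # None if the comment was not coded in this batch

record = code_lookup(raw_response, "ytc_UghkEkbZMbCpeXgCoAEC")
print(record["responsibility"], record["policy"])  # distributed liability
```

The record returned for `ytc_UghkEkbZMbCpeXgCoAEC` matches the Coding Result table above (distributed / contractualist / liability / resignation), which is how the table view can be reconciled against the raw response.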