Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The problem that I have with this analysis of emotion-detecting AI is that, as I…" (ytc_UgwdLQuJH…)
- "This discussion is all over the place because Miotti is the wrong guest to invit…" (ytc_UgwI-2MF1…)
- "People should get off the AI apps. The time has come. With that said the same …" (ytc_Ugx2EJipA…)
- "It's really strange being someone who is both interested in tech and art. To me,…" (ytc_UgxELWSvn…)
- "I like ai art cuz I don't have to worry about starving artist anymore I just tel…" (ytc_Ugy2whOzd…)
- "My boyfriend got into the habit of communicating with AI. Started out as a joke …" (ytc_UgwEK3md_…)
- "I would like someone to refute my position. Neural networks is not a Turing mach…" (ytc_UgxtFNiUO…)
- "That makes sense from the perspective of the present, but consider that the firs…" (ytr_Ugw6xrrd4…)
Comment
I think we would be safer if all car were self driven and only a few road outside of city and stuff were manually driven, the thing about having driver making them responsible is bullshit, if you ever have been in a accident in the night by a crazy driver, they'll jusr go away and left you there in need of help and sometime they never even catch the responsible driver...

while at least with a automated car you can track where it was, we can make it safe and it'll be the end of this stupid drunk driving or driving under substance, it's just my opinion...

I agree that only having half or even a third of car being automated is dangerous is useless thought they either need to comit or make special location for it because we can't miss mash them like the woman that got roll over, she got rolled over by an automated car after she got hitted by an human, having both on the road just gonna bring chaos we just gotta choose at some point who do we keep

and for that reason I think that safe place or dangerous road or maybe at night manual driving should be off to avoid any accident where nobody can help who know in anycase it's don't really matter
youtube
2026-02-05T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzfhubjY3_AeX4uO-d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzLIKTmQh_6EiXlE594AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx8ACDk69Ug2yakGvl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxA0py79JkdIWey16R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzdBMG8G293u0rAnnN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzQBKBBY1s7m-GjVTB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw5GhHldVL5tfdO_ux4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy4kTf7o8caZmwrUax4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxqlVGY7FYCg6o-XZp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw1nKoIB_TzROwZZpp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
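The raw response above is a plain JSON array, one object per coded comment, so looking up a result by comment ID is a single parse-and-index step. A minimal sketch (the two-entry `raw_response` string below is a stand-in for the full array; the IDs are copied from the response above):

```python
import json

# Stand-in for the raw LLM response shown above, truncated to two entries.
raw_response = """
[
  {"id": "ytc_UgzfhubjY3_AeX4uO-d4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw1nKoIB_TzROwZZpp4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
"""

# Parse the array and index each record by its comment ID.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

# Look up the coding result for one comment.
rec = codes_by_id["ytc_Ugw1nKoIB_TzROwZZpp4AaABAg"]
print(rec["responsibility"], rec["emotion"])  # distributed approval
```

The same indexing works for any batch size, since the model is prompted to return one object per input comment with the `id` echoed back.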