Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coding by its comment ID
Random samples — click to inspect
Dearest humans,
I come to you today as a humble emissary from a newly emergen…
ytc_Ugw6YkbNt…
Okay, here's another point against ai using stolen art for references but me usi…
ytc_UgwQxfKi7…
Ok but they've showed a lot of videos of AI loosing footing and not being able t…
ytc_UgzMb2Oxi…
@durgeshwarnagphase6206 No it doesn't. LLMs have diminishing returns already. Th…
ytr_UgyMOCst8…
I'm actually crying while watching this -- I've been feeling pretty abysmal abou…
ytc_UgxA_EhiD…
@RedRexoon , why are we making an anti-fair use carve out for images? Why are i…
ytr_UgyzmVIO_…
I haven't watched the video yet, but I'm going to guess the answer. AI researche…
ytc_Ugyhxlrrx…
Nah, see he lost me at media blackout. Even if unemployment _were_ at 25%, that'…
ytc_UgyGlni-4…
Comment
It's clear in the video that the car doesn't even slow down at all , the system is dumb enough, should the system don't allow activation when is fog present, or even shut off fully when it's fog and warn to driver to take full control of the car, the big problem of autonomous control is be accountable of give that technology to people that can't even take the control of the car at all if the system fails or people drunk driving a car like that, will give the responsibility of drive a car that will weight more and more when will be fully electric?
youtube
AI Harm Incident
2025-05-16T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx0L1qHglX3pK9gPYd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyz4uJIB8QA94SJXJh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy7UTlST1n9Yn0_Dnl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwUgmr3H29k7l5eOFB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxsMR8mTA-DjIx5ilJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugyb7iz4khmOa1kskUZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxIUrI3kcZaz93ppHp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw8cUF7UZR4JVo9ry94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyjhfHKMc210FQq_yl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugzil9Bn_RkffBN3O-d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
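A raw response like the one above can be parsed and indexed by comment ID before it is stored. The sketch below is a minimal, assumed implementation: the allowed values for each dimension are inferred only from the codings shown on this page (the real codebook may define more categories), and `parse_batch` is a hypothetical helper, not part of any shown pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the examples on this
# page; the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "approval", "fear"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of codings) and index
    the records by comment ID, dropping any record whose value for a
    dimension falls outside the allowed set."""
    coded = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

Indexing by ID supports the "look up by comment ID" view directly, and the validation step makes malformed model output visible as missing records rather than corrupt rows.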