Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "People show their beginning art like that and act like it isn't talent. I'm defi…" (ytc_UgzWMXotC…)
- "@LeviHatesLuka nah im kidding i hate ai and i do draw prety regularly my art ju…" (ytr_UgxSM-CUK…)
- "So if u are a driver in a self driving truck and the self driving truck causes a…" (ytc_UgwwIqCTp…)
- "I feel A.I is making up these stories sometimes. Use to make sense now some of t…" (ytc_Ugyh7gY2P…)
- "17:28 Please get the phrase,"now as your attorney, I highly advise you to" imple…" (ytc_UgxU00Zr6…)
- "Thr circle of life, ai takes your job... you make bad art for ai as your job... …" (ytc_Ugw6de1XU…)
- "I can't see mich of a difference in ai learning from art to make new and a human…" (ytc_UgwNMVSfT…)
- "I met a doctor in Cambodia on holiday who was routinely saving peoples lives wit…" (rdc_dpc5k8v)
Comment
The problem here is a researcher propagating the media's darling term "AI" for the LLM and ML tools she is talking about, which have zero intelligence. A hallucination of a woman with three arms may be laughed off, but it is gibberish just as much as a gpt-written mindless word salad or a self-driving car that refuses to stop for a police officer after a traffic infraction. Important open problems, like alignment, must still be solved before we can unleash, trust and use real AI, AGI.
youtube
AI Responsibility
2025-07-27T03:3…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzfQkuwjUL-zThK9Np4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw5jaavTaZwWERcjMJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy4s7gCNrDr1Vmavpt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz40Shar1q17Ep5yfB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyZSh-eNgd3rMYOyQt4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugxw5hm9mF0a4XEhNtd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgwjFwuGVG1lzV2TcCh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz7FKAuJ7DHZNhwW994AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyEdRGjAxoJRaKu0Px4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzUpiXSf-9wi45SPiJ4AaABAg","responsibility":"user","reasoning":"mixed","policy":"liability","emotion":"outrage"}
]
```
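A batch response like the one above can be parsed and indexed by comment ID, which is how the "Look up by comment ID" view maps each coded comment back to its raw model output. This is a minimal sketch, not the tool's actual implementation; the function name is hypothetical, and the two sample rows are taken from the response shown above.

```python
import json

# Two rows copied from the raw batch response above (one JSON object per comment).
raw_response = """
[
  {"id":"ytc_Ugxw5hm9mF0a4XEhNtd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgzfQkuwjUL-zThK9Np4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

def index_by_comment_id(payload: str) -> dict:
    """Parse the model's JSON array and key each coding by its comment ID.

    (Hypothetical helper name; mirrors the dashboard's ID lookup.)
    """
    rows = json.loads(payload)
    return {row["id"]: row for row in rows}

codings = index_by_comment_id(raw_response)
coding = codings["ytc_Ugxw5hm9mF0a4XEhNtd4AaABAg"]
print(coding["responsibility"], coding["policy"])  # developer industry_self
```

In practice the model's output may not be valid JSON on every call, so a production version would wrap `json.loads` in error handling and re-prompt or skip malformed batches.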