Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `rdc_emu722f`: I think that's not unusual for company funded research. And I mean, it's kinda f…
- `ytr_UgzTf8OaV…`: This is a false equivalency fallacy. Yes, human brains are computable. And yes…
- `ytc_UgxdMAf1f…`: Rushing to develop AI is where the true downfall is. Really talented hackers wi…
- `ytc_Ugz7Leqon…`: Soo it literally says full self driving supervised. I got a Tesla and it drives…
- `ytc_UgzpXszzn…`: (hey, i know there are alot of, yknow, spelling/grammer issues in here, but plea…
- `ytc_Ugxdb-ONH…`: Oh heck no 😮 Taking away jobs via automation while forcing women to have babies …
- `ytc_UgzR8uemM…`: My local Walmart implements AI for various tasks. We were told that it's a pilot…
- `ytr_UgxV6pE8m…`: "> or anything else related to the field" Some keywords for papers he's written i…
Comment

> "hallucination" is a reference term for when an ai program doesn't know or have the data to complete a task but uses what it does have to "imagine" what it "thinks" might fit the gap.

youtube · AI Harm Incident · 2025-08-13T02:2… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
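Each coded comment is stored along these four dimensions. A minimal validation sketch, assuming the value vocabularies are exactly those observed in this page's sample (the real codebook may define more categories, and `check_coding` is an illustrative helper, not the tool's actual code):

```python
# Category values observed in this sample; the real codebook may be larger.
OBSERVED = {
    "responsibility": {"none", "ai_itself", "developer", "distributed"},
    "reasoning": {"unclear", "mixed", "deontological", "consequentialist"},
    "policy": {"none", "unclear", "liability", "regulate"},
    "emotion": {"indifference", "mixed", "outrage", "fear", "approval"},
}

def check_coding(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside OBSERVED."""
    return [dim for dim, allowed in OBSERVED.items()
            if record.get(dim) not in allowed]

# The coding shown in the table above passes:
print(check_coding({"responsibility": "none", "reasoning": "unclear",
                    "policy": "none", "emotion": "indifference"}))  # prints []
```

A non-empty return value flags a record whose value was not seen in this sample, which is useful for catching model outputs that drift from the coding scheme.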
Raw LLM Response
[
{"id":"ytr_UgwI41YU0VnRJRs2asV4AaABAg.ALglJix9t3FALk9DlFuNoV","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgwI41YU0VnRJRs2asV4AaABAg.ALglJix9t3FALkKV4YMbAT","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwI41YU0VnRJRs2asV4AaABAg.ALglJix9t3FALns0JgnW0u","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugymz11Qrw9ZffK0g-J4AaABAg.ALeh_vGSvmhALeiCdxyY8u","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgzvQY7aoj7tpgpDA6F4AaABAg.ALdxDuKEmk-AMQHHme7zAY","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxMkRvrFqtibCKRJsR4AaABAg.ALWomSbKikgALWpIYrgRoM","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgwuKG5OyDpCKFQWsxB4AaABAg.AL9oG65w0xGALSH9lBO5tY","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgwuKG5OyDpCKFQWsxB4AaABAg.AL9oG65w0xGAL_j8YlHoYs","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxASsktvLEwAjh2H754AaABAg.AL8i-Z65gSiALhAoCTF0N_","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgxASsktvLEwAjh2H754AaABAg.AL8i-Z65gSiAMGzk8b4UjV","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
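The raw response above is a single JSON array with one record per coded comment, keyed by comment ID. A minimal sketch of the look-up-by-ID step, assuming only the record shape shown above (the two records are copied verbatim from the sample; a real response carries the full batch):

```python
import json

# Raw batch response as emitted by the model: a JSON array with one
# record per coded comment.
raw = '''
[
 {"id": "ytr_Ugymz11Qrw9ZffK0g-J4AaABAg.ALeh_vGSvmhALeiCdxyY8u",
  "responsibility": "developer", "reasoning": "deontological",
  "policy": "liability", "emotion": "outrage"},
 {"id": "ytr_UgzvQY7aoj7tpgpDA6F4AaABAg.ALdxDuKEmk-AMQHHme7zAY",
  "responsibility": "ai_itself", "reasoning": "consequentialist",
  "policy": "regulate", "emotion": "fear"}
]
'''

# Index the batch by comment ID so an individual coding can be inspected.
codings = {rec["id"]: rec for rec in json.loads(raw)}

rec = codings["ytr_Ugymz11Qrw9ZffK0g-J4AaABAg.ALeh_vGSvmhALeiCdxyY8u"]
print(rec["responsibility"], rec["emotion"])  # prints "developer outrage"
```

Indexing by ID is what makes the per-comment view above possible: the page fetches the stored batch response and selects the record whose `id` matches the inspected comment.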