Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> Fascinating. At first he proclaims AI understands what we are talking about when we chat with it, then he questions if AI realy understands what it produces. I say as long as AI can't recognize that it's halucinating it doesn't understand anything. And to just suppress the halucinations doesn't help solving the problem of understanding. You're just hiding it.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Responsibility |
| Posted | 2025-12-20T11:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgzJD4677wXn6ZZa2BJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy_813MxAtv1gyK4u94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzNAd02qBx7Noc0mrF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwXUSXxGlVLzkXcoG54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugz3CmBvEbqmY9qZ6D54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwtnqL2wcNYfTPSUgJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyGTmI9WYL0ou-ANXp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzLdppqLlP8mQaAQyN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwYZrYtNmu4CTLbu6F4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzFo5pY00-f8IVodaJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]
```