Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- @BluePaleSignal that's the real uncomfortable part about AI is the glaring indic… (ytr_Ugzgf6kwA…)
- Nothing I can prove in the AI community so I left it out of the video, but deali… (ytr_Ugz0nGhoy…)
- After listening to many of the podcasts on AI safety / existential risk I think … (ytc_Ugy5pVyQG…)
- Why has nobody even entertained the idea that Google's AI that he was testing wa… (ytc_UgxAdAUDI…)
- AI has already passed human intelligence seeing as you have come this far and ye… (ytc_UgyqFhZji…)
- Some humans are so STUPID that they invented this incredible tool that could cha… (ytc_UgxGfR2EZ…)
- @Ziro-MSCF Those aren't fixes to self driving, those are fixes to our overall sy… (ytr_Ugz1PKtF2…)
- Someone give me hope that's it's better to be an engineer than to be a pretentio… (rdc_nc3a8cm)
Comment
| Field | Value |
|---|---|
| Text | can try let HEr (AI) get sad and lonely see what will happened? she afraid lightning, loneliness and will get sad. What will happen if she really self aware alive? |
| Platform | youtube |
| Video | AI Moral Status |
| Posted at | 2022-07-29T04:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T19:39:26.816318 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyIf85QuKfDyBtMA6p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyRpsEBSifUoVyDc_F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwXRZ5ca-DXSxQnusN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyCZjYzE6twq1aVykx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyHm76GBMxf088fJrR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
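Looking a comment up by its ID amounts to parsing the raw LLM response (a JSON array of per-comment codings, as shown above) and indexing the rows by their `id` field. A minimal sketch, assuming the array shape shown here; `index_by_comment_id` is a hypothetical helper, not part of any named tool:

```python
import json

# Raw LLM response in the shape shown above: a JSON array where each
# object carries an "id" plus the four coded dimensions.
raw_response = """
[
  {"id": "ytc_UgyIf85QuKfDyBtMA6p4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyCZjYzE6twq1aVykx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM response and map comment ID -> coded dimensions."""
    codings = json.loads(raw)
    return {
        row["id"]: {k: v for k, v in row.items() if k != "id"}
        for row in codings
    }

lookup = index_by_comment_id(raw_response)
print(lookup["ytc_UgyCZjYzE6twq1aVykx4AaABAg"]["responsibility"])  # ai_itself
```

In practice the parse step would also need to tolerate malformed model output (e.g. wrap `json.loads` in a `try`/`except` and flag the batch for re-coding), since nothing guarantees the LLM returns valid JSON.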