Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coded comment by its ID, or inspect one of the random samples below.
- "I am sorry, but if you cross the roads like that you are asking for it!" — NVIDIA s… (ytc_UgyixYXJW…)
- "Terminator robots will make humans slaves 😢 the robot is tired 😩 to do the same …" (ytc_Ugyka-VlY…)
- "Oof. Just what we needed, another data-sorting strategy which doesn't present an…" (ytr_UgxoYm-F6…)
- "The argument makes no sense, artist's art is way different from AI art. Especial…" (ytc_UgzeUdfie…)
- "My brother played with the AI app..ChatGPT is a ShitLib folks.. doesn't even try…" (ytc_UgxtXhme8…)
- "capitalism lives by innovation. if there is no innovation people dont need to pr…" (ytc_Ugxgke9ak…)
- "Everytime we ask AI what it would do or how it feels about humans we are teachin…" (ytc_Ugym3ljHt…)
- "a lot is just part of the marketing hype. these models are still not "intelligen…" (ytc_UgxhxjhDz…)
Comment
@marcfruchtman9473 Appreciate your comment. Yes, we've had decades to think about it, but I believe most people didn't think it was actually possible and if it was, it wasn't coming any time soon. And then...boom...it's here. That's why people are talking about this now.
AI for self driving cars is not the same as AGI. A model trained to drive a car is limited in scope and the physical world, whereas AGI is not.
youtube · AI Governance · 2023-03-30T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytr_UgwvtUjccGFfPIV6nwZ4AaABAg.9nrnZpaNGkR9nsY-cBEF5A","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwvtUjccGFfPIV6nwZ4AaABAg.9nrnZpaNGkR9nsbW4K7MBw","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrn4m0trjv9nruHw0PaSq","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9nrle7iFSFV","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9ns6RZtoKDO","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9nt0NShWS_s","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9ntJ7Qv2sCu","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9ntlXtisU_-","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytr_Ugw6EtfFGqbU3EKNXFx4AaABAg.8ebBLFhnP-u9TQaU28JdPc","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgxyULC5OslX0G74cJx4AaABAg.8eZkIXf7xt38e_xmX9IADA","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
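The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response might be parsed and validated before storing the codes — the field names come from the output above, but the allowed value sets are an assumption inferred from the values that happen to appear, not the full codebook:

```python
import json

# Allowed values per dimension, inferred from the sample output above;
# the real codebook may define additional values (assumption).
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "none", "unclear"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"fear", "indifference", "resignation", "approval", "none"},
}

def parse_coding_response(raw: str) -> list:
    """Parse one raw LLM response into a list of validated per-comment codes."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coding records")
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing comment id: %r" % (rec,))
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError("%s: unexpected %s value %r" % (rec["id"], dim, value))
    return records

# Usage with a single hypothetical record shaped like the output above:
raw = ('[{"id":"ytr_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
codes = parse_coding_response(raw)
print(codes[0]["emotion"])  # fear
```

Validating each record on ingestion catches the two common failure modes of LLM coders — malformed JSON and out-of-vocabulary labels — before they silently enter the coded dataset.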