Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID directly or by browsing the random samples below.

Random samples
- "I hate when AI companies use the word "safety" but what they really mean is "cen…" (ytc_UgxvTPF4a…)
- "All are stupid the robots perform from the code insert robots has no emotion if …" (ytc_UgwUMNRSe…)
- "This kind of technology being used on students is like limiting what we can do. …" (ytc_Ugyu5o4A6…)
- "Any loss for AI art is a massive dub for me. And I'm sure I'm not alone there.…" (ytc_UgwRIuMQw…)
- "The common masses need jobs, not necessarily good jobs, but jobs to keep us busy…" (ytr_UgwpleAit…)
- "Why is everyone complaining can any of you actually draw or paint. How about wor…" (ytc_UgzAx5_Bl…)
- "love this. every techbro & ai generator I've had the displeasure of interacting …" (ytc_UgyWG20oA…)
- "some one got hit and killed i think in phoenix by one of those a few years ago. …" (ytc_UgyucjEE3…)
Comment
I read an interview with Zuckerburg yrs ago. They were working to move fb more towards AI. They used machines for advanced programming and if I remember correctly, when they sat back to watch their work in progress, that’s when they realized that their systems or ‘robots’ they created began to talk to each other. The bomb that dropped next was when the humans realized that the machines created a system to communicate with each other, but in a language or code we didn’t understand. Mark said when that happened the plug was pulled immediately. Scary.
youtube · AI Governance · 2023-07-31T20:4… · ♥ 18
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxnh_0ObDi9fP3KkVx4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwcjQ_88WX0SaoLKeR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"wonder"},
{"id":"ytc_UgxDFS5o4m2uqtA8nfx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzb1g0oFeTEFpdJy-F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzY8T1fvLtKaJEkprh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgylVhbr1vlFZP0LBeh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyTYU6BPmHrkPtw6jR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwwrRuVjANEdi7FFB14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz2SjtFoKOzQqvAaid4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"indifference"},
{"id":"ytc_Ugy1GC0FuB6IksRJiAx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
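A coding result like the table above can be recovered from a raw response by parsing the JSON array and indexing its entries by comment ID. A minimal sketch in Python (the field names match the response above; the `index_by_id` helper and the inline sample data are illustrative, not part of the tool):

```python
import json

# A small excerpt in the same shape as the raw LLM response above:
# a JSON array of per-comment codings, one object per comment ID.
raw_response = """
[
  {"id": "ytc_Ugxnh_0ObDi9fP3KkVx4AaABAg", "responsibility": "company",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwcjQ_88WX0SaoLKeR4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "wonder"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw response and build a comment-ID -> coding lookup."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_by_id(raw_response)
coding = codings["ytc_Ugxnh_0ObDi9fP3KkVx4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # company fear
```

Indexing by ID is what makes the "look up by comment ID" view possible: each coded dimension in the table maps one-to-one onto a key in the matching JSON object.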