Raw LLM Responses
Inspect the exact model output behind any coded comment. Look up a response by comment ID, or click one of the random samples below to inspect it.
- "Ai has a humans working for it already without releasing it. Tech is humans down…" (ytc_UgwWCZasC…)
- "Also, if you want to actually make art... crap garbage person made art is more i…" (ytr_Ugy62JtPY…)
- "I think David himself is so alone or psycho...that even to take groceries..he ma…" (ytc_UgzgMWIkx…)
- "Just prompt to chat GPT: ''You are my job retraining instructor: please prepare …" (ytc_Ugy32gynL…)
- "That sounds like a fun idea! Just remember, even robots need breaks to ensure th…" (ytr_UgyrAJ2Re…)
- "I hate the stupid answer of "learn a trade" when it comes to AI taking jobs... w…" (ytc_UgyU3-n3K…)
- "5:26 ok that part can be explained, I agree this is prob ai but loid’s chest onl…" (ytc_UgxI_V-Oc…)
- "Please be advised that this user's account is now managed by an AI system, and t…" (ytc_Ugwky9llh…)
Comment
I get the feeling they are making more of this than it is. Not the danger, it's dangerous because it can create things quickly from what it is fed. You can make tons of propaganda etc. But its not intelligent in a sense. It's still a computer doing computer things but in a more naturally digestible useful way. It's like a fancier google search. You can train it to pretend to be intelligent and do things based on interactions of code you don't understand but are still deterministic or probability based. Its like going from single shot rifles to machine guns. AI lets you do a lot with less effort but its still fancy code.
youtube
AI Governance
2023-05-08T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxkmgu4wHDO7WOaWJN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz4nLrBiZRS6lxWgL94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxFdjQEKt6XUA2cagN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwef1fu7gKBX32HZoh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwaTYfDt5i4bTlmMRt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugz4IKKELWOqeHbrvLV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwXi4BpA49FSsLdbNp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwY--1hBEmHBDc1-7R4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz1xmDnkdtoxTPtWGN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyfkyu4b1vQw6c3YAJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
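A raw response like this is typically parsed and checked before the labels land in the coding table. A minimal validation sketch in Python, assuming the allowed label sets below (they are inferred from the responses shown on this page, not taken from the project's actual codebook, which may define additional values):

```python
import json

# Allowed values per coding dimension. Assumption: inferred from the sample
# responses above; the real codebook may permit labels not seen here.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "government", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed", "unclear"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject any record with an unknown label."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {rec.get(dim)!r}")
    return records
```

Rejecting out-of-vocabulary labels early keeps a model's occasional free-text answers (e.g. "society" for responsibility) from silently entering the coded dataset.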