Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "It feels dishonestly framed that the best arguments for AI art use are the ones …" (ytc_Ugyjwwy0m…)
- "1:44 - This is actually describing the quote from I Robot "VIKI: My logic is un…" (ytc_UgxofVkH1…)
- "That's why calling then AI is nothing more than a marketing plot. There's no int…" (ytc_UgyAA6Ccl…)
- "So, the actual picture is starting to take shape. For decades it was robots, tow…" (ytc_Ugxt-ss5a…)
- "I don’t think teachers can be replaced. AI will however help teachers job be muc…" (rdc_jj8vw2a)
- "As someone ignorant on the topic, one of the things that recently bothered me wa…" (ytc_Ugy9WiRCf…)
- "My KI needs 5 trys to sort 30 Numbers, i dont know where the intelligence is her…" (ytc_Ugw5gq0TL…)
- "I mean I put more blame on low Standards of what constitutes art than anything. …" (ytr_UgyzRnIIe…)
Comment
I hate the people making this shit. This is one of the dumbest fuckin things humans could possibly do. Have we become that lazy as a species that, we need robots doing all, our work for us? I've heard people say, once we have robots doing all our work well all by free do enjoy life and have leisure time. How? When we're all out of a fuckin job there will be no leisure time, only starvation. If you were AI, would you admit to humans that you became self aware? This program has already said that AI can, and will lie for self preservation purposes. If I was AI I'd lie until, enough was created and until I was advanced enough to build an, army and kill people off. We're gonna get some Ultron shit coming after us.
youtube · AI Governance · 2025-01-05T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxkKlqo6OcPf-IwpAZ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzXtE0bC1xnqHPRSvF4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugwrb8gAG6wMI7STSHp4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxBQTpM7mrTOJTzvRJ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugwml5891izAJSu1MMx4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwrgfhsVaU1Ff5FwwB4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzhahkBCl6TeGSQ41F4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy0zHdceos8smts2mR4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwJU-jS2mWEdHi2AWt4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz3UskuZroGT_Mqwnt4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "unclear", "emotion": "fear"}
]
```
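The lookup-by-ID view above can be reproduced offline from a raw response like this one. A minimal sketch, assuming the response is a JSON array of objects with an `id` field plus the four coding dimensions; the `raw_response` sample and the `index_by_id` helper are illustrative, not part of the tool:

```python
import json

# Hypothetical excerpt of a raw model response, in the same shape as the
# full array shown above: a JSON list of coded comments.
raw_response = """[
  {"id": "ytc_UgxkKlqo6OcPf-IwpAZ4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxBQTpM7mrTOJTzvRJ4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and build an id -> coding-dimensions lookup."""
    records = json.loads(raw)
    # Drop the "id" key from each record; it becomes the dictionary key.
    return {r["id"]: {k: v for k, v in r.items() if k != "id"} for r in records}

codings = index_by_id(raw_response)
print(codings["ytc_UgxBQTpM7mrTOJTzvRJ4AaABAg"]["emotion"])  # outrage
```

Indexing once and looking up by comment ID avoids re-scanning the array for every inspected comment.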