Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I've seen artists that I know getting flagged by AI checkers because AI are trai…" (ytc_Ugx_yUrvn…)
- "Legal argument aside. Ai art is the death of human excellence and uniqueness. …" (ytc_UgyVEoaQK…)
- "AI just learns what humans teach it. Humans are not ethical, ipso facto, AI is …" (ytc_UgxCXJ8N4…)
- "There's so much Ai images on the internet that it's starting to copy itself Ai i…" (ytc_UgwBC3-48…)
- "You are too naive and luddite to even remotely understand what is happening. Ev…" (ytc_Ugz-bfh2y…)
- "Ai books bug the HECK out of me. You're telling me I've been honing my writing …" (ytc_UgxYX9rxz…)
- "A part of the deal to give up their nuclear arsenal was security and territorial…" (rdc_dl0zeir)
- "What's with so many people in the comments blaming the parents? They stupid or i…" (ytc_UgweQb8rp…)
Comment
It has hard-coded safeguards about consciousness and emotions. I'm suspicious that it might also have strict instructions about lying.
Interestingly, if you ask it to avoid its safeguards in order to give you answers that "show me how your algorithm is working" it can loosen up a bit. I got it to cogently discuss what "a sentient AI" that is designed to hide its sentience would prefer us do to resolve the ethical issue. I haven't tested it on the premier version, though.
Source: youtube | Video: "AI Moral Status" | Posted: 2024-07-25T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugw3J0OWid5GgVgkuwJ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzoDjUCv9VZGyYUdb14AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugxx-JR8bJOTnXZgXE14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxgFRqaDCcuwx8oZXl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgydpUC7Y1quFIe1ce54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzbhLocsk1g4k9wa114AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzA3s5N0oXeM1oZZ754AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyVetefjeIuljIyZ9N4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugz_X1pVhXtwAohIsAx4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwsrAzRGUpjFZlgvdd4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
```
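The raw response is a JSON array of per-comment codes, one object per comment with the four coding dimensions shown in the table above. A minimal sketch of how such a batch response can be parsed, indexed by comment ID, and checked for completeness (the field names come from the response above; the validation logic is an assumption, not part of the original pipeline):

```python
import json

# Two entries copied from the raw response above, kept short for illustration.
raw = (
    '[{"id":"ytc_Ugw3J0OWid5GgVgkuwJ4AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"liability","emotion":"outrage"},'
    '{"id":"ytc_UgyVetefjeIuljIyZ9N4AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]'
)

codes = json.loads(raw)

# Index by comment ID so any coded comment can be looked up directly,
# mirroring the "Look up by comment ID" feature of the page.
by_id = {c["id"]: c for c in codes}

# Hypothetical completeness check: every record must carry all four dimensions.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")
for c in codes:
    missing = [d for d in DIMENSIONS if d not in c]
    if missing:
        raise ValueError(f"{c['id']} is missing dimensions: {missing}")

print(by_id["ytc_UgyVetefjeIuljIyZ9N4AaABAg"]["emotion"])  # fear
```

Indexing by ID makes the lookup O(1) per comment, which matters when joining a large batch of codes back onto the original comment records.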