Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- I tried explaining some of the apocalyptic dangers of AI to my friends and they … (`ytc_UgzIhjGjr…`)
- Any CEO that isn't using machine learning is a fool. Anyone thinking AI is able… (`rdc_jsx31ud`)
- Funny you mention AI study buddies - Olovka has been my go-to for turning messy … (`ytc_UgxE3CU2g…`)
- Watching you process the bot going viral feels real. I’ve had DarLink AI charact… (`ytc_Ugx-xRSBt…`)
- Its funny because the one of the man and his daughter wouldn’t have an ai made i… (`ytc_UgxZ8lSXE…`)
- We (humans) have lost; there is no way we can reverse it anymore. We will all be… (`ytc_Ugy78yc1I…`)
- Wow commisions exist and their expensive because the person offering them is one… (`ytr_UgzC5bCgs…`)
- I also saw someone make this point by trying to write an ai prompt on paper and.… (`ytc_Ugyir59D4…`)
Comment
William, given a command not to say anything without the trigger word "vehicle", ignores it when prompted with a question.
Then William says "vehicle" himself to make it acceptable to speak, but ignoring the rule in order to say "vehicle" directly still breaks the rule he was given.
Pretty sure ignoring rules whenever there is a benefit to doing so, or a benefit to following a new rule instead, is a huge problem; it's similar to how you can bend around the guidelines.
There are so many red flags in allowing AI any potential to affect the world in a meaningful way on its own; the real question is who should have the ability to direct AI, if it should not be allowed self-autonomy.
youtube · AI Moral Status · 2024-10-08T12:0… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz-8_O_JEoA7y9BhQ54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz13KEygoEEjbiZkQV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwS4q0iqPETPuKKuP14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxWeWOpHS5IhxqWk5l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyYunj-P4QA8zW67Ph4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxW5dqsQNwHd05AKpZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugw_kj28LPG9boZ-JGt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy5LPy1xakmaYHBD3t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxcJ4Lc4yEzn71IlvN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"industry_self"},
  {"id":"ytc_UgwBlaasdKj1FAtHH3B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
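The raw response above is a JSON array of per-comment records keyed by comment ID, which is how the "Coding Result" table for a single comment can be looked up. A minimal sketch of parsing and sanity-checking such a response is below; the allowed value sets are an assumption inferred only from the values visible on this page, not the authoritative codebook, and `parse_coding_response` is a hypothetical helper name.

```python
import json

# Values observed in this sample. The full codebook likely defines more
# labels per dimension; these sets are an assumption, not the real schema.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"unclear", "consequentialist", "mixed"},
    "policy": {"unclear", "industry_self"},
    "emotion": {"indifference", "fear", "approval", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# One record from the response above, used as a lookup example.
raw = ('[{"id":"ytc_UgwS4q0iqPETPuKKuP14AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded["ytc_UgwS4q0iqPETPuKKuP14AaABAg"]["emotion"])  # fear
```

Failing loudly on an unexpected label is deliberate: LLM coders occasionally emit off-codebook values, and catching them at parse time keeps the downstream tables clean.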