Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I am not worried about the unintended outcomes of creating AI. I am worried abo…" (`ytc_Ugz3LKH-d…`)
- "Decisions? Fucking AI does not have a conscience or will; it is fucking code and…" (`ytc_Ugw3cKri7…`)
- "sam not recognizing a lot of the characters because he touches grass is so real.…" (`ytc_UgwtI-e91…`)
- "Welcome to the new era of utter deception!! The image of the beast is made to ta…" (`ytc_Ugw9iglDq…`)
- "Thanks for helping focusing on this important topic. I also think SAGI is the e…" (`ytc_UgyuIljFs…`)
- "He “acts” naive when knows exactly how all this AI means more control for his su…" (`ytc_UgyQYjebg…`)
- "Dzień Dobry. People who rule AI will deprive other people of work - what will ha…" (`ytc_UgxaD_i-B…`)
- "@geneherald8169 im truly curious. Cause how can a AI robot perform human duties …" (`ytr_UgxgoJZZK…`)
Comment

> Hey everyone, I know some of you will be worried just like myself, and some will be wondering what they can do to slow this down. Have a look online, there are a few organizations made up of people like you and I, and that are pushing for a temporary hault on AI until we can figure out exactly how to progress safely. PauseAI and ControlAI are both working together to bring about policies and legislation to help regulate these massive companies. The only way to guarantee that nothing changes is by doing nothing. So let's do something!

Source: youtube
Posted: 2026-04-25T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzy-jXnyltTyf37G954AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxxwm0ns5YS1_BT6m14AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxQH1OMI6i-nlhclmV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxxxBwDda0FQoTddyB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyHIaahT1aHDyTAnxF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxVNLPsWDiCbw2Dt6J4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw9oM7c120yt0jfp2l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyxyq-DFFF0Rh5nsr54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgywndkvG_mlIDDIyel4AaABAg","responsibility":"government","reasoning":"mixed","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgztwQYu4NjXvCfCNh14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
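The "look up by comment ID" step can be sketched as a small parser over a batch response like the one above. This is a minimal sketch, not the tool's actual implementation: `raw_response` holds two records copied from the response above, and `index_codings` is a hypothetical helper name.

```python
import json

# Two records copied from the raw LLM response above (illustrative subset).
raw_response = """
[
  {"id": "ytc_Ugzy-jXnyltTyf37G954AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugxxwm0ns5YS1_BT6m14AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

# The four coding dimensions shown in the Coding Result table, plus the ID.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def index_codings(raw: str) -> dict:
    """Parse a batch coding response and index it by comment ID,
    dropping any record that is missing an expected dimension."""
    records = json.loads(raw)
    return {r["id"]: r for r in records if REQUIRED_KEYS.issubset(r)}


codings = index_codings(raw_response)
print(codings["ytc_Ugzy-jXnyltTyf37G954AaABAg"]["policy"])  # → regulate
```

Indexing by ID makes the inspection lookup O(1) per comment; the `issubset` guard simply skips malformed records rather than failing the whole batch.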