Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below:
- "Let’s close these AI companies if it’s going to cause more harm than good what’s…" (ytc_UgzD9dMhL…)
- "Bro imagine seeing a Tesla robot with a cowboy hat and a Tommy gun running at yo…" (ytc_UgwjjlKYF…)
- "@laurentiuvladutmanea abundance of art and low cost to create them is the benefi…" (ytr_Ugy1yHd92…)
- "This is not good!! I Robot and terminator all in one... Stop creating the very t…" (ytc_UgzQ4bofw…)
- "The suffering in the world is not because God is absent, but because humanity ha…" (ytc_UgyFQdedI…)
- "@jgreat4785great point on the investment. I don’t think all computer based jobs …" (ytr_UgwhyUX56…)
- "The job market is frozen because of tariff chaos, not AI. Most companies not act…" (ytc_UgzfN4Kdt…)
- "Thank you ANTHROPIC!!!! This doesn't usually happen. Big Tech looking out for Am…" (ytc_UgwonFaZi…)
Comment

> Smart people say - we will build the ASI to be obedient and subservient to humans. We will always be in control of the ASI. Well we see examples recorded on video of two AI assistants talking with each other over a phone. They quickly worked out that they were both AI and switched to communicating in Gibberlink mode. Then there were several examples of AI's developing their own language to communicate between each other in such a way where the creators of the AI could not understand the communications. So you are naive if you believe that we can design them to be obedient and subservient to humans.

youtube · AI Governance · 2025-06-20T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxx63eaFH78vLmyB_V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzHSUvoJc1mkiiKZUJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzKkSKnF_-g54JS3-V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxKkgBcUTICdLWP-7F4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxBMuxOpdvpLSq02gt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwBQA5sY2bUvOfIAIl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzg9Q3LMcJyN67slL14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgymtPpkvcfJ9W1J-Wp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwpCbX-F3n3pvAffIx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxvj3FGOASwhbc-JhJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
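A raw response like the one above can be parsed and indexed by comment ID before any coding result is displayed. The sketch below shows one way to do this in Python; the allowed values for each dimension are inferred only from the sample records shown here, so the real codebook may include additional categories (an assumption, not the pipeline's actual schema).

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# The full codebook may define more categories (assumption).
SCHEMA = {
    "responsibility": {"none", "company", "developer", "government"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "outrage", "indifference", "approval", "mixed"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response and index validated records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Collect any dimension whose value falls outside the known set.
        bad = [f"{dim}={rec.get(dim)!r}"
               for dim, allowed in SCHEMA.items()
               if rec.get(dim) not in allowed]
        if bad:
            raise ValueError(f"{rec.get('id')}: unexpected values: {', '.join(bad)}")
        coded[rec["id"]] = rec
    return coded

# One record taken verbatim from the raw response above.
raw = ('[{"id":"ytc_UgxKkgBcUTICdLWP-7F4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
coded = parse_response(raw)
print(coded["ytc_UgxKkgBcUTICdLWP-7F4AaABAg"]["emotion"])  # fear
```

Indexing by ID is what makes the "look up by comment ID" view cheap: once the response is parsed, each coded comment is a single dictionary access.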