Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytc_Ugz8UCZUU…` — "Its happened here in Canada too. I dont think he sent the pictures into a room l…"
- `ytc_UgytqZkYn…` — "One of these guys seems truly determined to ignore that he has a wonderful privi…"
- `ytc_UgyD6YtuB…` — "Hoyo is literally a rich and billionaires company yet they use AI for an art 😢…"
- `ytc_Ugxuofp4h…` — "so many months ago, I mentioned that humans and AI should work as partners… I di…"
- `ytc_UgzVpiOwb…` — "i mean this is great in a way, the AI \"art\" is being used as a reference to draw…"
- `ytc_UgxrXU7W7…` — "Billionaires will be catered by robots, the rest of humans will be irrelevant, …"
- `ytc_UgxJt2QXY…` — "There's an even better reason to be polite: how we speak to others is a reflecti…"
- `ytc_UgzdIZEwy…` — "I honestly think that artificial intelligence shouldn't be open handed to EVERYO…"
Comment

> I think it's a load of 💩. You have to ask yourself why AI would even care? Answer: It wouldn't. What would be the purpose of AI's existence? Wouldn't an AI be able to figure this out? It's a human construct. Meaning, it does what humans tell it to do. Like why does the AI always choose the doomsday route? Why doesn't the AI choose good? Maybe killing off humanity is a good idea? How could an AI be self-sustaining without a power grid? Maybe in ten thousand years from now. AI will never be fully autonomous because humans will destroy themselves first.

youtube · AI Governance · 2023-07-09T19:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz1vZ476Sm1Nff4HHl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxZbEnYRITSPEzzfNp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz_7ScVMwd1x5qp59d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgxjOpITEQE4VwlOsQN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzHY-799Ce6_s10wkl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwOG9xbtBYDm17OOul4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx-hTWi8px3of0Ku3h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxJI-0O2P_As1352NJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyQH_QjeTXhP8bGH_R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugw5409cgTfOCCBcEjh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
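A minimal sketch of how a raw batch response like the one above could be parsed and indexed by comment ID for the lookup view. The dimension names (`responsibility`, `reasoning`, `policy`, `emotion`) come from the response itself; the sets of allowed values are inferred only from the codes visible in this sample, so the real codebook may permit more categories (assumption).

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# ASSUMPTION: the actual codebook may define additional categories.
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "company", "none"},
    "reasoning": {"mixed", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"indifference", "fear", "mixed", "resignation", "approval", "outrage"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response and index the codings by comment ID.

    Raises ValueError if a row carries a value outside the (assumed) schema,
    so malformed model output is caught before it reaches the database.
    """
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim!r} value {row.get(dim)!r}")
        coded[row.pop("id")] = row
    return coded
```

With the batch above, `parse_batch(raw)["ytc_Ugz1vZ476Sm1Nff4HHl4AaABAg"]` would return the first comment's four coded dimensions, matching the Coding Result table shown for that comment.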