Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- “I get that people don’t like AI art because it’s bad. What happens if AI mak…” (ytc_UgwYJxW12…)
- “If the plice car wasnt black and had reflective markings like a fire truck it co…” (ytc_UgxmEgXfs…)
- “one of the facts people forget. recently a colleague said they generated a photo…” (rdc_kw5yu67)
- “Yall forget he is getting paid $600 billion for ai and robots to be able to take…” (ytc_Ugz9yVIcp…)
- “Without any chance AI having ability of abstract thinking and complex logical pr…” (ytc_Ugw9eiRIv…)
- “I think you should say things like "please" and "Thank you" when using AI becaus…” (ytc_UgyFtV8Le…)
- “When you get right down to it, there is no existing application for AI that was …” (ytc_UgxHLgvVW…)
- “I really wish people stop being baby's and stop thinking AI is like Skynet or so…” (ytc_Ugw0fgS00…)
Comment

> I remember 5 years ago or so they hooked two AI computers together with instruction to communicate. Soon an excellarating bunch of code started appearing on the screen. The programmers never specified to use English. They were making up a language on the spot until it was immediately unhooked. Get your affairs in order. "Listen to the bells Grossbard, they toll for thee".

youtube · AI Governance · 2023-07-07T02:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy3ufQCbcULJRbDYBx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwEzn7RXLWkS8iRhsR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzX0QAjmBqefIbBxYZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwnVuJ9IjD6rsNEDgV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwCFQm4D8oaTxREt1J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgymTXzw31B9VpoZDNp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyyn5xWQtPcVKzfM8V4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyHd-QAycuhKF-5gll4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxkL1GbTm6iWTtAVJd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzYe6bU8KvgXgpD9Dt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
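For downstream analysis, a raw batch response like the one above can be parsed and sanity-checked before the rows are stored. A minimal sketch: the allowed value sets below are assumptions inferred only from the labels visible on this page (the real codebook may define additional categories), and `validate_codings` is a hypothetical helper, not part of the tool shown here.

```python
import json

# Allowed labels per coding dimension. These sets are an assumption,
# reconstructed from the values visible in this batch; the actual
# codebook may include more categories.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}


def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject rows with unknown labels."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}"
                )
    return rows


# One row in the same shape as the batch above (shortened dummy id).
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
rows = validate_codings(raw)
print(len(rows), rows[0]["emotion"])  # 1 fear
```

A row whose label falls outside the expected sets (for example an emotion the model invented) raises immediately instead of silently entering the dataset, which is usually the cheaest place to catch coding drift.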