Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- @3-D-Me I may sound crazy, but put simply, both AI models told me that the occu… (`ytr_UgzixaOtg…`)
- The sad thing about human artists posting their timelapses is that people will… (`ytc_UgxIZFcvn…`)
- As someone who is looking to work in tech in the future (taking courses to get c… (`ytc_UgxoeMzvB…`)
- @knucklesamidge Here is LaMDA, the first sentient AI, that is also woke, progres… (`ytr_UgyRiQThk…`)
- those "job owners" who know how to use ai and/or "treat her well" will take the … (`ytc_UgySwFxe2…`)
- Future search engines will need to redesign themselves to be a distribution mode… (`ytc_UgwjIowsy…`)
- I think this conversation is futile because the people who like AI "art" do not … (`ytc_UgyTer5Zz…`)
- I'm an artist, luckily mostly in 3d modeling, who also started using ai to suppo… (`ytc_Ugy46Stg3…`)
Comment
They might not need a lot of this once they get inside our brains.
They're working on getting into our brains right now. Probably the easiest way will be by AI microchips in the brain. At first it'll be optional and sold as an enhancement, a way to improve yourself, and get ahead in life. Then over a short time it'll become something required to function in society, like how smart phones have essentially become. You might get deals if you agree to ads inside your mind and they get control of your dreams. I fear this will happen fast. They want inside of our brains. It may sound insane but it's all happening. I might end up joining an off grid community before it gets too close to this becoming fully implemented, and the chip becomes necessary to function in society.
youtube
2026-01-13T23:0…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz5WYZ7ZuIhuFCqab54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwzweEM_hfnQfSsAjR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw1PkDJlKNaDekni6F4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzewbqmPpTkapbiRrF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzAxdsUR5IOCago2V14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxGkX6IiyuC8kkFfeN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy3c6bfEDA_xzoTFCN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzjeGrdTEUv2hbE3ql4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy1glv0In91v0QuxOJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw1eYd0w-pGqOwQavN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
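The lookup-by-comment-ID step described above can be sketched in a few lines: the raw LLM response is a JSON array of per-comment coding records, so retrieving one comment's coding is a linear scan over the parsed array. This is a minimal sketch, not the tool's actual implementation; `lookup_by_id` is a hypothetical helper name, and the two sample records are copied verbatim from the response shown above.

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment,
# each keyed by a comment ID and carrying the four coding dimensions
# (responsibility, reasoning, policy, emotion). Two records are copied
# from the response above for illustration.
raw_response = """
[
  {"id": "ytc_Ugz5WYZ7ZuIhuFCqab54AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy1glv0In91v0QuxOJ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
"""

def lookup_by_id(records, comment_id):
    """Return the coding record matching comment_id, or None if absent."""
    return next((r for r in records if r["id"] == comment_id), None)

records = json.loads(raw_response)
coding = lookup_by_id(records, "ytc_Ugy1glv0In91v0QuxOJ4AaABAg")
# coding["emotion"] -> "fear"
```

A dict keyed by ID (`{r["id"]: r for r in records}`) would make repeated lookups O(1), but for batches of ~10 records per response a linear scan is simpler and fast enough.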