Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Didn't Isaac Asimov solve most of these problems in 1950 with the Three Laws of …" (ytc_UgxSaXpuX…)
- "Am I the only one who considers the AI Artist mindset to be.. kind of infantile?…" (ytc_Ugwp183x8…)
- "I mean go create one yourself, you just need experience f ai making ppl lazy…" (ytc_UgwWJCbia…)
- "Finally, an intelligent conversation that actually highlights the real dangers o…" (ytc_Ugw8vRXJa…)
- "Its time to become self sufficient and eventually disown money and technology, w…" (ytc_Ugx-5ND4D…)
- "This is not a phased transition. The controllers have made a blunder of pushing …" (ytc_UgzF7pkPR…)
- "The big points, imo, on this general subject are... ...unemployment statistics a…" (ytc_UgynBrwiq…)
- "@babybatbailey03 and yet here is all this art, made by real artists... as a resu…" (ytr_UgwaCq6v9…)
Comment
Wait. Why don't we just act with superintelligent AI as we are with 200+ IQ humans? Make them work for us or let them die in a form of nihilistic loneliness idk. There are some geniuses living amongst us who gave up on using their superb intelligence and are now living peacefully on a farm doing stuff that humans do best iirc.
Those who are afraid of "AI singularity" and other apocalyptic fantasies always remind me of the climate change activists: They forget about our ability to adapt and especially _innovate_
99% unemployment? Jobs and occupations that we can't possibly predict yet are going to appear out of thin air. Stuff will happen that's going to help us humans to stay at the top of the food chain.
My father always used to say that a good business man is someone who has someone _smarter than him_ work for him.
youtube · AI Governance · 2025-09-04T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzG8qxBH8Jn1J5vjal4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxL7wjCnPov_0F2e_x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzYRJbsg_XI7_975s14AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzG1hGPP6vTfWohphJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzxSidap1D5PSLQXfx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxA8KsdJhJbLgpyjn54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzXFf67CpKwRc58r8V4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgylJ8UaXTpp7H4Ef7V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwcgEHAq3RlGR3bMIB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyjSfFzUCwSxsxCdqF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
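A raw response like the one above can be turned back into per-comment coding records with a short parser. The sketch below is a minimal, hypothetical implementation: the allowed value sets are only those observed in this sample (the project's actual codebook almost certainly defines more categories), and the function name is an illustration, not part of the tool.

```python
import json

# Dimension values observed in the sample response above.
# Assumption: the real codebook likely defines additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"indifference", "fear", "mixed", "approval", "outrage"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of records) into a
    mapping from comment ID to its coded dimensions, rejecting values
    outside the known sets."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        dims = {k: rec[k] for k in ALLOWED}
        for dim, value in dims.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = dims
    return coded
```

Validating here, rather than trusting the model output, catches the common failure mode where the LLM drifts from the codebook and invents a new label mid-batch.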