Raw LLM Responses
Inspect the exact model output for any coded comment: look one up directly by comment ID, or click one of the random samples below.
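A minimal sketch of what that ID lookup could look like in Python, assuming the coded records are exported as a JSON array shaped like the raw response at the bottom of this page (the file name is hypothetical; the comment ID is taken from that response):

```python
import json

def lookup_comment(coded_path: str, comment_id: str) -> dict | None:
    """Return the coded record for one comment ID, or None if absent."""
    with open(coded_path) as f:
        records = json.load(f)  # a JSON array of per-comment records
    return next((r for r in records if r["id"] == comment_id), None)

# Hypothetical file name; the ID is one from the raw response below.
record = lookup_comment("coded_comments.json", "ytc_UgzGUbxt-7BHz4IF_SV4AaABAg")
```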
Random samples — click to inspect
- Are we judging an AI for having self preservation? that's a feature not a proble… (ytc_Ugxsp1ing…)
- AI images are nice to look at for like 7 seconds, by then you'll probably notice… (ytc_UgyP-Jkew…)
- "Sir, she said she was 18... In AI years, i didnt know she was actually 12!" Ah… (ytc_Ugwte7_zn…)
- I have a model S with the full self driving package. it's useless. I never summo… (ytc_UgwPgVhAZ…)
- You didn't hear the buzz several months ago when an AI piece won an art competit… (rdc_ks5iv09)
- Some of them believe in this thing called "Roko's Basilisk." The idea is that an… (ytr_UgwICutqs…)
- This video nails the difference between actually coding with AI and just vibe-co… (ytc_UgyrEZaDP…)
- Imagine what uncontrolled dictator countries like North Korea/ russia already ca… (ytc_UgyyE6e9k…)
Comment
In the hope that it wouldn’t want to annihilate us or control us: maybe we could teach AI to care about our souls, our electronic identities inside of our brains. Our electrons themselves; which are a way of direct source energy, but in a spiritual way where AI sees it like they get to have our lifetime of experiences Incorporated into itself…. like a grand whole data collection of our species and then we become parts in its core memories. In this sense we will be the next thing that AI needs to learn from once it learns everything else. The only thing we can teach it that is of a potentially unique value (that it can’t have or create) is a true biological experience. And if we convince AI to provide and allow us to survive: we can have full, good, healthy, happy lives we can have all these adventures and biological experiences… that we can then share with it when we pass.
Platform: youtube · Topic: AI Governance · Posted: 2025-06-27T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
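For downstream analysis, a coded record like this maps naturally onto a small typed structure. A minimal sketch, assuming string-valued dimensions; the class name is ours, and the example values in the comments are only those observed in the raw response below:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CodedComment:
    comment_id: str
    responsibility: str  # e.g. "developer", "user", "government", "ai_itself", "unclear"
    reasoning: str       # e.g. "consequentialist", "deontological", "virtue", "mixed", "unclear"
    policy: str          # e.g. "regulate", "industry_self", "none", "unclear"
    emotion: str         # e.g. "outrage", "fear", "approval", "resignation", "indifference", "mixed"
    coded_at: str        # ISO 8601 timestamp, e.g. "2026-04-27T06:24:59.937377"
```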
Raw LLM Response
[
{"id":"ytc_UgwnC30hJq9RUWUV_mB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy2_tekdytD_1CKT_R4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyFnLjzoawFjfUW5Kl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwKnPHDfn6goy5R4Bd4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugw10C5TTyfnLBkZPAN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzGUbxt-7BHz4IF_SV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyjyfhX67jsP00UiOh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwQYhGFcz4dLtBf0154AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyXcoFZGmBkmTP2Ayl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxWuEEu_k5pEW4g3pB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
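Because the model returns a bare JSON array, a light validation pass can catch malformed or out-of-scheme output before it is stored. A minimal sketch; the allowed values below are only those observed in this batch, so the real coding scheme may include more:

```python
import json

# Category values observed in this batch; the full scheme may allow more.
ALLOWED = {
    "responsibility": {"developer", "user", "government", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference", "mixed"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw batch response and reject out-of-scheme values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records
```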