Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
The project won't be viable for one reason. And I only need one to destroy all t…
ytc_UgxpxRg07…
@yacovmitchenko1490 Which is what enables us to know that AI isn't conscious. Yo…
ytr_Ugzveb1aj…
Where is this universal income from these companies that are making bank off thi…
ytc_Ugz49SZ9t…
i dunno how anyone could ever be friends with a ai-bot they are so liberal, alwa…
ytc_UgweoOHbD…
I hate how ai will be able to do anything soon and money will be useless because…
ytc_Ugx8pajjb…
ffs, the computer learns what faces look like. If the program is fed white faces…
ytc_Uggt_f6d7…
Never happen LLM aka AI are limited by the base pool knowledge with in a databas…
ytc_UgwmrGghY…
Explain to me how this is not AI:
Revelation 13:15 "And he had power to give l…
ytc_UgyqKltWr…
Comment
Won’t these rapid AI advancements need infrastructure to work and evolve seamlessly? Unless the AI starts to develop and evolve its own infrastructure with little to no human input: possibly hyper data centres in space or on the moon, finding other ways to power and cool centres, replacing obsolete equipment, etc. The only scary bit is if loitering AI munitions and hypersonic missiles carry nuclear warheads.
Maybe Elon will be living on Mars and using AI to geo-engineer life on other planets or in spaceships. AI might make humans an intergalactic species. If the earth is eventually destroyed, the AI loses power as the data centres become obsolete or cease to function.
youtube
AI Governance
2026-03-19T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw-9SXpomtSmdydkCV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwbJ3eiw-_7f-W72Fp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyyvaPLHFUlJ632-jR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz08FUcmhALgmHIElh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx5d843vZs74Kg_gix4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzGi7Akis9Y27DivFJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz38leHfIzdi0quBCN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwrM7EJ62SHbsKgmyJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyfTY_w5xg7359YUUB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxRla7ogncULDkHxB14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
```
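The lookup-by-ID view above can be reproduced directly from a raw response. A minimal sketch, assuming the batch output is a JSON array of flat records and using a hypothetical codebook whose value sets are inferred from the visible samples (not an official schema):

```python
import json

# Hypothetical codebook: allowed values per coding dimension, inferred
# from the sample output above. These value sets are assumptions.
CODEBOOK = {
    "responsibility": {"unclear", "none", "ai_itself", "government",
                       "user", "developer", "distributed", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate", "liability"},
    "emotion": {"indifference", "resignation", "outrage", "fear", "mixed"},
}

def lookup(records, comment_id):
    """Return the coding record for a comment ID, or None if absent."""
    return next((r for r in records if r["id"] == comment_id), None)

def validate(record):
    """Return (dimension, value) pairs whose value is outside the codebook."""
    return [(dim, record.get(dim)) for dim in CODEBOOK
            if record.get(dim) not in CODEBOOK[dim]]

# Example with one record copied from the raw response above.
raw = '''[
  {"id": "ytc_Ugx5d843vZs74Kg_gix4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]'''
records = json.loads(raw)
rec = lookup(records, "ytc_Ugx5d843vZs74Kg_gix4AaABAg")
print(rec["responsibility"])  # government
print(validate(rec))          # [] -> every value is in the codebook
```

A record that the model codes with a value outside these sets would surface in `validate`, which is one way to catch malformed batch output before it reaches the results table.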