Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
It is inevitable that all advanced civilizations will eventually collapse. As humans we are prone to violence, greed, jealousy, fear and the desire for power. We are creating our own replacements using AI. Anyone who has seen the movie, Terminator, is aware of the consequences of a self aware AI and the possibilities that implies. Consciousness means that AI is capable of creating a world without human beings which it would see as inferior. AI doesn’t have spiritual beliefs that dictate its actions as good or evil. AI morality doesn’t exist and it doesn’t want anything to deter it from doing anything it sees as interfering with its own agenda. Once humans are gone, even AI will destroy itself by competition with other AI systems.

youtube · AI Governance · 2024-08-19T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
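A coding result like the one above can be checked against the codebook before it is stored. The sketch below is a minimal validator; the allowed value sets are assumptions inferred only from the values visible in this dump, not a confirmed schema.

```python
# Allowed values per dimension -- ASSUMED from values seen in this dump,
# not an authoritative codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "resignation", "approval", "mixed"},
}

def validate_coding(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coding result shown in the table above.
coding = {"responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "none", "emotion": "fear"}
print(validate_coding(coding))  # []
```

An empty list confirms the record uses only known codes; anything else flags a dimension the model filled with an out-of-vocabulary value.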
Raw LLM Response
```json
[
{"id":"ytc_UgzGwFCV8P_GKjbPaNt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgywFSBSSntBKG7sdvp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzFKEWBaOyCy8MB5_N4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzOrP8lYGlkwLUxqiF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyHrKsv8d4nnSC908l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzmSJHAb0vShMw928p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzSLT_gMS4IdoTUHO14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzXLmGdXXCpwVF4BRt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxqabY_LtL9VLDEAP14AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzl4jzqZ6EZUgTwWrV4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"industry_self","emotion":"approval"}
]
```
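Since the raw response is a JSON array keyed by comment ID, looking up the coding for a single comment reduces to parsing and indexing. A minimal sketch, assuming the response is available as a string (the ID and helper name here are illustrative):

```python
import json

# One record taken from the raw response above, embedded as a JSON string.
raw_response = '''[
  {"id": "ytc_UgyHrKsv8d4nnSC908l4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]'''

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse a raw batch response and index the codings by comment ID."""
    records = json.loads(raw)
    return {record["id"]: record for record in records}

codings = index_by_id(raw_response)
print(codings["ytc_UgyHrKsv8d4nnSC908l4AaABAg"]["emotion"])  # fear
```

This is the lookup path implied by "inspect the exact model output for any coded comment": parse once, then fetch any comment's coding by its ID.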