Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.

Random samples:
- "A more cynical view would be that Sarah Guo has a vested interest in AI not bein…" (ytc_Ugwp4PbQG…)
- "The debate about "will AI become conscious?" or "is it already conscious??" is i…" (ytc_Ugw0g-vmd…)
- "If this one can look so real in 2025, men will start ordering their robot girlfr…" (ytc_UgxTsQIHO…)
- "If AI takes all our jobs it'd be a great thing.That would creat so much supply t…" (ytc_Ugy_ooWHu…)
- "Bro the thing with him is like ngl he's quirky and he's not practical like just …" (ytc_UgxrR6sb7…)
- "Another way to consider the let’s play analogy: why would you watch your favorit…" (ytc_UgwcoJ9mF…)
- "Our capitalist society was based on the production of goods & services. The laym…" (ytc_Ugx4o0xb-…)
- "I'm like a writer.. the role would be called "illustrator" or even "graphic desi…" (ytc_UgyQstVWY…)
Comment
AI Apocalypse is going to happen, whoever created AI. Like the telephone patent given the same day to the Patents Office in New Jersey by Bell and Grey, within a few hours, this step in humanity development was to happen. The progress of AI will be in the hands of ethics societies or in the hands of evil, or both. The population will decrease and this phenomena started before the birth of AI. Are these changes in human societies the syncronicities described by Jung ? Is an AI travelling from the future creating itself in the present ? This is science fiction, many may think. But would it be somehow a reality ....
youtube · AI Governance · 2025-06-16T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxflnY_ovUtQU31DuB4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugyumz-Al-VxkR-jZrB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzZRYUZe0uZHaYlWbF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwQgRbSpGHI_xQv3454AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzaKqy-5AEwmV1FFN94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz8BVrTD6z1hwUvcyl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw68ZfjRwDILWqDJ354AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzrSoMrETfOfdj2SBx4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwWSMM8szgpT1AG5CN4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzlN9vq7H3hVEgkAsR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
```
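Because the raw response is a plain JSON array of per-comment records, looking a coded comment up by its ID reduces to parsing the array and indexing it. A minimal sketch, assuming only the response format shown above (the `raw_response` string here holds two entries copied from that array, not the full batch):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = """[
  {"id": "ytc_Ugz8BVrTD6z1hwUvcyl4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwWSMM8szgpT1AG5CN4AaABAg", "responsibility": "government",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"}
]"""

# Index the coded rows by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw_response)}

# Fetch the record behind the "Coding Result" table above.
record = codes["ytc_Ugz8BVrTD6z1hwUvcyl4AaABAg"]
print(record["responsibility"], record["emotion"])  # distributed fear
```

The same dictionary pattern scales to a full batch response: parse once, then each "look up by comment ID" request is a single key access.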