Raw LLM Responses
Inspect the exact model output behind any coded comment, or look up a response directly by comment ID.
Comment
No, present AI is built on a computational level. It can't make the leap to consciousness. Intelligence by definition is consciousness. AI is not intelligent, same goes for most humans. Human error or war will eventually lead to necular war. It's all over for humanity then. Don't worry about AI as it is now. We all should be very worried about necular war & environmental disaster instead. Do we realize how close we are to extinction. The way things are going now we are just rushing towards the abyss.. The world hangs on a thin tread - Carl. Jung
youtube · AI Governance · 2025-08-29T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgwrlhcuN2oR-XvmDIx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx-PWV_49ciTwzosi54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyLM8qKHIRajkzDr6V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzCaHy5HoBDhlwKA3R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw0yNhXMsYZx7RQlb94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzlPNd7JlBx993oMeJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx79D_xrmWh9czScYl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyKB2Pez5y32DOnjCt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzJebW_hnJ8tFnG2ot4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzavKsZ8ozn5F5iqOl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
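A response in this shape can be parsed and sanity-checked before the codes are stored. Below is a minimal sketch in Python; the allowed values per dimension are inferred only from the codes visible in the response above, so treat them as assumptions rather than the full codebook:

```python
import json

# Allowed values per dimension, inferred from the response shown above.
# Assumption: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    rejecting any value outside the expected categories."""
    coded = {}
    for row in json.loads(raw):
        comment_id = row["id"]
        codes = {}
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
            codes[dim] = value
        coded[comment_id] = codes
    return coded

# Hypothetical single-row payload in the same format as the response above.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(parse_coding_response(raw)["ytc_example"]["emotion"])  # → fear
```

Validating at ingest time keeps malformed or hallucinated category labels out of the coded dataset, so a lookup by comment ID always returns codes drawn from the codebook.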