Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "What an excellent interview some well thought out excellent questions and honest…" (`ytc_UgwpCbX-F…`)
- "what a friggin great video mate! literally subscribed without even finishing it.…" (`ytc_UgzGlwG6v…`)
- "Oh my God is right, if this AI only sees posture, it’s basically judging vibes. …" (`ytc_Ugw3cJGTj…`)
- "The interesting counterpoint here is that AI doesn't replace jobs — it restructu…" (`ytc_UgyyiqhWh…`)
- "and to say that this race for AI is for the good of humanity is a fcking joke.…" (`ytc_UgzaKRZCX…`)
- "To be inspired, you need to love and enjoy the work your taking inspiration from…" (`ytr_UgyH3KZwA…`)
- "I imagine that most problems that may arise from or for AI will likely be easier…" (`ytc_UgjQVxHcT…`)
- "Yet he felt completely comfortable talking about AI when AGI was the topic, thus…" (`ytr_Ugw4YWWgd…`)
Comment
It maynot become conscious in its present digital form,, but what happens when you combine Quantum computing with Ai??
Some have suggested the possibility that human brains are quantum computers. And Ai if made powerful enough and NOT conscious is the exact reason to be afraid of it, if allowed to much control over our lives, or becoming toooo dependent on it. My Tesla isn’t conscious, but it does one helluva job “ Simulating a human driver”. The bigger question is : exactly WHAT is consciousness? Is it an emergent property from massively parallel processing? And what about the emerging bio computing that combines silicon chips with cloned human brains neurons??
youtube · AI Moral Status · 2025-07-21T16:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzD8fBjDLrqqWPHKMt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwxNhW21g0MUhKFmf94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz8Vc1EnKBngx-CM7x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyXXAm9pD-4kW0_GSJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzcdBpQKuIWq8YZmRh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwhjqFDqW1D1sDEiBd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzgqyp4BZdgC2uoztZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwZXRtSC1uLRSwb8yB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzd0Uk6PM4mFlirukp4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyU-wlihuqTcwIuoON4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
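A response like the one above is a JSON array with one coding object per comment, keyed by `id`, with the four dimensions from the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`). As a minimal sketch of how such a payload could be parsed and indexed for lookup by comment ID (the `index_codings` helper and its skip-malformed-records policy are assumptions, not the tool's actual implementation):

```python
import json

# Example payload in the same shape as the raw LLM response above
# (two records copied from it; field names follow the Coding Result table).
raw = """
[
  {"id": "ytc_UgyXXAm9pD-4kW0_GSJ4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzD8fBjDLrqqWPHKMt4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "indifference"}
]
"""

REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_response: str) -> dict:
    """Parse the model output and build an id -> coding lookup,
    skipping malformed records instead of failing the whole batch."""
    codings = {}
    for record in json.loads(raw_response):
        # Require every dimension before accepting a record.
        if not isinstance(record, dict) or not REQUIRED_FIELDS <= record.keys():
            continue
        codings[record["id"]] = {k: record[k] for k in REQUIRED_FIELDS - {"id"}}
    return codings

by_id = index_codings(raw)
print(by_id["ytc_UgyXXAm9pD-4kW0_GSJ4AaABAg"]["emotion"])  # fear
```

Indexing by `id` is what makes the "Look up by comment ID" view above cheap: one parse of the batch response, then constant-time lookups.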