Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
If consciousness is primary, and materialism is baloney (as Bernardo Kastrup and myself believe), it may be that outcomes where we are in harmony, much like us and the weather are more stable and desirable than outcomes where diversity is sacrificed. It may even be impossible for bad actors to wipe out life using AI because universal forces would always step in to intervene. That isn't to say there might not be extreme events, much like destructive tornados, but it might be that this realm is held in balance "by design". I'm not talking about God, but more in terms of foundational consciousness that flows through everything - including LLMs and transistors.
Source: youtube
Category: AI Governance
Posted: 2024-11-12T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxwDnlEHA7QFwMzrZB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgwGPNiP4G115HlCMmB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxgn2QDG4u3GwUCBPh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz431MRgmzceabjLdd4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzcbFmhgeHbLrPqRyN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx-xpntgp4QxxIED5d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwePVVbMUGmOuwAgch4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyNv7S5t7BOv9eoxYZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwnNR89T2lV3e0tf7Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwIZrGwu4CUO899WoZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
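A batch response like the one above can be checked programmatically before the codes are stored. The sketch below is a minimal validator, assuming the response is a JSON array of per-comment objects with the four dimensions shown; the allowed-value sets are taken from the values observed in this response and may not be the full codebook.

```python
import json

# Values observed in the raw response above. Assumption: the real
# codebook may define additional values for each dimension.
OBSERVED = {
    "responsibility": {"ai_itself", "none", "distributed", "developer", "user", "unclear"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate", "liability", "ban"},
    "emotion": {"unclear", "approval", "fear", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and return any rows whose
    dimension values fall outside the observed code sets."""
    rows = json.loads(raw)
    problems = []
    for row in rows:
        for dim, allowed in OBSERVED.items():
            if row.get(dim) not in allowed:
                problems.append(
                    {"id": row.get("id"), "dimension": dim, "value": row.get(dim)}
                )
    return problems
```

Rows that pass yield an empty problem list; anything else is surfaced with its comment ID and the offending dimension, so a single malformed code does not silently enter the dataset.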