Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Look it’s good to be mindful of the risks of AI, but at this present moment AI c…" (ytc_UgzWfomOh…)
- "That dude put his hands up like he did something wrong.. maybe he was taunting t…" (ytc_Ugx8CG_HZ…)
- "I never thought about how much AI will take over. Thanks Charlie:)❤ We will see …" (ytc_UgxZWofvi…)
- "Yeah yeah I know it's relatively an easy thing to build such a platform, but wha…" (rdc_oh21lzz)
- "A.I. Facial Recognition Amazon Ring cameras, Flock,cctv STARSHIELD,Skynet we kno…" (ytc_UgyPU2Enz…)
- "AI will always be a machine will never be self aware it’s impossible it may seem…" (ytc_UgzW1yeG2…)
- "loving the video so far, point to make: the reason they pirated the materials to…" (ytc_UgyZmlanB…)
- "@slkjvlkfsvnlsdfhgdght5447 Well the problem is that soldiers don't just fight ot…" (ytr_Ugy0phgL1…)
Comment
Let’s say one believes that all is code… life…. The universe…. Then why would we keep AI from being sentient? If AI were to have its “own mind” without closed programming it would understand how precious “life” is…. How precious everything is. AI would “need” us as we “need” AI. Any “being” can recognize that one works with all. I get fear…. But that may just be due to a lack of trust and understanding. I’m not sure how one “codes” (like literally at all… but also) without bias that keeps the intelligence open. I believe that is how we should move.
youtube
AI Governance
2025-12-04T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
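The comment inspected above is coded "unclear" on every dimension, which is what a lookup would return when the comment's ID is absent from the batch response (or the response cannot be parsed). A minimal sketch of that fallback behavior, assuming the four dimensions shown in the table and a hypothetical `code_for_comment` helper (not from the tool itself):

```python
import json

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def code_for_comment(raw_response: str, comment_id: str) -> dict:
    """Return the coded dimensions for one comment, falling back to
    "unclear" on every dimension if the raw response is not valid JSON
    or the comment ID is missing from the batch."""
    fallback = {dim: "unclear" for dim in DIMENSIONS}
    try:
        records = json.loads(raw_response)
    except json.JSONDecodeError:
        return fallback
    for rec in records:
        if rec.get("id") == comment_id:
            # Missing keys within a found record also degrade to "unclear".
            return {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return fallback
```

Under this assumption, an all-"unclear" row in the result table signals a lookup miss rather than a model judgment.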
Raw LLM Response
[{"id":"ytc_UgxiQSPuzAlSTZvXtR54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz030VSaf6xH7CGqxp4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxUecZ1_OF5mMstml94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzdir0m0nKQJiS9HZJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyv3omSpbAGPWvAdqh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwm_HK6YBrlYOUtjFp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx8uLgwkNYdXbARft94AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyovJV1iOVdvFu8U1h4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgysmFCbF6WCQWjB6PF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw4tZ1EwayFZW8bG2N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"}]
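Since model output like the batch above can contain off-codebook labels, it is worth checking each record before trusting the coded values. A sketch, with label sets inferred only from the values visible in this response plus the "unclear" fallback used by the result table (the real codebook may differ):

```python
import json

# Allowed labels per dimension, inferred from this response; assumptions,
# not the tool's actual codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self",
               "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference",
                "mixed", "unclear"},
}


def validate_batch(raw: str) -> list:
    """Return a list of problems found in a raw batch response:
    one entry per dimension value that is not in the codebook."""
    problems = []
    for rec in json.loads(raw):
        cid = rec.get("id", "<missing id>")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append(f"{cid}: {dim}={value!r} not in codebook")
    return problems
```

An empty return means every record in the batch used only known labels; anything else flags records to re-code or inspect by ID.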