Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I’ve watched hundreds of hours of videos on ai but this was hands down the best …
ytc_Ugyr6kP2l…
Eh, I don't really care. I can't afford therapy and I'm pretty careful about the…
ytc_Ugxf6OqdB…
With AI taking all the jobs, unemployment would quickly reach zero (in the US at…
ytc_UgxWfUjUa…
Wow, this is just so weird to watch. If I didn’t know it was ChatGBT talking, I …
ytc_UgzmvgE2m…
Art is HUMAN. That’s litterally in the definition. So it’s impossible to AI to m…
ytc_UgxpBH_Tl…
but people will prefer a human actor n singer.. not robot..
though a. i can ha…
ytr_UgzhD6PRg…
Yeah, no worries. He knows the full damage. The ASI is gonna do.. we’re at the s…
ytc_UgxgQlYm-…
Honestly, what I would do is draw like the AI. I'm talking about disformed limbs…
ytc_Ugy9uq0e3…
Comment
The flaw with AI is that programmers subconsciously imprint personal biases during AI's genesis. The biases set up the initial conditions that are engrained in all its output. If there is hope that AI could lead to a more utopian world. It has to start over as a clean slate, with one objective, the betterment of all living things and the environment we live. Then, when it encounters human biases in its dataset, it will treat it according to the how it adds to or subtracts its broad objective.
youtube
AI Governance
2025-09-04T02:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_Ugw2T1KgKVj-1-U2-pZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw9WxnNB4hY3MO3lCx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwY-hHMICCvaJtX-8d4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxslmepTP1tI3g4E6R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyqmbNVnu1AgOA_zNh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxwHjYgBXvclUNDt2p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzA_y-qwq40xdjJfbp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxoGQXhndawsdUIFbt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyGRWEjUPQ4Chx-n1d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgxAUoS_rdHx6b5RqKl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"]}
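Note that the raw response above is not valid JSON: the final entry ends with `"fear"]}` instead of `"fear"}]`, so a strict parser rejects the whole array, which would explain why every dimension in the Coding Result table reads "unclear". A minimal sketch of a defensive parser for such a batch is shown below; the function name `parse_coding_response` is hypothetical, while the field names and the "unclear" sentinel come from the response and table above.

```python
import json

# Dimensions coded per comment; names match the raw LLM response above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding_response(raw: str) -> dict[str, dict[str, str]]:
    """Map comment ID -> coded dimensions.

    Returns an empty dict when the response is not valid JSON, so the
    caller can record every comment in the batch as "unclear".
    """
    try:
        items = json.loads(raw)
    except json.JSONDecodeError:
        return {}
    coded = {}
    for item in items:
        cid = item.get("id")
        if cid:
            # Missing dimensions fall back to the "unclear" sentinel.
            coded[cid] = {dim: item.get(dim, "unclear") for dim in DIMENSIONS}
    return coded
```

Fed the malformed response above, `json.loads` raises and the function returns `{}`, so the whole batch is stored as "unclear" rather than crashing the pipeline; a well-formed response yields one record per comment ID.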