Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Then why can’t it remember details from a previous conversation.. I’m still impr…
ytc_UgwajXa5s…
I am so glad I’m not the only one saying thank you and complimenting AI when I a…
ytc_Ugy7_oJiG…
Man, it’s so hard. I don’t know which one’s AI bro I’m gonna go clip one.?…
ytc_Ugyx803mO…
When you hear a very young man say "T-Seven-Two Tanks" instead of the "T-Seventy…
ytc_UgznhmfRA…
2025 is the year A.I. started to compete with humanity for resources. It's not e…
ytc_UgwIkBcKn…
How do we know it's conscious?
Measure it's power consumption. if it rise and yo…
ytc_Ugwz4UY8a…
LLMs will not get to AGI. It will need to be mixed with new technology like quan…
ytc_Ugz7S_oeF…
What they need is a fund that AI companies pay into that pays royalties to anyon…
ytc_UgxBkqu7M…
Comment
There will come a time when people will want reality & real people more than AI fake people & fake events. Real will be more desired than fake AI. There is a big problem when you take out people & there knowledge out of everything. If people are out of work, starving, losing their homes & careers, then people will do what is necessary to survive & that is to dump artificial intelligence controlling their lives.
youtube
AI Governance
2025-09-07T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgybNSDd1G1YJOQwUWp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugw4lLcsUKY2vSw8cPh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwHf0wL1Qvgt4HDZmt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxlVrCCO4IN9zQoxXR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwMlxxQ0v9by2ntj914AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz6UUY6M8jB3kY3EpZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwSZ_moPgm2JYwPlkF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwrp0rs75CxIALr0Tt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwiokYXzzypKYpEgxx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzYT_x0_xeO-dP0iIN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
```
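A response like the one above can be turned back into per-comment records with a small parser. The sketch below is a minimal illustration, not the pipeline's actual code: the allowed vocabulary per dimension is inferred only from the values visible on this page (the real codebook may define more categories), and `parse_coding_response` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension, inferred from the values shown in
# this dashboard (assumption: the actual codebook may be larger).
ALLOWED = {
    "responsibility": {"company", "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of records) into a
    mapping from comment ID to its coded dimensions, dropping records
    with missing fields or out-of-vocabulary values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id", "")
        if not cid.startswith("ytc_"):
            continue  # skip records without a recognizable comment ID
        dims = {k: v for k, v in rec.items() if k != "id"}
        # Keep only records that carry exactly the expected dimensions
        # with values from the allowed vocabulary.
        if set(dims) == set(ALLOWED) and all(
            dims[d] in ALLOWED[d] for d in ALLOWED
        ):
            coded[cid] = dims
    return coded

raw = (
    '[{"id":"ytc_UgwHf0wL1Qvgt4HDZmt4AaABAg",'
    '"responsibility":"distributed","reasoning":"consequentialist",'
    '"policy":"ban","emotion":"outrage"}]'
)
result = parse_coding_response(raw)
```

Validating against a closed vocabulary catches the most common failure mode of LLM coders, namely inventing a plausible-sounding label outside the codebook, without rejecting the whole batch when one record is malformed.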