## Raw LLM Responses

Inspect the exact model output behind any coded comment, or look a comment up directly by its ID.
### Comment

> I love horses, and being with them, riding them in the county, state, national parks is worth more than any job. But, if you plan on not supporting the people who have many things they love to do, the humans will burn down the data centers, and kill the rulers. So, the rulers need to get their 💩 together. You can’t make a realistic simulation of me riding my horse. Being in the world is nothing like playing games made by AI that can trick the mind, because they would have to keep us in a coma, the Matrix using our energy. Getting us in will be a problem. That’s the story never told, destroying everyone not created in their lab.

Source: youtube · Topic: AI Governance · Posted: 2025-09-07T03:5…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
### Raw LLM Response

```json
[
{"id":"ytc_Ugw4xYDNT96nIno2a594AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz-zqNLrq-vnnw1Yn14AaABAg","responsibility":"rulers","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyjWqvxUVmYY5S_GXx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxRkQ2mGDqEQA0NYqp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyt50KGpBR2yWUQVS54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz1bJBMzNaZJcxsDt94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyzY3qWFME6yQRZ_WZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwxccLFQxLK-gihCB54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxBDAMGUFozP3VSloh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzi02V91H2qzV-587R4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"}
]
```
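A batch response in this shape can be turned into a per-comment lookup with a few lines of Python. This is a minimal sketch, not part of the coding pipeline itself: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) follow the JSON above, while the `index_by_id` helper and the trimmed two-entry sample are illustrative.

```python
import json

# Batch of coded comments as returned by the model (same shape as the raw
# response above; trimmed to two entries for the example).
raw_response = '''
[
  {"id": "ytc_Ugw4xYDNT96nIno2a594AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz-zqNLrq-vnnw1Yn14AaABAg", "responsibility": "rulers",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
'''

def index_by_id(response_text: str) -> dict:
    """Parse a raw batch response and index the codings by comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_by_id(raw_response)
coding = codings["ytc_Ugz-zqNLrq-vnnw1Yn14AaABAg"]
print(coding["policy"])   # -> regulate
print(coding["emotion"])  # -> outrage
```

Indexing by ID makes the "look up by comment ID" step a constant-time dictionary access rather than a scan over the batch.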