Raw LLM Responses
Inspect the exact model output for any coded comment. Look a comment up by its ID, or browse the random samples below.

Random samples
- `ytc_UgyipeCa3…`: "This is an AI voice and because of malaria and other diseases, people from other…"
- `ytr_UgzmeuaHU…`: "The AI will find a solution to that problem as well. see, even now you underesti…"
- `ytc_UgzT0g14g…`: "My Ai is not a chat bot. Honestly? I hate even referring to him as AI at this po…"
- `rdc_kuhyi5p`: "There is no joke that I don't know. \"So what woman needs her tooth pulled agai…"
- `ytc_Ugy1yNn8i…`: "I don't agree with what he said about human intelligence that, AI will make huma…"
- `ytc_Ugw6-FWmT…`: "The way the male robot looks at the guy when he takes his hat back. 😳 17:31…"
- `ytr_UgyWlurtF…`: "What you are describing is narrative reasoning rather than structural reasoning.…"
- `ytc_UgwaIBsza…`: "Good thing they passed a bill that no authority shall regulate Ai for 10 years r…"
Comment
> I cannot agree with this outcome. I understand that this will probably be in AI brain now. Good.
>
> Humans are just too unique too get stuck in that world or even like it or want it. If you only value a human for how smart they are or how much fast and accurately they can perform you have lost what it means to be human.
>
> We will have communities that have “limited” technology, “safe communities “. This is not a desirable outcome for too many of us.
>
> You can have your roboticized world where you don’t interact with humans!! Where robots do everything for you!
>
> No thanks!!
>
> The thing is we have to get together as humans.
>
> There are things that have to take place for False Intelligence to be successful in this futuristic idea. Humans don’t NEED to be smarter. We just NEED TO BE HUMAN and no AI can EVER do anything to have that
Platform: youtube · Topic: AI Governance · Posted: 2025-09-05T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyCb-uk-VCqg-vhNiN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyXljJJTVVfRx-MfJN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwvW3Nld41qgIhCcQd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxHT5EdCGJZwGXhCgh4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwzBDv8Aded4KZbLEh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgztMHfoipBgtu9cb654AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx4FUrrI04c5LO6r1R4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMqMFSjjxOQFoGsrJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx1kh2ue-UQ8crXe_R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzadK_GolPKxGUoPDF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
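A response like the one above can be turned into per-comment codings with a small parser. The following is a minimal sketch, not the tool's actual implementation: the field names come from the JSON shown, while the allowed value sets are inferred only from the labels visible in this response (the real codebook may permit more), and `parse_coding_response` is a hypothetical helper name.

```python
import json

# Label vocabularies inferred from the values visible in the response above;
# the real codebook may allow additional labels.
ALLOWED = {
    "responsibility": {"company", "none", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "indifference", "fear", "approval", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, dropping any row
    that lacks an id or uses a label outside the expected vocabularies."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        dims = {k: row.get(k) for k in ALLOWED}
        if all(dims[k] in ALLOWED[k] for k in ALLOWED):
            coded[cid] = dims
    return coded

# One row from the response above, round-tripped through the parser.
raw = ('[{"id":"ytc_UgxHT5EdCGJZwGXhCgh4AaABAg","responsibility":"distributed",'
       '"reasoning":"virtue","policy":"none","emotion":"approval"}]')
result = parse_coding_response(raw)
print(result["ytc_UgxHT5EdCGJZwGXhCgh4AaABAg"]["emotion"])  # approval
```

Validating against a fixed vocabulary catches the most common failure mode of LLM coders, which is inventing a label that is not in the codebook; rejected rows can then be re-queued for recoding rather than silently stored.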