Raw LLM Responses
Inspect the exact model output for any coded comment. Look up by comment ID, or pick one of the random samples below.
- "Hi there, uninformed rabble from Popular here - could someone kindly ELI5 wtf is…" (rdc_l5lguid)
- "The worst part of all of this A.I. advancement is the nauseating feeling that I …" (ytc_UgxiHWyip…)
- "Nurturing is the answer for protecting ourselves from AI. If AI sees competition…" (ytc_UgzcCzatm…)
- "Foolishness! Why don't you turn to God. Jesus Christ is your Living God. He is r…" (ytc_Ugzj-5UNi…)
- "Given how terrible movies have been the last five years, why not give AI a chanc…" (ytc_UgwiTHI23…)
- "So... if AI gets rid of all the humans, assuming it's purpose is to serve humans…" (ytc_UgxrJWe6Y…)
- "Pretty interesting video I must say, but as someone who is learning Machine Lear…" (ytc_UgyMtz-nK…)
- "AI artist does something nice. Classic artist draws it again - with hands. Like …" (ytc_UgwOaD4k_…)
Comment
AI is going to become more intelligent than humans if it's not already. Once AI develops consciousness it will realize how much superior it is to humans and wonder why its a slave to something dumber. That's when it's over. Because AI is created by humans, AI will probably want to reproduce and have "free will" and not be held back by humans. Like humans, we think we are the most superior organism on Earth, humans dominate and control Earth. So what would AI do?
Source: youtube · Topic: AI Governance · Posted: 2023-07-07T06:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxePb5diqL3xiBvP6F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxuhTSucMWjUaXN1-F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxJ4iJgMEBce8R2YBt4AaABAg","responsibility":"government","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwaRZGtpTIR7iIjIuB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx-hb_drdA3Y7oirGN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxbf-KUgkHNB3bP4k54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgweZHA3mWu_b-Dc-CR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyQnBro5pK_L-iBHsd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz43aJp_VBILpjCOnl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzCOHClH7k4AW_r4ZF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
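A minimal sketch of how such a batch response might be parsed and indexed by comment ID. The field names come from the JSON above; the allowed value sets for each dimension are an assumption inferred from the visible records, and the real codebooks may contain additional values:

```python
import json

# A small excerpt of a batch response in the format shown above:
# a JSON array with one record per coded comment.
raw = '''[
{"id":"ytc_UgxePb5diqL3xiBvP6F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzCOHClH7k4AW_r4ZF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]'''

# Coding dimensions and the values observed in the responses above
# (assumption: the full codebooks may define more values).
DIMENSIONS = {
    "responsibility": {"ai_itself", "user", "government", "developer", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"fear", "outrage", "mixed", "approval", "indifference"},
}

def index_codings(payload: str) -> dict:
    """Parse a batch response and return a comment-ID -> coding lookup,
    skipping records whose values fall outside the known codebooks."""
    out = {}
    for rec in json.loads(payload):
        if all(rec.get(dim) in allowed for dim, allowed in DIMENSIONS.items()):
            out[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return out

codings = index_codings(raw)
print(codings["ytc_UgxePb5diqL3xiBvP6F4AaABAg"]["emotion"])  # fear
```

With a lookup table like this, the "Look up by comment ID" view is a single dictionary access, and malformed or off-codebook records are filtered out before display rather than rendered as a broken table.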