Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- ytc_UgyTw1OYm…: "I have some opinions: If somehow human works will be stolen and used to train ar…"
- ytr_UgxdjRJPK…: "Its nothing to do with money, its evolution. If humans never invented money, we …"
- ytr_UgxJ2o8VP…: "I agree, most AI would be outputting content within the "grey area" of fair use …"
- ytc_UgymqdqPn…: "You got to be stupid to fight a Robot. Mmmm flesh mmmm metal. No thanks.…"
- ytc_Ugxi_HL-L…: "*User:* Hey, ChatGPT, why are you taking over so many people's jobs and making t…"
- ytc_UgyehwBV1…: "Best explanation I have heard: They are not artists, they are prompters. And as …"
- ytc_UgzfT75xk…: "Learn a skill or a trade. My wife is an RN. I'm a maintenance mechanic. People w…"
- ytc_Ugy-SiwCj…: "For an AI artwork to inspire so many derivative traditional artworks is truly a …"
Comment

> "We want the future good for humans. Because we are humans" - That philosophy alone would probably fix 99% of the worlds problems we have now, let alone the guided principle for AI. Somehow in 2023 we seemed to have forgotten this. Its not bad to be a human 🤣

Source: youtube | Topic: AI Governance | Posted: 2023-04-18T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy4Lk4Hwb_t1Ynw8eh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxEDFr0EaBAw0M-BcR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugw_Y-nGre8nh5ouNxt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxzAQbVn9bOeHw5Z554AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugz3PqkvW4b2D3Ox2WZ4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz1Ws-eE7fK_XJfnSt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugxc9USbP-7ab37vbet4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugzqec7V6Elq_7Ok28J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzSuHHXmuv4nZNzacp4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxC3MegP7vcgiFlnpx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
```
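A raw response like the one above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal illustration, not the tool's actual implementation; the allowed values per dimension are assumptions inferred only from the codings shown here, not a confirmed exhaustive codebook, and `parse_batch` is a hypothetical helper name.

```python
import json

# Assumed codebook: the value sets observed in the sample output above.
# These may be incomplete relative to the real coding scheme.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "company", "ai_itself"},
    "reasoning": {"unclear", "virtue", "consequentialist", "deontological"},
    "policy": {"unclear", "industry_self", "regulate", "ban"},
    "emotion": {"mixed", "approval", "fear", "outrage"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of codings) and index the
    records by comment ID, dropping any record with an unrecognized value."""
    coded = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[rec["id"]] = rec
    return coded

raw = """[
  {"id": "ytc_Ugy4Lk4Hwb_t1Ynw8eh4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]"""
coded = parse_batch(raw)
print(coded["ytc_Ugy4Lk4Hwb_t1Ynw8eh4AaABAg"]["emotion"])  # -> mixed
```

Indexing by ID is what makes the "look up by comment ID" inspection above a constant-time dictionary fetch rather than a scan over every batch.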