Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up its comment ID or by picking one of the random samples below.
Random samples
- "Who was the first human killed by AI as it said in the clickbait. I want to name…" — ytc_Ugy-7lT44…
- "the diary of a Ceo. What's the episode "the godfather of AI". We can today rem…" — ytc_UgykffasF…
- "Ahahah. Ima tell you this. No robot is going to do what I’ve been doing all week…" — ytc_UgzlmpeZL…
- "This is why I hate using character Ai, but I don’t judge you if you do.…" — ytc_Ugxb0EM1b…
- "Banning AI for everything other than reserch is the best solution. Universal inc…" — ytc_Ugx-pOioM…
- "nice job using AI visuals on a video about how AI is some kind of evil impossibl…" — ytc_Ugw6qGhSN…
- "When Man Denies who God is, who is forever Blessed, he becomes ignorant, replace…" — ytc_UgylFSVSf…
- "In all the films which country is it that’s sets up AI that ends up destroying t…" — ytc_UgxKTb-ly…
Comment

> I believe humans are not suited to share their existence with AI. We have our own pace & timeline. AI far exceeds our limits, to the point actually damaging our biological harmony.

Source: youtube · Video: AI Governance · Posted: 2023-05-06T15:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw9TXVwjjSFNLCG5c94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwsy7OWz9o47GDwaTN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgznXMwzDCQC5nUncth4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzXcPZau39GZ3uqJhp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz2HtbVJr6LPyqMZ1V4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgygcFWmWWHKLPGi96R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwH5BXVyU4_EDmwvWR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzpRb2faheJKIJGxVl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwjhZbRVedHH8KG2X54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugycwnnv2ZzEbVJ8ioZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
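The lookup-by-ID behaviour described above can be sketched in a few lines: parse a raw batch response like the JSON array shown, index the coded records by their comment ID, and fetch the record for any given ID. This is a minimal illustration, not the tool's actual implementation; the `raw` string below reuses two IDs from the batch above purely as sample data.

```python
import json

# Hypothetical raw batch response: a JSON array of coded records, one per
# comment, with the four coding dimensions alongside the comment ID.
raw = """[
  {"id": "ytc_Ugw9TXVwjjSFNLCG5c94AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwH5BXVyU4_EDmwvWR4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

# Index the coded records by comment ID for constant-time lookup.
by_id = {rec["id"]: rec for rec in json.loads(raw)}

record = by_id["ytc_UgwH5BXVyU4_EDmwvWR4AaABAg"]
print(record["emotion"])  # → outrage
```

Keying on the ID also makes it easy to spot batches where the model dropped or duplicated a record: compare `len(by_id)` against the number of comments sent in the batch.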