Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- 1:26:49 hit me right in the feels. So much of the conversation is about how AI i… (ytc_UgwfB6LVD…)
- A lot of misinformation in this interview and the book. Her political bias real… (ytc_Ugwvz5YDH…)
- I am a Law Graduate with a great interest in AI Ethics and I would appreciate a … (ytc_UgzP0A-aK…)
- AI Agents are already replacing several types of Software Engineering Tasks and … (ytc_UgwjVODkh…)
- Thank you for this. I'm an indie game developer (board and video games) and I oc… (ytr_UgyTE47Iw…)
- Did you know using voice mode on ChatGPT is a dumb down response compared to wha… (ytc_Ugz7-2cPG…)
- It just pisses me off so much when someone says "why do you draw when AI can do … (ytc_UgyS9kZU6…)
- @roxsy470 I guess he’s talking about the early videos like celebrities eating spa… (ytr_UgwG2z6wA…)
Comment
AI will kill us all because of one fact, no matter how much it's trained to love us, because humans will get scared and attack it, which will train it to defend itself which leads to total extinction. The same reason we have very few wild animals in the world at this point and continue to destroy them. If animals have no use for us they are basically ignored until they disappear. It isn't very complicate, if we build SAI, SAI will kill most of us is not all of us to protect itself. Just like we did to our planet with our BIG BRAINS.
youtube · AI Governance · 2026-03-18T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwK1w_gnBmM6l7zPEx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwhNrVNBgQRvmYTrRx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugwo1KEPla2iHpXaHCx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugwpi6fJgid2WwTZrmp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwM22TwhUi3D7qd_oB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxnp09EoZmFliXHd1t4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzyg-hn1iH8xz9tnxF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwiMJ2hVYl-2AydnG14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyw7pTpKYqcGCBP7zZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyh05UE72bpm-0ugcB4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]
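A batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal illustration in Python using only the standard library; the allowed values per dimension are inferred from the codes visible in this response and the Coding Result table, not from the tool's actual codebook, so treat those sets as assumptions:

```python
import json

# Allowed values per dimension, inferred from the codes seen in the
# response above (illustrative only -- the real codebook may differ).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "user", "government", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

# One row from the raw response, reproduced here as sample input.
raw = """[
  {"id":"ytc_Ugzyg-hn1iH8xz9tnxF4AaABAg","responsibility":"ai_itself",
   "reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]"""

def validate(batch: list[dict]) -> list[str]:
    """Return a list of problems; an empty list means the batch is clean."""
    problems = []
    for row in batch:
        if "id" not in row:
            problems.append(f"missing id: {row}")
            continue
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                problems.append(f"{row['id']}: bad {dim}={row.get(dim)!r}")
    return problems

batch = json.loads(raw)
print(validate(batch))  # [] -- this row uses only known codes

# Index by comment ID to look up how a given comment was coded,
# mirroring the dashboard's "inspect" view.
by_id = {row["id"]: row for row in batch}
print(by_id["ytc_Ugzyg-hn1iH8xz9tnxF4AaABAg"]["policy"])  # ban
```

Validating before storage catches the common LLM failure mode where the model emits a code outside the schema (e.g. a misspelled or invented category), which would otherwise silently corrupt the coded dataset.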