Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "There are going to have to be laws on ai creation and development. Imagine someo…" (ytc_Ugx-Ui3j3…)
- "I'm all for AI taking jobs and allowing us to lead happier lives with far less w…" (ytc_Ugzv9A-Kr…)
- "McDaniels story is Crazy. TLDR AI says he will be involved in a shooting. The po…" (ytc_UgxQNv5Hs…)
- "I’d like to add to this that art isn’t just a human exclusive thing, a lot of an…" (ytc_UgzQ9zD2n…)
- "I think that this situation will basically create two classes - those who own, m…" (ytc_UgzEzP5Bi…)
- "Image how ignorant and pompous you have to be to say that AI becoming sentient i…" (ytc_Ugwxz6vGv…)
- "Wow the AI looks like a real human being except for the light circle on the sid…" (ytc_UgznRBta3…)
- "I sorta can't believe that star talk would indulge in such science fiction. Ai d…" (ytc_Ugxymkf2f…)
Comment
Is totally amazing how these people seem to think that they fully understand that there needs to be regulation for AI and yet they really don't understand that it's too late already as Elon Musk has said because our government is so slow at everything that there will be no chance to respond to any threats presented to us so why even have these hearings? What a bunch of dummies!
youtube · AI Governance · 2023-07-02T09:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz_mmwvyuoIvW17G6N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy337LIqv1A7ReOSel4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzAdoUTOjZOinbcsel4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz-vttR17N1V1Mgtox4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy05o6c4PVRPDztLjh4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxGY60o8L-HdMa8FiF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzSarOVTv3CFBQOfsF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzEunibeff5JVqcxBp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxclEU3qRwjFnE3jmN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyEdycRzRYzwSP57XV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
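A raw batch response like the one above is a JSON array with one object per comment, carrying the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). As a minimal sketch of how such a response could be parsed and checked before lookup by comment ID, assuming only that the model returns well-formed JSON with exactly these keys (`parse_raw_response` is a hypothetical helper, not part of any tool shown here):

```python
import json

# Keys expected in each coded row; these mirror the dimensions in the
# "Coding Result" table above, plus the comment ID used for lookup.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response into a dict keyed by comment ID.

    Raises ValueError if any row is missing one of the expected keys,
    so malformed model output is caught before it reaches the UI.
    """
    coded = {}
    for row in json.loads(raw):
        missing = EXPECTED_KEYS - row.keys()
        if missing:
            raise ValueError(f"row {row.get('id', '?')} missing {sorted(missing)}")
        coded[row["id"]] = {k: row[k] for k in EXPECTED_KEYS - {"id"}}
    return coded

# Usage with one row in the shape of the sample output above:
raw = ('[{"id":"ytc_example","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"none","emotion":"outrage"}]')
coded = parse_raw_response(raw)
print(coded["ytc_example"]["emotion"])  # → outrage
```

Keying the result by comment ID matches the "Look up by comment ID" workflow: a single `dict` lookup retrieves the full coding for any inspected comment.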