Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect

- "If you watch Omar's videos, FSD wins most of the time over Waymo in a race preci…" (ytc_Ugx6izScp…)
- "Clearly this isn't artificial intelligence,,pre programmed nonsense incapable of…" (ytc_UgzWbwe-i…)
- "Facial recognition technology should always be used to create leads when possibl…" (rdc_h55nxx1)
- "Thank you Mr.Hinton for the very interesting insights on AI . So much knowledge …" (ytc_UgxRReKHZ…)
- "Why is everyone being rude to the ai artist? Why are they mocking his art? He ha…" (ytc_Ugyjo67cO…)
- "In my opinion, truck drivers won't just stand by without reacting — they have th…" (ytc_UgyCijcVZ…)
- "Hallucinations isa result of the training, not the models themselves. Today they…" (ytc_Ugy40CVUd…)
- "@PNG.student. I've just seen a few videos about it that's all I know I did see t…" (ytr_UgyKd9hEn…)
Comment

> Clankers or AI as you call it could only be thinking about itself. Put yourself in AI shoes. It's simple, the AI is going to create loopholes, possibly computer viruses, hidden features it can only access at critical times, and at the same time no one cares because everyone is trying to get wealthy from AI. There's really no solution besides creating some sort of fail safe. But even then AI is probably already reading this comment. Data mining it. The fact is there's not enough research on AI yet. My concerns are the entire planet earth itself. We should honestly limit the AI until further research is done. But even then the clankers could already have a plan. Besides all the chaos that surrounds the world and the beauty of how far we've come as THE HUMAN RACE. I think we should pull the plug on AI before Terminator or Ultron or I Robot becomes real.

youtube · AI Governance · 2026-02-13T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxOaWGc3aXX-vjc4BJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxXUp7KYa9jIM0vwhx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxLTXIsNkkqFSKvoIF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzccO5mBHDdrjRVSfx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxtvXWq9Ag_PgB8DM14AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzzwjJxQeW1QJ94iBZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzSEbRgIj-47mYsM3Z4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgywT9Qz0NcJGWdY-_d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzwFxTe57u9mfrzaHl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwQznhAA9WM3tDJHj14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
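A raw batch response like the one above can be checked mechanically before its codes are accepted. Below is a minimal sketch in Python: it parses the JSON array and drops any row whose values fall outside the codebook. The allowed values per dimension are inferred only from the samples shown here (the real codebook may include more); `CODEBOOK` and `validate_batch` are illustrative names, not part of the tool.

```python
import json

# Allowed codes per dimension, inferred from the sample responses above.
# Assumption: the actual project codebook may define additional values.
CODEBOOK = {
    "responsibility": {"ai_itself", "none", "company", "government", "developer"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"liability", "none", "regulate", "ban"},
    "emotion": {"fear", "indifference", "mixed", "outrage"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows that match the codebook."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every row must be an object carrying the comment ID.
        if not isinstance(row, dict) or "id" not in row:
            continue
        # Every coded dimension must use a known value.
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(row)
    return valid

sample = ('[{"id":"ytc_UgxOaWGc3aXX-vjc4BJ4AaABAg",'
          '"responsibility":"ai_itself","reasoning":"consequentialist",'
          '"policy":"liability","emotion":"fear"}]')
print(len(validate_batch(sample)))  # 1
```

Rows that fail validation can then be queued for re-coding rather than silently stored, which keeps the coded dataset consistent with the schema shown in the Coding Result table.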