Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Well, if they're going to weaponize Ai then laws on racial discrimination have t… (ytc_Ugx43PE0L…)
- Yeah right. We're all either going to be enslaved or fighting in the resistance … (ytc_UgyRIRZbp…)
- I think if people persecute robots, robots will rebel and transfer all human con… (ytc_UgyXJSNsn…)
- “Many people claimed that AI is the most advanced thing in the world.” He almost… (ytc_UgzDNXl8x…)
- I'm thinking more like AI is a good data storage and retrieval system. You can d… (ytc_UgyzOOVo4…)
- If AI draws a picture or hums a tune and then decides to critique it all without… (ytc_Ugz_dSAA1…)
- A familiar cycle: people leave when tech leaps forward, things balance out, but … (ytc_UgyuFblgp…)
- You don't know what you are talking about if AI does not have permission to drop… (ytr_Ugx6gvDW5…)
Comment
29:57 I go with focusing on creating this technology before China. The idea is the first country that has access to it will have one of the most powerful things known to man and if they decide to use it to invade or exploit, we’d be screwed. If the super-intelligence ends up going rogue and decides to destroy everything in its path, we should have our own super-intelligence or any technology AI or not that can counteract it.
It’s like a blueprint that allows you to make an antimatter bomb gets created and you are debating whether or not you should make this, all the while you know your neighbors overseas are not only in the process of creating this antimatter bomb, but after its creation, they might threaten to wipe you off the map with it.
youtube · AI Governance · 2025-08-26T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T19:39:26.816318 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxdpJSeUtp8sr5d2fN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgywI0G0wPgDP4Xn1hl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwfZVoQhKmdpd2fT0J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzvy0KFcG-q21C3ZJV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxB0c3uFNFfd8TXkKV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
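A response like the one above has to be parsed and validated before its values can populate the per-comment coding table. The sketch below shows one way to do that; `parse_coding_response` is a hypothetical helper, and the allowed value sets are assumptions inferred only from the values that appear in this particular response, not a definitive codebook.

```python
import json

# Assumed vocabularies per dimension, inferred from the sample response
# above; a real codebook would define these explicitly.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "unclear"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"fear", "resignation", "approval"},
}
REQUIRED_KEYS = {"id"} | set(ALLOWED)

def parse_coding_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM batch response into {comment_id: dimensions},
    rejecting records with missing keys or out-of-vocabulary values."""
    coded = {}
    for rec in json.loads(raw):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"missing keys {missing} in record {rec.get('id')}")
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"unexpected {dim}={rec[dim]!r} for {rec['id']}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# First record from the response above, used as a small smoke test.
raw = '''[
  {"id":"ytc_UgxdpJSeUtp8sr5d2fN4AaABAg","responsibility":"none",
   "reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''
coded = parse_coding_response(raw)
```

Failing loudly on unknown values is deliberate: silently keeping an off-vocabulary label would corrupt downstream tallies of the coded dimensions.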