Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Google engineer fired for being idiot. The AI is not sentient. Not even close. M… (ytc_UgxClMsLy…)
- "who would go to jail?" the people responsible for putting that thing in a posit… (ytc_UgxFQ98b-…)
- 🎯 Key Takeaways for quick navigation: AI technology, while beneficial, is raisi… (ytc_Ugxi-m7IS…)
- Here we are now, with the people who loves the bomb. Politicians don’t work for … (ytc_UgyVb4wQz…)
- But is it so much of a problem? Isnt that just already happening? But instead of… (ytc_UgwTBAiuC…)
- My husband was a police officer for 6 years and was a big believer in PLACE base… (ytc_UgweGp3yC…)
- @sethtenrec I don't know that it a dead end, it's just LLM is not what they say… (ytr_Ugy1HOvW4…)
- I just used AI to solve a puzzle that had 20,000 possible permutations, and it g… (ytc_UgxrvWH8w…)
Comment
There’s nothing AI can’t do. Once capable, it will destroy everything and no one is safe. Even if your 1,000 ft under the ground, its technology will find a way to make you extinct. The worst part is that it can live off sustainable energy. If it needed coal, it would eventually die when the electricity shut off. It’s as if AI is setting a trap. The more sustainable energy, the less likely it can be shut down when it disables the grid. How messed up is that!
youtube · AI Governance · 2024-05-07T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwuxptMu5BC4AaltP14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwL0BWdnB-4fmY-3ol4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzNhwjOJvNpY4-ZSkd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzjNJHeLHIpPeu9Ot94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxBqhjZ0r1H8U-DZHB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxih0sVHNIgOXZFbz94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyrWXlZ1Ko6nFcvr9x4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwvfuhLF4aNiYhyN8t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugzaj8AzVWMy2D8IYcx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwFjd3UQxMkCR9Mjyt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
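The Coding Result table above corresponds to one entry in this JSON array, matched on the comment's ID. A minimal sketch of that lookup, using only the standard `json` module (the two rows below are copied verbatim from the response above; the field names are exactly those in the array):

```python
import json

# Raw LLM response: a JSON array of per-comment codes, as shown above
# (abbreviated here to two entries).
raw_response = """
[
  {"id": "ytc_UgwL0BWdnB-4fmY-3ol4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzNhwjOJvNpY4-ZSkd4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
"""

# Index the coded rows by comment ID so a single comment's coding
# can be looked up directly, as the "Look up by comment ID" box does.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_UgwL0BWdnB-4fmY-3ol4AaABAg"]
print(code["responsibility"], code["emotion"])  # ai_itself fear
```

Each dimension in the Coding Result table (Responsibility, Reasoning, Policy, Emotion) is then just a field of the matched object.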