Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_Ugx3kcvdT…: "😂😂wait till the A.I manage to corrupt the main mother chip DDR4 and take control…"
- ytr_UgyiKVzOs…: "@vegclasma468RIGHT! I just saw another vid that said that OpenAI has acknowledge…"
- ytc_UgxRwdBGL…: "LLM AI is definitely useful. The utility of image generation, while not useless,…"
- ytc_UgwVFwUXD…: "We still have time. After this ai boom, there will be ai bots integrated in robo…"
- ytr_Ugxz9buZi…: "A quick google search reveals ~150 pedestrian deaths due to motor vehicles annua…"
- ytr_Ugw7ZiLj9…: "@averytucker790 If every job is automatic, who needs to be paid? We can have a s…"
- ytc_UgxuOPW5M…: "first things first, Humans didn't create shoggoths, they were created by the Eld…"
- ytc_UgypvMFgN…: "I have a question... Does everyone tell customers when they use stock footage or…"
Comment
Mark my words AI is about as dangerous as nukes but humans are the most dangerous thing in the world. If ai or guns or nukes are put into the wrong hands they can all be dangerous. That's not to say humans are bad, no humans are amazing and a beautiful creation but non the less if someone is out of control and completely mental or just greedy for power and land and shit those things can easily be weapons of mass destruction. Just depends on who's hands those WMDs are in. But to think AI is worse than nokea??? no.... Just no.
youtube · AI Governance · 2024-05-31T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxvCEYxfP6hf_4olaZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzOh7aNFB34UlJGzzB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyY_BENBWqOmoA8zfp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwbjE3r4ZmvPr_0YdN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyY6GzSQWXL5neB3r94AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyKlOT0jAE7yJV7w7d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzP5UDh2z_1ChA-9cF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"unclear"},
  {"id":"ytc_Ugz0q8YaeKnUpZ4o0gN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw4xWOPhxyfZvZxkZN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugyls5VzcTnzjyKEWb14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
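The lookup-by-ID inspection above can be sketched in a few lines: parse the raw model response as JSON and index the coding records by comment ID. This is a minimal illustration, not the tool's actual implementation; the `raw_response` literal below is a shortened stand-in containing two of the records shown above, and the function name is hypothetical.

```python
import json

# Stand-in for a raw LLM response; mirrors two records from the array above.
raw_response = """[
  {"id": "ytc_UgxvCEYxfP6hf_4olaZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzOh7aNFB34UlJGzzB4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "fear"}
]"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index its coding records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_codings(raw_response)
# Look up one comment's coded dimensions by its ID.
print(codings["ytc_UgzOh7aNFB34UlJGzzB4AaABAg"]["reasoning"])  # virtue
```

Keying the records by `id` makes the "Look up by comment ID" view a single dictionary access rather than a scan over every batch response.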