Raw LLM Responses
Inspect the exact model output for any coded comment: look it up by comment ID, or inspect a random sample.
Comment
The statement of 'AI being far more dangerous than nukes' is absurd. It only diverts that attention and manipulates the idea that nukes are less dangerous and our focus should be more on Ai. The truth is yes Ai could become a lethal threat but only because a 'human' made it so on the long run in the future. The similar 'human' in present (at any near time) is capable of wiping part of Earth (if not all) by a press of a button. Now, which is scarier? Which is more dangerous?
BOTH ARE.
youtube · AI Governance · 2024-05-25T12:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzNbeps_syOBXZ9ddh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz0jGW1My9x-O5276l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz_FZliwbxaHDb4mKh4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugzg9sonK0PldNNH6vp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyr_buEF_2RHrMaPjh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgweovB5LHBxgcFMlyd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwfO2d4b_do4f_iIzZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxZ3a0H0yw-tJngdrB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxI-IbBzp5GcI0UJ_N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyLzBCBDxl1c10LRAt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
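Downstream, a raw batch response like the one above can be parsed back into per-comment codings and looked up by comment ID. Here is a minimal sketch; the dimension vocabularies are inferred from the sample output shown here, and the real codebook may define additional categories:

```python
import json

# Allowed values per coding dimension. NOTE: these sets are inferred from
# the sample batch above; the actual codebook may include more categories.
SCHEMA = {
    "responsibility": {"none", "user", "developer", "company",
                       "distributed", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "unclear"},
    "emotion": {"approval", "fear", "outrage", "resignation",
                "indifference", "mixed", "unclear"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response (a JSON array of coding records)
    into {comment_id: coding}, dropping records with out-of-schema values."""
    coded = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            coded[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return coded
```

For example, `parse_raw_response(raw)["ytc_Ugyr_buEF_2RHrMaPjh4AaABAg"]` on the batch above would return the `user`/`virtue`/`none`/`resignation` record, while a record containing an unknown dimension value would be silently filtered out.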