Raw LLM Responses

Inspect the exact model output for any coded comment. Look up a response by comment ID, or browse the random samples below.
- "@Xeno056 bullshit. you can hire 100 people to train the AI for 1 year and pay th…" (ytr_Ugx8QZLNu…)
- "Scrape this. AI is an unethical, environmentally destructive system that poses h…" (ytc_Ugxq0Lxyf…)
- "AI ethics is fried ice. The very discussion of this issue demonstrates how poorl…" (ytc_UgzYINnKX…)
- "AI CEOs have every incentive to scare the public, encourage legislation that cre…" (ytc_UgyojL6fe…)
- "The hate isn’t often directed at the way the art is made, but directed at the wa…" (ytr_UgzwqXx7T…)
- "The STUPID are Always helpless. Open a.i. cannot wait to SHOW those ceos..what …" (ytc_Ugxlj1Zdd…)
- "Too late. The incredibly selfish anarchist nerds have already got their research…" (ytc_UgxfTHVMS…)
- "I wonder if the solution to an AI Universe 25 will be space travel? Universe 25 …" (ytc_Ugy-e8_gA…)
Comment
It's so much worse then that Brenie. If you follow AI researchers and try to understand it's development cycle, then it will also eventually take us over and likely wipe us out. AI is an existential crisis and devour resources. Please read AI 2027. The worst part is that this change will happen faster than anyone expects by a factor of x10 to x100. That's because we have never experienced AI lead innovation and don't understand the exponential factor of development. AI can't be stopped, but it MUST be regulated.
Platform: youtube · Video: AI Jobs · Posted: 2025-10-08T03:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxqhHhvLIrij4QjC7x4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxVmmTkPXui6HZNYpB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz_UMewRrLLCYhXBWV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy3WGauMU8jShADdad4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyeaFed62-uZhaVqbJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxun4k30lkDtHQB8n14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy7RF5T8s_iDk8oZV54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz3ZjgnSvlvKnqT2DF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyD1Ae4TrJ0tyTFlih4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz-DaSCpEipLa9v5EN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
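A response like the one above can be checked programmatically before its records are stored. The sketch below is a hypothetical validator, not part of the tool itself: it parses the raw JSON array and verifies that every record carries an `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). The allowed values are only those observed in this response; the actual codebook may define more.

```python
import json

# Dimension values observed in the response above; the real codebook
# may define additional categories (assumption, not confirmed here).
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"mixed", "consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"approval", "fear", "indifference", "outrage"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each record against the schema."""
    records = json.loads(raw)
    for i, rec in enumerate(records):
        if "id" not in rec:
            raise ValueError(f"record {i}: missing 'id'")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"record {i} ({rec['id']}): bad {dim}={value!r}")
    return records

# Example with a made-up comment ID:
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
records = validate_response(raw)
print(records[0]["policy"])  # → ban
```

A record with a value outside the schema (say, a misspelled emotion) raises a `ValueError` naming the offending comment ID, which makes it easy to send that comment back for re-coding.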