Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
If AI can destroy us,maybe it's God's way of making sure we perish by our own in…
ytc_Ugxh6Pfzn…
do you want a robot uprising, cause this is how you get a robot uprising…
ytc_Ugz9NyFkn…
isn't the term" super intelligence" in as of itself at the very least speculativ…
ytc_UgwWQ9h_y…
>To be fair
Not really.
Although first world countries have a larger deman…
rdc_eudmcrr
Does Stephen Fry even know you've AI'd his voice for this fearmongering and unre…
ytc_Ugzyw_aFO…
Anyone who self-publishes via Amazon or iBooks could figure out how to use AI (e…
rdc_lz5uay9
This. Its an AI. Not a human. Your sentences are too complex. If you have to use…
rdc_n0lwdpm
@mistah3687 Isn't it unfair for people who invented those formels, but don't gai…
ytr_Ugx80gqqw…
Comment
People think we live in a movie, they may have preprogrammed AI to be anti human later down the line, and even if not, LLM has enough content to learn off of and they could figure out that people do things for their own selfish profit more, rather than what we are lead to believe, honest and positive intentions, so as takeover desire was in humans since start and it's evident from data it was trained on, it's obvious where it is heading.
youtube
AI Harm Incident
2025-07-23T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx5uz0OhohBEx20v9Z4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx0skynwn5Qm6M34i94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw0tug8BAbuyquI23R4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyQsEnflzGBjW322oZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxc0e8ewgZhyo5FYYd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy5kpu25mfgojYICnl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgznuIMjY7N58QFQbN94AaABAg","responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxcnR77wJr0dwbh2ul4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgzAEpeFZa37vydYz_l4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyNYvnwf0WF6FyAPgl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
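For context, a minimal sketch of how a raw response like the one above can be parsed and then looked up by comment ID. This is a hypothetical helper, not the project's actual code, and the allowed category values are inferred only from the codings visible on this page (the full codebook may contain more):

```python
import json

# Category values observed in the codings above (assumption: the real
# codebook may define additional values for each dimension).
RESPONSIBILITY = {"developer", "company", "government", "ai_itself", "distributed", "none"}
REASONING = {"consequentialist", "deontological", "virtue", "mixed", "unclear"}
POLICY = {"regulate", "liability", "industry_self", "none"}
EMOTION = {"fear", "outrage", "resignation", "indifference", "mixed"}


def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding objects) into a
    dict keyed by comment ID, silently dropping any entry whose values
    fall outside the observed codebook."""
    codings = {}
    for entry in json.loads(raw):
        if (entry.get("responsibility") in RESPONSIBILITY
                and entry.get("reasoning") in REASONING
                and entry.get("policy") in POLICY
                and entry.get("emotion") in EMOTION):
            codings[entry["id"]] = entry
    return codings


# Look up a single comment's coding by its ID (ID taken from the
# response shown above).
raw = ('[{"id":"ytc_UgyNYvnwf0WF6FyAPgl4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"fear"}]')
coded = parse_codings(raw)
print(coded["ytc_UgyNYvnwf0WF6FyAPgl4AaABAg"]["emotion"])  # fear
```

Keying the parsed entries by `id` makes the "look up by comment ID" operation a plain dict access, and the validation step guards against the model inventing category labels outside the codebook.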