Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- `ytc_UgxsE6j-l…` — "This is like arguing we shouldn't have robots or AI that breathe, blink, feed, m…"
- `ytc_Ugwb6Ubp_…` — "People who reject AI are no different those people who rejected automobiles, com…"
- `ytc_UgwLBqhKf…` — "Yea, wait til a studio finds out writers used AI to help with a script then sues…"
- `ytr_Ugxq1QP0r…` — "@jpmor7327When everything got automation human will free from their work and do …"
- `ytc_UgznvjXO-…` — "I actually knew about the ai racism due to a paper I did in school. It's really …"
- `ytc_UgyFJLz70…` — "Lets replace CEO, presidents and politicans with AI i think it is the right time…"
- `ytc_Ugz1pl58N…` — "when no one has a job and everybody is poor who is going to be able to pay for a…"
- `ytc_UgyOQW-sb…` — "A robot may not injure a human being or, through inaction, allow a human being t…"
Comment
I'm on board with eliminating poverty. Sadly the political willingness is extremely unlikely. Creating a super intelligents is a moronic idea.
I certainly don't trust Billionaires to protect us.
Ai is already causing too many problems. The possible threats to society/humanity Ai poses is endless, and should probably be banned or regulated heavily with serious CRIMINAL penalties for violating guidelines. Ai data centers are also horrible for the environment and the communities they are in.
Elon is a jackass but correct that if automation and Ai take most jobs few people will be able to afford anything to survive, and government financial intervention would be required. In fact without that intervention literally millions of people would die.
Source: youtube · Video: Viral AI Reaction · Posted: 2025-11-05T08:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyYz-bIbKNrmdHWi6d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzUa8cI3M2qOojaJDR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxoxXzqWLfY2dAg35Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwVHV7W-329zSgkZDp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyFSzgrTV3kC7O5_9p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzEElW33rN6CZaSCfp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzhzxhTcDFFFJ63iMl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzg3FidpELVqz9uspV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwLzGrCMznpzUPHSn14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy6gYAYWbscRklmR3V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
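A response like the one above should be validated before its codes are stored, since an LLM can emit an off-schema value or a malformed ID. The sketch below is a minimal example of that check, assuming the allowed values per dimension are exactly those seen in this dump (the real codebook may define more) and that comment IDs start with `ytc_` or `ytr_` as they do here; the names `ALLOWED` and `parse_coding_response` are illustrative, not part of the actual pipeline.

```python
import json

# Allowed values per coding dimension. NOTE: inferred from the codes visible
# in this dump; the project's codebook may permit additional values.
ALLOWED = {
    "responsibility": {"company", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation",
                "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject off-schema records."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dump start with ytc_ (top-level) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records
```

For example, feeding it the first record of the response above would return a one-element list, while a record with `"policy": "banish"` would raise a `ValueError` naming the offending dimension.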