Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Joseph Weizenbaum, the father of the first chatbot ELIZA mentioned that when his…" (ytc_UgzdFaumY…)
- "If 90% of jobs are gone to bots and AI and there's a massive unemployment rate, …" (ytc_UgyL1FqaL…)
- "It would be the funniest thing if the AI clone that replaces him fired him as a …" (rdc_oh247hk)
- "I think the question is moot. If the robot learns by itself, and creates a purpo…" (ytc_Ugj7gYHfl…)
- "👀If humans can't tell that it is a robot, they are a lot more brain damaged than…" (ytc_UgzWcEazC…)
- "1000.00 a month will not cover eliminating millions of jobs. They red legislatio…" (ytc_UgxdCLAms…)
- "The only resonable use of Ai is in Healthcare as in recognizing healthy and u…" (ytc_UgwhWT95g…)
- "I don't mind if someone on the internet uses AI to make a silly meme, but I DO h…" (ytc_UgwOGVHpb…)
Comment

> I hate people talking about "being worried about AI". Most people don't understand it enough, so what they're really worried about is the unknown ... they should say that.
>
> Also, this is inevitable. And honestly, I'd rather have it in this semi-transparent space, that some wackos in a secret lab building it out.
>
> Finally, worrying leads to fear, and then you're frozen, rather than planning how to adapt. Similar to the internet, computers, factories, the printing press, the wheel, fire ... yeah, it may change our world. If there's a shift, enjoy the ride and try to end on the part where the shift in landscape shifted things up ... else, you're going to get left behind and your fears and worries will have become self-fulfilled prophecies.

Source: youtube · Video: AI Moral Status · Posted: 2026-01-07T09:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwo1P8kisYu_1IAwe54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwAERHzdC0QhPBUAPd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzU38CVeCSuHrUQ_jt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyMflifZsFXoXafBa54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyXfHEwu88GP9Htddp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyHHAIpRBNdQfiV78d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxnuNPn12og6DD9ZMR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwbhKyWyRViJUoFgwF4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz9nrrKluo20eoRQxp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgwTCO29C3Xm7_404-V4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"}
]
```