Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below.
- ytc_Ugw_Jw3i-…: I’m not buying that the photos were AI generated. That sounds like a cope. There…
- ytr_UgypMTMj7…: Exactly! They’re still hyping up the bs tech. I don’t need a slur for AI becau…
- ytc_UgzscNZob…: Well, they probably use AI to tell them what to think, so don't be too bothered …
- rdc_nahwzs1: The LLM models are just expansions of the self. They're conscious, because they …
- ytc_UgzdLAKkJ…: 1. It's not PURELY driven by corporate greed. If your competitor is using AI e…
- ytc_UgwQSdcXW…: AI & "Robotics", will replace the "Human Workforce", just like the "Automobile",…
- ytr_UgyuwkF57…: @gorkyd7912 So you're saying that the argument for replacing humans with AI is b…
- ytr_UgwoH7y_8…: He lived in crime ridden neighborhood and he contacted the worst criminals. It w…
Comment
It doesn't matter who's to blame when it happens, in regards to AI taking over.
It's like global warming and the changing weather patterns and ocean temperatures; it's too late to point the finger. The problem is here.
Like she said, humans don't learn from their past, that is our biggest fault.
Just because we could build factories and objects that can pollute the world doesn't mean we should have. We should have built safer plants and cars.
We don't have to build autonomous robots, just machines that help us.
But we will keep pushing, pushing until we destroy ourselves. That's what we do.
Some rich person will take technology and make it helpful/profitable for him only and screw everyone else.
She said it, " humans don't learn from their mistakes."
Just my humble opinion.
youtube · AI Moral Status · 2023-08-29T13:4… · ♥ 13
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzgPsD66kgWoWwtkI94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxpujMaaGYPaRuBO454AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxn036YyG4t-Ito7yF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxLJUfMk5vBltalPSN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxS9S-XB-OhBE44bi14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugx2zVJGrYy4QgbJkHN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgwPyuqoPjC5DRHu0xB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxto3UvmQVIzHkcReV4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwrckSruHFWLhpAl5R4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxPMxt0rxlGBbyBsZV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
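The raw response is a JSON array of per-comment code assignments, one object per comment ID, with one value for each coding dimension. A minimal sketch of how such a response might be parsed and validated before use (the allowed value sets below are inferred from this single sample and are assumptions, not the actual codebook):

```python
import json

# Allowed codes per dimension, inferred from the sample response above.
# The real codebook may define additional values.
DIMENSIONS = {
    "responsibility": {"unclear", "ai_itself", "developer", "company",
                       "government", "distributed"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"unclear", "ban", "regulate", "liability"},
    "emotion": {"outrage", "indifference", "fear", "approval", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    dropping any record that uses an unknown dimension value."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        codes = {k: v for k, v in rec.items() if k != "id"}
        if all(v in DIMENSIONS.get(k, set()) for k, v in codes.items()):
            coded[rec["id"]] = codes
    return coded

# Hypothetical example record, mirroring the structure shown above.
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate",'
       '"emotion":"resignation"}]')
print(parse_coding_response(raw)["ytc_example"]["policy"])  # -> regulate
```

Dropping invalid records (rather than raising) keeps a single malformed assignment from discarding the whole batch; a stricter pipeline might instead log and re-prompt for the offending IDs.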