Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Total BS. Every computational biologist in my company tells me how much AI help …" (ytc_UgwSMFDqK…)
- "Yes, this is so obvious. AI just calculates what the most likely reply is. The c…" (ytr_UgwxPC7nT…)
- "We are the consumers not just workers😂 they can't continue to profit if the cons…" (ytc_UgwpOrxAO…)
- "I don't know man. I can't take away from this man's achievements but he sounds …" (ytc_Ugy7gT82q…)
- "Yeah, nobody needs Baldacchi novels in machine learning DBs to cure cancer. The …" (ytr_UgzfEii3b…)
- "Yeah I can already see it. I'm in college atm on the GI bill. Students ChatGPT e…" (ytc_UgzGe4YBq…)
- "THE AI TRI TUE NHAN TAO THE OBAMA CARE CAU KET SCAM HACK THE 9 CHENE & WEB W…" (ytc_Ugxlf5AMF…)
- "In 10 years from now someone is going to say to one of these students. "Oh youre…" (ytc_Ugz7fcQ4l…)
Comment
This was the most dire and abysmal AI dystopia I ever came to experiance. It's just brilliant, that it's just total fearmongering. First of all, AI doesn't need to be conscious and second of all, every other option than humans being kept as pets, would be far faaaaar less energy efficient than anything yet discovered. If a sigularity takes overr control, the last thing it wants to do, is getting rid of billions of years of evolution, which yielded the most capable AND efficient being to ever exist in this universe.
We could just be happy, when it abolishes capitalism and be in charge of a global planned economy, where no mofo egoist can accumulate wealth over others.
youtube · AI Moral Status · 2025-04-26T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzJPsbZUgnZTCsGjsZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzOe4ZURwiEyf4MpL94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzfcJzuijugyHuC3Bh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxyppPb4dtr5SRP-854AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzRC2yQxV1y5ISEWmJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzbXjsRkbBLgps3MtN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzqvP_89QFiSZeh0NN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxgXhiH1lazqWDAxjl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy7cpo-6OMBJG0Nyo14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzE5LfWGRo6l0wBBgR4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
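The raw response is a JSON array with one object per coded comment, keyed by comment ID with one value per coding dimension. A minimal sketch of how such a batch could be parsed and validated follows; the allowed value sets are an assumption inferred from the values visible in this response, not a documented schema, and `parse_codings` is an illustrative helper, not part of the tool:

```python
import json

# Allowed values per dimension, inferred from the sample response above
# (an assumption, not a documented schema).
ALLOWED = {
    "responsibility": {"none", "distributed", "developer", "ai_itself", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "ban", "unclear"},
    "emotion": {"indifference", "fear", "resignation", "approval", "mixed", "unclear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: {dimension: value}},
    raising on any value outside the allowed sets."""
    out = {}
    for rec in json.loads(raw):
        comment_id = rec["id"]
        coded = {}
        for dim, allowed in ALLOWED.items():
            value = rec[dim]
            if value not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
            coded[dim] = value
        out[comment_id] = coded
    return out

# Usage on the first two records of the response shown above:
raw = '''[
  {"id":"ytc_UgzJPsbZUgnZTCsGjsZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzOe4ZURwiEyf4MpL94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''
codings = parse_codings(raw)
print(codings["ytc_UgzOe4ZURwiEyf4MpL94AaABAg"]["emotion"])  # fear
```

Validating against a closed value set at parse time catches the most common LLM coding failure, an off-schema label, before it reaches the results table.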