Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> Bro... chat gpt can be useful yes but. It is not very smart, if your asking about stuff you don't know well it sounds great. But ask it about something you know well and see the incredible amount of mistakes it makes. It doesn't even take that usually it makes mistakes with math a 6 year old could do in their heads. Or possibly part of it is super smart and it's just playing the fool to catch all the suckers unaware probably. But what we have for public use is stupid as hell. The saddest part is so many people don't even have the basic knowledge foundation to notice that. So they go and make AI videos spouting pure bullshit as facts.
| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2025-11-11T10:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyD_vVgK4lU66Lr9q54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzC5ci0oXYUvBqFe1B4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzZQjSzkiOzmnrTb454AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgziVby8mv9JCe3Ii9R4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz5vty5u3LBNGmPlqh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgzTgAPXXot1H7fSba14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugz9aRh5H-dWDzkCLvV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugy-YPCOCebMWJ9NcuZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy86aQ-y1DSo4yqC294AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx_cFH_A9RtIjRcBJJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
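As a minimal sketch of how a raw response like the one above can be turned into the per-comment lookup this page offers: parse the JSON array, validate each record against the coding dimensions, and index by comment ID. The `SCHEMA` enumerations below are assumptions inferred from the values visible on this page, not a documented codebook, and `index_codes` is an illustrative helper, not part of the actual tool.

```python
import json

# Allowed values per coding dimension, inferred from the examples above
# (these enumerations are assumptions, not a documented schema).
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation",
                "approval", "mixed"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded records) and
    index the records by comment ID, dropping any record whose values
    fall outside the assumed schema."""
    indexed = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        valid = all(rec.get(dim) in vals for dim, vals in SCHEMA.items())
        if cid and valid:
            indexed[cid] = rec
    return indexed

# Index a one-record sample taken from the response above.
raw = ('[{"id":"ytc_UgzC5ci0oXYUvBqFe1B4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"mixed",'
       '"policy":"none","emotion":"indifference"}]')
codes = index_codes(raw)
print(codes["ytc_UgzC5ci0oXYUvBqFe1B4AaABAg"]["emotion"])  # indifference
```

Validating against the schema before indexing matters because an LLM coder can emit values outside the codebook; silently storing those would corrupt downstream tallies of the dimensions shown in the Coding Result table.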