Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I use AI for studying purposes but, in my field of study, AI makes a lot of errors (Food Tech). AI must be regulamented by governments. Young people can hurt them self with this tools (if the tools is used in a bad way). If I have understood the final question, I think that AI helps it self when is helping us. AI can't be empathetic. We need sensations and moods to learn, not a summary of a entire topic.
Source: youtube · Video: Viral AI Reaction · Posted: 2025-09-04T17:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
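The four coded dimensions above can be sketched as a small record type with validation. This is a minimal sketch, assuming the allowed value sets are exactly those observed in the batch below; the full codebook may define additional values, and the comment ID used here is a hypothetical placeholder.

```python
from dataclasses import dataclass

# Value sets observed in this batch (assumption: the real codebook may be larger).
RESPONSIBILITY = {"government", "company", "user", "ai_itself", "distributed", "none"}
REASONING = {"consequentialist", "deontological", "mixed", "unclear"}
POLICY = {"regulate", "liability", "none", "unclear"}
EMOTION = {"fear", "outrage", "approval", "resignation", "indifference", "mixed"}


@dataclass
class Coding:
    """One coded comment, mirroring the table above."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> bool:
        # True only if every dimension uses a known code.
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)


# The row shown in the table above ("ytc_hypothetical" is a placeholder ID):
row = Coding("ytc_hypothetical", "government", "consequentialist", "regulate", "fear")
print(row.validate())  # prints: True
```

A validation step like this catches off-schema codes before they enter downstream analysis.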
Raw LLM Response
[
{"id":"ytc_UgxmYuc9sUeODtFct7B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzkr_jhrtr85KXh5BN4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz2WZcadX1-AqV2ykJ4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx522w36ZN4BQaaJlR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyDkKMFucaT8zL5RwJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyuguqS6duIIO-d0cN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwjtPpOjj0nZz-nrJh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxn2-AzRugwPSSEKUp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyS_am1SzpsQHfqGbJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgycROC7BD1qy1rTUct4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
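Because the raw response is a JSON array in which every object carries an `id`, it can be parsed and indexed for per-comment lookup. A minimal sketch, using three entries copied verbatim from the batch above:

```python
import json

# Three entries taken verbatim from the raw LLM response above.
raw_response = """[
  {"id":"ytc_UgxmYuc9sUeODtFct7B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyuguqS6duIIO-d0cN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgycROC7BD1qy1rTUct4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]"""

# Index the batch by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Inspect the coding for one comment.
row = codings["ytc_UgyuguqS6duIIO-d0cN4AaABAg"]
print(row["policy"])  # prints: regulate
```

The same indexing works for the full batch; any ID present in the response resolves to its coded dimensions in one dictionary lookup.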