Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
If humans want to become robots with no human interaction then AI is the way but…
ytc_UgyJe6FKq…
It sounds like you have some thoughts on Sophia's appearance! As AI technology e…
ytr_UgyzHFAx6…
LLMs are amazing: they allow us to engage in conversations that would normally r…
ytc_Ugx3oiiMU…
Seeing AI content blend like this, it's no wonder brands need tools like AICarma…
ytc_Ugyo67qyg…
AI/Chat GPT and all these other platforms, according to me is humans interacting…
ytc_UgwgpSwdV…
The pharmaceutical comparison you made is the most chilling part. If we treat AI…
rdc_ohzd9v3
I have a newer Tesla model Y and have been using FSD 13. One definitely needs t…
ytc_UgzfKqQq1…
Fuck everything Sam Altman does, straight up. This man is a dollar tree Lex Luth…
rdc_lp8lxee
Comment
The point lavender made around the 10-13 minute mark is so true, if ai is doing everything for you, learning, working, doing your hobbies for you, then what’s even the point? Summarizing texts from the people you love so you won’t have to talk to them as much? To me it feels like ai is making our lives worse even. You’re supposed to enjoy life and all the little things that come with it, not speedrun everything using technology.
youtube
Viral AI Reaction
2024-10-21T03:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyRwJSiUoxvYGwQmml4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzmehV9PAapbsosQot4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyfPp9Mt8N4R4MGDq14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwvE6o5CoLAnv4zzUl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwSmgxOGdNyVSkCbOZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugwiu7YTxj5LNiR0hP54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxOF9wO99PUnmgSlp54AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy5bsf7hiQ2hn6H3xt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx8esUJrjT9-5bRxhd4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz89O6WjAcmpREpcIp4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
```
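A raw response like the one above can be turned into a per-comment lookup table, which is what "Look up by comment ID" needs. The sketch below is a minimal illustration, not the tool's actual validator: the allowed dimension values are only those observed in this sample, and the function name `parse_coding_response` is hypothetical.

```python
import json

# Dimension values observed in the sample response above.
# The real coding schema may include values not seen here.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "company", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"resignation", "outrage", "approval", "mixed", "fear", "indifference"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into a dict keyed by comment ID.

    Raises ValueError on malformed JSON or an out-of-schema dimension value.
    """
    records = json.loads(raw)  # the model returns a JSON array of objects
    coded = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        # Keep only the coding dimensions, keyed by comment ID
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

With the parsed table in hand, a single coded comment is one dictionary access, e.g. `coded["ytc_UgyRwJSiUoxvYGwQmml4AaABAg"]["emotion"]`.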