Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `rdc_oi2y2e9`: "What scrap yard wants hard drives? You can't sell that to metal refining shops.…"
- `ytc_UgwtJGTv1…`: ""Im a artis" you means a robot is an artis? So the programer of that robot is a …"
- `ytc_UgwIvApUV…`: "Nice to be the Godfather Of Dystopia. Thank God that LLM- based „AI“ doesn't rea…"
- `ytc_UgxlssvMZ…`: "Hmmm....first comment cannot be found now...hmmm... Anything where there is bre…"
- `ytc_UgzxpnocI…`: "I find ai research really facinating. and i find anti-ai reactions by people equ…"
- `ytc_Ugy037J7s…`: "I'm a pharmacist. These jobs are secure because each state has laws that ensures…"
- `ytc_UgzquA9iu…`: "Not to mention all of the below freezing locations that culd probably EASILY coo…"
- `ytc_Ugyjr-s8d…`: "AI does what AI does. It's the fault of the (human) police officer, he's too dum…"
Comment

> The thing about ai is it's fundamentally not sane. It has no normal human instincts because it's not human, and is not grounded in reality because it doesn't experience reality. So, as powerful and interesting as it is, it's not something you can let have any authority because you don't know when it might go off the rails, or how. So, could ai be a helpful assistant to a teacher? Sure. Same with a coder or even a plumber, but it's not a replacement for human workers, it's a force multiplier.

Source: youtube · Viral AI Reaction · 2025-11-23T02:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
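The Coding Result table above is one row of the raw JSON array, rendered per dimension. A minimal sketch of such a rendering (the function name and the `coded_at` argument are illustrative, not taken from the actual pipeline):

```python
def coding_table(coding: dict, coded_at: str) -> str:
    """Render one coded comment as a markdown Dimension/Value table."""
    rows = ["| Dimension | Value |", "|---|---|"]
    # The four coding dimensions, in the order the table displays them.
    for dim in ("responsibility", "reasoning", "policy", "emotion"):
        rows.append(f"| {dim.capitalize()} | {coding[dim]} |")
    rows.append(f"| Coded at | {coded_at} |")
    return "\n".join(rows)

print(coding_table(
    {"responsibility": "developer", "reasoning": "deontological",
     "policy": "liability", "emotion": "fear"},
    "2026-04-26T23:09:12.988011",
))
```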
Raw LLM Response
```json
[
{"id":"ytc_UgxyHFc1Xh3wGJqpKHx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugz8-iCu8U1VhLWthbd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzAuY_l0gZtLiN_qS54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxR0dyOilBWIYADrH14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyN2wRnLupH_YTxQuV4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwDUwmPdrNNg1NR1_94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwYTzvMP2G4gCU-i3x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwdJs-KiFSOTaGfO3p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxd4QhbAyixbadOZYB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgytM2VXOUFF6YilBGh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
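A raw response like the one above can be parsed into an ID-indexed lookup, dropping any rows the model coded with out-of-vocabulary values. This is a minimal sketch: the allowed value sets below are inferred from the examples on this page, not from the actual codebook, which may contain more categories.

```python
import json

# Allowed values per dimension (assumed from the examples above;
# the real codebook may differ).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"liability", "regulate", "ban", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval", "resignation"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    mapping of comment ID -> coding dict, skipping invalid rows."""
    codings = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        coding = {dim: row.get(dim) for dim in ALLOWED}
        # Keep only rows where every dimension has an allowed value.
        if all(coding[dim] in ALLOWED[dim] for dim in coding):
            codings[cid] = coding
    return codings

raw = ('[{"id":"ytc_x","responsibility":"developer","reasoning":"deontological",'
       '"policy":"liability","emotion":"fear"}]')
print(parse_codings(raw)["ytc_x"]["emotion"])  # prints: fear
```

Indexing by ID is what makes the "Look up by comment ID" inspection above cheap: each coded comment resolves in one dictionary access.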