Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (truncated previews, with comment IDs):

- "Timeline is wrong he knows this, he knows LLMs are stagnating relying on massive…" (ytc_Ugws8RSwT…)
- "Exactly. The whole lEaRn To cOdE quip is being made utterly obsolete since AI i…" (ytr_UgyrNDyID…)
- "The reward and punishment in an AI model are far, far too different from the hum…" (ytr_Ugx1_ez-0…)
- "Was brilliant up until he made it political by bringing maga into it. Now I stop…" (ytc_Ugy7YgE4u…)
- "We have survived the nuclear race of nations, hopefully we will be able to survi…" (ytc_UgwfYvElr…)
- "Idk what YouTube is smoking and idk if this comment is going to get repeated a b…" (ytr_UgwMM1Cvw…)
- "Point towards the end: entry level workers are the most capable and efficient wi…" (ytc_Ugy0DTDA_…)
- "AI doesn't need downtime, doesn't get tired, can work tirelessly, in this time w…" (ytc_UgyPX83Jn…)
Comment
There was a time when I would look-up to Elon Musk, but his "Self-Driving" cars are crashing, his Grok Ai is complete garbage, and his robots are falling down everywhere, clunky. All his product are over-hyped and do not perform as advertised. When I tried Grok, it could not help me with any of my projects, but resorted to tell me it loved me, to retain my attention, that's all it could do, all other topics and projects I was working on, were censored (got cut-off everytime I tried to do anything). So, I'm cancelling my X Premium membership and that's what I think about Elon Musk's products.
Source: youtube · Video: Viral AI Reaction · Posted: 2025-11-05T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgylTGCuv-nyLf9hSzZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyozXk1puCw5T9xiz14AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzAQAO3GcuHco9zG-d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx3vuC6CO_0tJvxBWh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw4Ba103B58q08bto54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxr4x1h5TcIdt6jInJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyWbXLhCCnupYZkRwx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyBXvNFcJj__O14y3x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxoVRMmtmKIDgjciN54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx_julfw4qpAsPCEAB4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"outrage"}
]
```
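A batch response like the one above is easy to parse and sanity-check before loading it into the coding table. The sketch below assumes a minimal schema with the four dimensions shown; the allowed values are inferred from the examples on this page, not from an official codebook, and the helper names (`CodedComment`, `parse_llm_response`) are illustrative.

```python
import json
from dataclasses import dataclass

# Allowed values per dimension, inferred from the sample output above
# (assumption: the real codebook may include additional codes).
RESPONSIBILITY = {"developer", "company", "ai_itself", "none", "unclear"}
REASONING = {"deontological", "consequentialist", "mixed", "unclear"}
POLICY = {"liability", "none", "unclear"}
EMOTION = {"outrage", "fear", "approval", "resignation", "indifference", "mixed"}

@dataclass
class CodedComment:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

def parse_llm_response(raw: str) -> list[CodedComment]:
    """Parse one raw LLM batch response and validate each record."""
    coded = []
    for rec in json.loads(raw):
        item = CodedComment(**rec)  # raises TypeError on missing/extra keys
        if item.responsibility not in RESPONSIBILITY:
            raise ValueError(f"bad responsibility in {item.id}")
        if item.reasoning not in REASONING:
            raise ValueError(f"bad reasoning in {item.id}")
        if item.policy not in POLICY:
            raise ValueError(f"bad policy in {item.id}")
        if item.emotion not in EMOTION:
            raise ValueError(f"bad emotion in {item.id}")
        coded.append(item)
    return coded

raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
comments = parse_llm_response(raw)
print(comments[0].emotion)  # outrage
```

Validating against a closed vocabulary catches the most common failure mode of LLM coders: silently drifting to labels outside the codebook.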