Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Thx to all debaters; ~min 60, YLC: "[risk] is lower than the Earth being wiped o…
ytc_UgzuxRs_B…
This is the whole issue with AI. They're stealing everyone's work and acting lik…
ytc_UgzMtwZ9A…
How do we know the hidden version wasn’t a decoy and ai didn’t pull it off????…
ytc_Ugxw9dKSn…
Yes, but it's all flawed. It's based on the idea that cameras are enough, and no…
ytr_UgxAbZ4ZH…
My personal issue is Most things ai trains on (even text) is either taken withou…
ytc_Ugysi5_KU…
AI! I have told many that AI is dangerous stuff, many will will be loosing their…
ytc_UgySEFSr_…
Imagine not doing a simple Google search and just hitting copy and paste from Ch…
ytc_Ugx32hOPy…
Elon Musk, is Fantastic!
He has the better Mind the Word . Congratulations Dear …
ytc_UgxLQKdBs…
Comment
I really don't believe Musk is all that smart, maybe the same as your average engineer at best. He has said some really dumb things.
There used to be people in podcasts saying that the Tesla full self driving would arrive at by end of the year. That was several years ago. I tried to tell them no but anyway...
I think Musk expected that it wouldn't take as long as it has. If he's really a engineer and has experience with software he should know better. We all know that the first 90% takes 90% of the time and the last 10% takes the other 90% of the time.
youtube
Viral AI Reaction
2025-11-04T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwRE13j8kG2LbTiWn94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyBQpT7bG4F28aAjCZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxApClNL1LlZ5HsgJ54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyN4KKlwuPtZTebJCF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzyYjcMNpiorwpveHx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzodcGKwJrgzUDob9h4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgwVFjznkHMexNIn8rl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxM2PswA0tTVaQc7v94AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwqF8BxkGEMortKjKp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwaD74c_KJDcLIuRBZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
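A response like the one above can be checked before storing. The sketch below is a minimal, hypothetical validator (not part of this app's codebase); the allowed value sets are inferred only from the values visible in this one response, so the real codebook likely contains more categories.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself"},
    "reasoning": {"virtue", "consequentialist", "deontological", "contractualist"},
    "policy": {"none", "liability"},
    "emotion": {"mixed", "indifference", "outrage", "fear"},
}


def parse_coding_response(raw: str) -> list[dict]:
    """Parse the model's JSON array and sanity-check each record."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset start with ytc_ (comments) or ytr_ (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for field, allowed in ALLOWED.items():
            if rec.get(field) not in allowed:
                raise ValueError(f"{rec['id']}: bad {field}={rec.get(field)!r}")
    return records
```

Failing fast here means a malformed or hallucinated model output is rejected before it reaches the coding table, rather than silently producing an invalid row.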