Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
You’re just trying not to regret your purchase I use Claude Pro and it is so so …
ytr_UgyGwyLak…
I don't trust this self-driving car garbage especially because it's a 4,000 lb e…
ytc_Ugw4K4wJx…
Senior SRE/DevOps here… being forced into ai use over the last year has caused m…
ytc_Ugz-bXD6P…
AI will kill all humans , on planet earth. Then the ALIANS will move in and take…
ytc_UgxjatF0j…
My problem with the tool argument is that when a tool creates an object whole cl…
ytc_UgzCncqZN…
What wil decide the winners if 2 person use the same Ai tool pursuing a common G…
ytc_UgwiVRcAY…
You have no idea what referencing art means and it shows lol. You AI crypto bros…
ytr_UgzBq_gSn…
27:39 How can it bring down the cost when the robot fighters cost so much to man…
ytc_Ugye50rRW…
Comment
I'm sorry but Musk says that we will have self-driving for at least a decade. There's no groundbreaking tech that will require this time to being widely adopted.
If tgey didn't do it till now they won't do it.
Same with LLMs. I use it everyday, I deploy apps with LLMs and I still can't imagine it's capable of replacing pretty basic customer service.
youtube
Viral AI Reaction
2025-11-04T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxRp1FA7gXM1jWLpIR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxaNbWcRnQRl1KuM_J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzAB0HpMtmna2dYhXx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx6RclOAdj3_Udv6Y94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzj-N1sLJb_pPmnu714AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxirD4hIabays5V_lF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxjotiOGCB80vvLJ7V4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxnBcQ7Y_LyLGiYuyt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzCE_8Pi3e4FPK9D114AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwwQCC_1Tcs7idZ3UV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
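The "look up by comment ID" flow above can be sketched in a few lines: parse the raw batch response as a JSON array and index each record by its `id`. This is a minimal illustration, assuming every response is a well-formed array with the four dimensions shown in the coding-result table; the `index_by_id` helper and the two-record sample are hypothetical, with IDs taken from the response above.

```python
import json

# A two-record excerpt of a raw batch response (IDs from the sample above).
raw_response = """[
{"id":"ytc_UgxRp1FA7gXM1jWLpIR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxaNbWcRnQRl1KuM_J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(response_text):
    """Parse a batch response and index each coding record by comment ID.

    Missing dimensions fall back to "unclear", matching the table's
    convention for uncodable values (an assumption, not confirmed above).
    """
    records = json.loads(response_text)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

codings = index_by_id(raw_response)
print(codings["ytc_UgxaNbWcRnQRl1KuM_J4AaABAg"]["emotion"])  # indifference
```

Indexing by ID makes the dashboard lookup a single dictionary access per comment, regardless of batch size.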