Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- As usual Krystal doubts Elon's sincerity. Elon is saying AI is going to be trans… (`ytc_Ugxz552Kr…`)
- I dont support ai but it sucks that the original was actually really pretty too … (`ytc_UgyMOX9do…`)
- It sounds like you’re curious about what might happen if we cut off the interact… (`ytr_UgzpN0v2t…`)
- Its definitely not identity theft as she is a public figure. That'd be like sayi… (`ytr_UgyBdRoOo…`)
- Why would everyone think AI would be evil? 😂. Not everyone is like us. Maybe the… (`ytc_UgyC_-t2-…`)
- Isn't starting from the premise that the car couldn't stop in time flawed in the… (`ytc_UghQCXhv7…`)
- That is creepy. No way should we be turning over our tasks to a robot, even thou… (`ytc_Ugxffb_ja…`)
- I work as a rideshare driver in Silicon Valley where driverless cars are already… (`ytc_UgykAT-wR…`)
Comment

> What I've never quite understood is what would be the end game of AI to allow humans to go extinct. I know AI doesn't think like a human but even so, what would be the benefit to a hard wired machine to allow the destruction of the provider of it's own knowledge base. Not that it would be sentimental about it's creator at all. But at some point even AI would have to resort to eating it's own tail.

youtube · AI Governance · 2026-03-03T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugxvwo_02ZBF84zNxFl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"curiosity"},
{"id":"ytc_UgwXBgKbah3EcsSIicZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyMwXehTshcpUMgcGx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwxGXV8sPil8PGttYJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgydS01XTsdjHOPcj0V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwqwdyNjRghOD34QHp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy-J-u0uTLP2WDZ7k54AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugy8yTqW3iLjS2ryQdF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"curiosity"},
{"id":"ytc_UgyaWavpM1N3Q5FfogJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwsooizqu0pGx6aDfR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
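Since the raw response is a JSON array of coded records, it can be parsed and sanity-checked before the values are written back to the coding table. Below is a minimal sketch of such a check; the allowed values per dimension are inferred from the labels visible on this page (an assumption, as the full codebook may contain more categories), and the sample ID `ytc_x` is hypothetical.

```python
import json

# Allowed values per coding dimension, inferred from the labels visible
# on this page -- an assumption; the actual codebook may include more.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"curiosity", "approval", "outrage", "fear",
                "indifference", "resignation", "unclear"},
}

def validate_response(raw: str) -> list:
    """Parse a raw LLM response and verify every record is well-formed."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing id: %r" % (rec,))
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError("%s: bad %s value %r" % (rec["id"], dim, value))
    return records

raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"mixed","policy":"ban","emotion":"fear"}]')
print(len(validate_response(raw)))  # 1
```

Failing loudly on an out-of-vocabulary value is deliberate: it surfaces model drift (e.g. a new emotion label) at ingestion time rather than silently storing an uncodeable record.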