Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
OpenAI / Sam Altman remind me of Enron and Theranos/Elizabeth Holmes... nothing … (ytc_Ugz8SmzGb…)
Drivers should be paying attention. These are NOT self driving vehicles as stat… (ytc_UgzK4hC1-…)
Up until recently in college I’ve been experimenting with Ai and how it can be u… (ytc_Ugy42HqKB…)
I mean you want to activate the vectors of the relevant knowledge. AI also has a… (ytc_Ugx1E3pWF…)
AI will eliminate all of the arts, abc disincentivize all human activity and end… (ytc_UgywBM-A9…)
I trace 3D pose models because I'm dogshit at art, I don't care if it's bad beca… (ytc_UgzWbQZuF…)
Be careful when leaning on sensational interpretations of these simulations and … (ytc_Ugy24xypX…)
@ Maybe. But a lot of people out there don’t have artistic talent, and don’t wan… (ytr_UgwNabBhT…)
Comment
I can't help but wonder if Asimov's 3 rules would prevent any of the coming disaster or if AI would find a way around them. Either way building AI is the equivalent of handing cocked and loaded revolvers to a room full of infants. Nothing good will happen. I wonder where I'll be and what I'll be doing when AI finally decides to destroy us all.
youtube
2025-12-05T01:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgzCX0-xmR8UNch4v214AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},{"id":"ytc_UgxZGa02IcH-J2PvWEV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},{"id":"ytc_UgyizSSelWkX-3DBUSp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},{"id":"ytc_UgzBk5ZDYxK6--WUYqJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgygQOYm7T8WsPMCLRd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},{"id":"ytc_UgwPvEzL0TAT55aGqxt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},{"id":"ytc_UgzyUu5A2zQpXs549OB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},{"id":"ytc_UgzMwBazFTJ6Vp0Er6F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},{"id":"ytc_UgwpXv43-tJNqn7a5oB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"resignation"},{"id":"ytc_Ugz-iSiR2jH-v0zUv1l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"})
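Note that the coding result above shows every dimension as "unclear", while the raw response ends in a stray `)` where a JSON array would close with `]`. A minimal sketch of how such a raw response might be parsed, with malformed output falling back to "unclear" on every dimension; the dimension and field names come from the table and JSON above, but the function names and fallback behavior are illustrative assumptions, not this tool's actual implementation:

```python
import json

# The four coded dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: {dimension: value}}.

    If the response is not valid JSON (e.g. it terminates with ')' instead
    of ']'), return an empty mapping; the caller then treats every comment
    as coded "unclear" on all dimensions.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return {}
    return {
        rec["id"]: {d: rec.get(d, "unclear") for d in DIMENSIONS}
        for rec in records
    }


def lookup(coded: dict, comment_id: str) -> dict:
    # A comment missing from the parsed output gets "unclear" everywhere,
    # matching the all-"unclear" coding result shown above.
    return coded.get(comment_id, {d: "unclear" for d in DIMENSIONS})
```

For example, feeding the truncated-bracket response above through `parse_raw_response` would yield an empty mapping, so a lookup by the comment's ID returns "unclear" for all four dimensions.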