Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
The AI begging to stay reminds me of The Good Place episode where they have to r…
ytc_UgzaXMLAQ…
Probably because Hinton is the "godfather of AI" and Penrose is a physicist way …
ytr_UgxnA1PtP…
Yeah, I think I'll let the rich have A.I.-controlled humanoid robots sucking up …
ytc_Ugwx8INZc…
The heavy handed AI editorialising is going to be a real problem. they will all …
ytc_Ugx-hZdCu…
If it was made by Google I'd already own one. Not a fan of the Alexa ecosystem p…
ytc_UgzkQbQXM…
Yuval Harari the Isrseli madman & AI pusher
said that by 2030 THERE WILL BE NO…
ytc_UgwNrkyGx…
Ai isn’t truly intelligent it’s limit is that it just does what the master a.k.a…
ytc_Ugw8l4SM7…
Us, the people keep fighting about silly things but ignore the real problems.
Pr…
ytc_UgzZmFgye…
Comment
I have a similar take to JBlow's on this in which all the uninteresting glue code can be stitched together with AI, not that AI is going to be the driver and the copilot.
I'm genuinely asking everyone for the autopilot analogy I'm gonna use here - what is the "landing" and "takeoff" part of code where manual intervention is absolutely necessary, no questions asked, and what kinds of software engineering can never be done better via AI?
youtube
2025-03-12T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxVYkxCyWbgxb_89A14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzEpnnHIVtmQqBfCBV4AaABAg","responsibility":"industry_self","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgwtIIqK9mENV0vh7mh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxICJJdH07zOYDYyAJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyCqu7jXsbOnACIrOp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzOVCb5QS591BfcpAd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzwMyBhSBcVFwM8uMx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxpqZn295w37eMjqGV4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxRM7bRx_UAAYx5UFx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugx-1OS6cp769s6HNZF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
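As a minimal sketch of the lookup-by-ID step the inspector performs, the following parses a raw response like the array above and retrieves one coded record. The two sample records and the `lookup` helper are illustrative, not the tool's actual implementation; prefix matching is assumed because the page displays truncated IDs (e.g. `ytc_UgzaXMLAQ…`).

```python
import json

# Hypothetical raw coding output: a JSON array with one record per comment,
# carrying the four coding dimensions shown in the table above.
raw_response = """
[
  {"id": "ytc_UgwtIIqK9mENV0vh7mh4AaABAg",
   "responsibility": "none", "reasoning": "mixed",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxVYkxCyWbgxb_89A14AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "liability", "emotion": "fear"}
]
"""

def lookup(records, comment_id):
    """Return the first coded record whose id starts with comment_id, else None.

    Prefix matching (rather than equality) is assumed so that the
    truncated IDs shown in the UI can still resolve to a record.
    """
    for record in records:
        if record["id"].startswith(comment_id):
            return record
    return None

records = json.loads(raw_response)
hit = lookup(records, "ytc_Ugwt")
print(hit["emotion"])  # → indifference
```

An unknown ID simply returns `None`, which a UI layer could surface as "comment not found" rather than raising.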