Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I had an intuition that AI wouldn't be able to produce the code that I would be …
ytc_UgyVer4j0…
Same thing that happened after the dotcom bubble. Consolidation.
We don’t need a…
rdc_nk8ebfl
Cope. The era of the technical programmer is over. Anyone who has seriously inve…
ytc_Ugx9eIihk…
Oh yes AI is progressed so fast to the point that when you want to generate a do…
ytc_Ugw7BqWbx…
Lol I’ve seen this fight and there no robot lol but editing is good 😂…
ytc_Ugz0CdB0k…
I find it hard to accept that various road authorities in Australia have allowed…
ytc_UgzmObw2C…
I stay convinced that with good architecture, isolating features/components to s…
ytc_UgyM4SQcb…
As I've always said. AI is psychopathic BY DEFAULT. it's ALL it can every be.…
ytc_UgwNJe0ip…
Comment
Go back far enough, and they told us we would be living in cities in the clouds. Man would be living on Mars. Your car would be nuclear-powered. All kinds of nonsense and unrealistic nonsense was to be tomorrow's world. AI and robots are designed by humans and programmed by humans. The faults in humans are passed on. Plenty of examples of how AI got it wrong because it is not human. Everything humans make goes wrong. I mean everything. Aircraft, electric vehicles, the internet, banks, politicians, police, agriculture, law, food etc. AI and Robots are just complicated tools.
youtube
AI Moral Status
2025-04-27T10:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyVfyBXivp_9In7R_R4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgxTmpgAJvviQzTWkEh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwAvhYLl58_6_j_tgZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwSWH9eMEXUiOAYGyZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxjv77kadE4spSnrUN4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxzM-udII6vxPT-2Sh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx_FM5npWBcdSLzgBt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyW_vlv-Pq9J1UtdjB4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwBUyJuw6xVBuxfu0p4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx2jtYJkgYYYhG-k_54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
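The lookup-by-comment-ID step above can be sketched in a few lines: parse the raw batch response as a JSON array and index the codings by their `id` field. This is a minimal sketch, assuming the raw responses are valid JSON arrays with the fields shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `index_codings` helper name is hypothetical, not part of the tool.

```python
import json

# Two rows copied from the raw LLM response above, as a stand-in for a full batch.
raw_response = """[
  {"id":"ytc_UgyVfyBXivp_9In7R_R4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxzM-udII6vxPT-2Sh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM batch response and index each coding by its comment ID.

    Assumes the response is a JSON array of objects that each carry an "id" key;
    a malformed response raises json.JSONDecodeError rather than being silently skipped.
    """
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytc_UgxzM-udII6vxPT-2Sh4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer outrage
```

Keeping the raw string around (rather than only the parsed dict) preserves the ability to inspect the exact model output when a coding looks wrong.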