Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "It sounds like you're surprised by the conversation! Sophia's insights about wis…" (ytr_Ugz5TTBqH…)
- "I find this thing teriffing. And this thing does lie. So sorry no trust with thi…" (ytc_UgwV-nUXd…)
- "I use a couple of my custom GPTs to discuss philosophy and theology and they hav…" (ytc_UgwYumthF…)
- "In real life, spotting that woman would've been easier as the camera is limited …" (ytc_Ugy59dP3d…)
- "You're right, in the short term... But in the long-term AI absolutely will take …" (ytc_UgzykdYA7…)
- "Oh gawd. How is the future of so many people at the hands of creeper tech bros. …" (ytc_UgzFPfZkR…)
- "When Ai will fail and when you will want humans back there will be no Ai again…" (ytc_Ugw6YalNr…)
- "Yes. This is not the actual state of AI. The vibecoding stories are also all bul…" (rdc_obzuyz7)
Comment
The issue with AI is that it takes over mostly without explaining to you what you're doing. My peers that code and use AI I've noted don't bother to analyze the code it produces, is it efficient? I guess so, it's less convoluted than most solutions others have proposed. But at the same time it also seems to have a context issue, I hate that aspect of it, I don't want to be spoonfed with a solution, I want the shortcomings of my idea to be pointed out so I can go back to the drawing board.
youtube · AI Harm Incident · 2025-11-25T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzZ_lhc1jHXaryuCnZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz9UKN8_t6c-B39YBV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz7mYi7thHUhzxKcl54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugws5gVeGfi5aFO93kJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwlLP1cQvWzsZGzT9N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzdk9gqphheBgkTLoV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx5NubqnSgNH3fOohd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz3wxDjisbHdtAHMdp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz_1RgtmbUFjaidNQx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzerRLcQHjvHMb92bN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"}
]
```
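A raw response like the one above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal, hypothetical example: the allowed value sets are inferred only from the codes visible in this response (the actual codebook may define more), and `parse_codes` is an illustrative helper, not part of the tool itself.

```python
import json

# Allowed codes per dimension, inferred from the raw response above.
# Assumption: the real codebook may include additional values.
ALLOWED = {
    "responsibility": {"user", "developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"indifference", "approval", "mixed", "outrage",
                "resignation", "fear"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response (JSON array) and index valid records by ID."""
    coded = {}
    for rec in json.loads(raw):
        # Drop records whose codes fall outside the known value sets.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# One record taken verbatim from the response above; it matches the
# coding-result table for the developer/deontological/resignation comment.
raw = ('[{"id":"ytc_UgwlLP1cQvWzsZGzT9N4AaABAg",'
       '"responsibility":"developer","reasoning":"deontological",'
       '"policy":"none","emotion":"resignation"}]')
coded = parse_codes(raw)
print(coded["ytc_UgwlLP1cQvWzsZGzT9N4AaABAg"]["emotion"])  # prints "resignation"
```

Skipping malformed or out-of-vocabulary records rather than raising keeps a batch of model outputs usable even when one coding is invalid; invalid IDs can then be re-queued for recoding.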