Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I use it as an active working process not as a direct copy paste. My issue is ge…" (ytr_Ugxdd1wSi…)
- "I don’t consider AI generated images to be art. Art requires thought and effort,…" (ytc_UgwD8t4N6…)
- "It is sad that one even has to come up with arguments why the way AI is currentl…" (ytc_UgxVWK6YL…)
- "The new job right now is the person that can program the context library that th…" (ytc_UgyenmsuF…)
- "Selectively only. Otherwise, if they rely on AI to summarise reading for them in…" (ytc_UgyVxX2WY…)
- "If 30% of microsofts code is done by AI, then who does the debugging ?…" (ytc_UgyIhSslA…)
- "Those Ai meetings three times a week in the US / Might have a lot to do with the…" (ytc_UgzbQirRX…)
- "Looking at images for reference, then going and makign your own image without co…" (ytc_UgzkR9PvN…)
Comment
> These discussions on AI often get me to thinking about ants. We are so many magnitudes of order more intelligent than ants that we never give thought to them in our daily lives.
>
> This is how I imagine AI will be once it becomes more intelligent than us. It will have it's own goals, that we won't even have the intellect to comprehend.
>
> It won't love us, or hate us. It simply won't care we exist whatsoever. We won't "cross it's mind" at all. Just like us and ants on a normal daily basis.
youtube · AI Moral Status · 2026-03-01T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyMASz437VrBcqrNdF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy5LFjeIahzKeI7d6R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyF3edJpRCN4_cTSXt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyW4j6kZ4hmZLOEfM94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxjJuDOMvlH9NmvB4d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxAoOXGDwxfuBIYdCN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzMGrcEe7mP972703p4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx460Gb89lQvBjeLlV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx0cmSkW8Rk350QvDd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw4QU8KuUYWkqPeeVV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
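Consuming a response like the one above amounts to parsing the JSON array and indexing its rows by comment ID, so any coded comment can be retrieved directly. A minimal sketch in Python (the data is trimmed to three rows from the response above; the `lookup` helper is illustrative, not part of the app):

```python
import json

# Raw batch response from the coding model, trimmed to three rows
# from the full array shown above.
raw_response = """
[
  {"id": "ytc_UgyMASz437VrBcqrNdF4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyW4j6kZ4hmZLOEfM94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxAoOXGDwxfuBIYdCN4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

# Index rows by comment ID so one coded comment can be fetched in O(1).
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions (responsibility, reasoning, policy,
    emotion) for a single comment ID."""
    return codings[comment_id]

print(lookup("ytc_UgyW4j6kZ4hmZLOEfM94AaABAg")["emotion"])  # → indifference
```

Indexing once and reusing the dict avoids rescanning the array on every lookup, which matters when a batch response covers many comments.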