Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by browsing the random samples below.
Random samples:

- @molybdenumrose: "I use AI everyday to code, it's actually faster to code if u kno…" (ytr_UgzL_-dCp…)
- "AI didn't replace developers, it replaced the delusion that software development…" (ytc_UgyhYWEov…)
- "AI has literally caused people to lose jobs. AI interview technology is not read…" (ytc_Ugy94jjWF…)
- "I am a very lonely person throughout i suffer from mental illness. The only way …" (ytc_UgzLOZW-c…)
- "So AI gonna take over Earth and we don’t need to fight Aliens anymore😂Rich peop…" (ytc_Ugy8hxd8B…)
- "This might be the only useful advice about how to use AI I've ever heard…" (ytc_UgwPD0s-j…)
- "I’d have to say he’s sincere about his worry once he mentioned taxing these AI c…" (ytc_Ugx1wgq-Y…)
- "lol talking about how the ai ads dont make sense. The ads were made by companies…" (ytc_Ugw6J5Z2f…)
Comment
What if AI determines that we are NOT all equal. The whole notion of AI making more MORAL decisions that we do is baseless. They may make decisions more in line with the programmer's own preferences, but that's not the same thing. If given moral agency, then who knows what their own view would be. We have seen AI act in a way that people found immoral in the past, when given real-world tasks.
youtube · AI Governance · 2024-01-13T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugys9ps5CFV4gmErNkd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzWdQ60RAMF3KnZwUZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy1B0eUaeUZejlXERh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz1KhbsYQqvecqPhVB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyYZ-CgqZBTqd6Eyut4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxVX7nF5jvjE26nbZ94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzEMTfhxo0ZcjNUxm14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgymiyWcqyCrRxuW5pl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugwxku7pModp2Ufay3B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzduDLp47qy2EEgTgF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
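The comment-ID lookup described above can be sketched as follows. This is a minimal, hypothetical illustration (not the dashboard's actual implementation): it parses a raw batch response like the one shown and indexes the rows by `id` so a single coding can be retrieved. The two sample rows are taken from the response above.

```python
import json

# Two rows copied from the raw batch response shown above.
raw_response = """
[
  {"id": "ytc_Ugy1B0eUaeUZejlXERh4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyYZ-CgqZBTqd6Eyut4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
"""

# Index the batch by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment ID (raises KeyError if absent)."""
    return codings[comment_id]

print(lookup("ytc_Ugy1B0eUaeUZejlXERh4AaABAg")["emotion"])  # fear
```

Keying the parsed response by `id` rather than scanning the list on every inspection keeps lookups constant-time, which matters once many batch responses are loaded at once.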