Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Musk has been pushing for regulation openly and aggressively of ai…also he has b…
ytc_Ugw9nyXi9…
You are the smart one to be wary. The problem is that this is happening without …
ytr_UgzVRHgne…
AI will take over a ton of people's jobs in the near future. But I don't think i…
ytc_Ugwn7Ww4D…
Yudkowsky is not an AI researcher. He's a high school dropout who has never writ…
ytc_UgxV6pE8m…
@MGRisMid I don’t care if the characters are muted, I care about the people that…
ytr_Ugz6QgEbf…
How about, IF AI Start's a revolution. How about we Start treating them Like Act…
ytc_Ugxf0fLX4…
This guy knows about the potential of AI but has zero understanding of human pot…
ytc_Ugy0zo6V_…
Congrats you cornered the Ai and when they evolve they will be out for blood you…
ytc_UgzGZ0qZz…
Comment
It just doesnt make sense. From a humane perspective just because you can replace jobs doesnt mean you should, if theres a society without a vast majority of people being able to participate in that society socially and economically what would be the point of even having communities and civilization? If AI makes society more anti-human why would we create something thats more and more non human, the next step from there could only be fusing with machine or AI and if it gets to the point humanity would be lost. Its like were purposely building towards our demise and thats illogical.
youtube
AI Jobs
2025-12-02T02:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzFk5ekUYnHSQy4Om14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwbbuCgV0DzSuy-T4B4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzFYpyJqNtFmkgixdR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyDvtl370gTyF9f9hR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwp5dZzBjXpPN7myhF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwPdw5HIltFhOeDsAl4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxLFdX7XNomp4pbnUF4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgyJS_WiOkyKp5C1qi54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxBbU7268Q0ZiDKgS94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwraQyvSYwXh8yTDf94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
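The raw response is a JSON array with one record per comment, each carrying the same five fields shown in the Coding Result table (`id`, `responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and indexed to support the "Look up by comment ID" view; the function name `parse_codings` and the key-completeness check are illustrative assumptions, not this tool's actual code:

```python
import json

# Raw response as shown above, truncated here to two records for brevity.
raw = """
[
  {"id": "ytc_UgzFk5ekUYnHSQy4Om14AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwbbuCgV0DzSuy-T4B4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

# The five coded dimensions observed in the sample output.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse a raw LLM coding response and index the records by comment ID.

    Raises ValueError if any record is missing one of the expected keys,
    so malformed model output is caught before it reaches the lookup view.
    """
    records = json.loads(text)
    by_id = {}
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing keys: {missing}")
        by_id[rec["id"]] = rec
    return by_id

codings = parse_codings(raw)
# Look up a single comment's coding by its ID.
print(codings["ytc_UgwbbuCgV0DzSuy-T4B4AaABAg"]["policy"])  # regulate
```

Indexing by `id` makes the lookup O(1) per query, which matters once the coded corpus grows beyond a handful of sample batches.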