Raw LLM Responses
Inspect the exact model output for any coded comment.
Responses can be looked up by comment ID, or browsed through random samples such as the following:
- "So I'm kinda interested in CRASH. Computers that learn from previous hacking att…" (`rdc_dy5enza`)
- "As if my therapist couldn't do the same / Actually, whatever your input in chatgpt…" (`ytc_UgzZDlGTb…`)
- "One of the characteristics of humanity is that most humans are imbeciles. Our br…" (`ytc_Ugy0hMXzN…`)
- "Polar bears are dying for ts btw 😔 / MORE THAN 5M WATER USED PER DAY BC OF AI…" (`ytc_UgxKJdSHL…`)
- "A.I could potentially eliminate humanity from the face of the earth. In this poi…" (`ytc_UgyRXTa1J…`)
- "@Ekmos- I'm gonna finish this convo because you clearly don't get the problem wi…" (`ytr_UgzX0TIOI…`)
- "@deer-moss You mean like the hoards of slop the supposed 'art community' has pro…" (`ytr_UgzdP8d1q…`)
- "yeah whatever, just put it in my brain already, i will help the AI make the best…" (`ytc_UgxofOsRg…`)
Comment

> For me it is very simple. We have a group of developers and designers that want the best for humanity, but they will always be mistreated and ousted by management and the business people. Thus, AI will always remain a huge threat because humanity will NEVER be at the forefront. Profit and the highest bidder will always win. AI will be our end and the last people that are going to be enjoying our time here on Earth are the people that made billions off it

| Source | Topic | Posted | Likes |
|---|---|---|---|
| youtube | AI Governance | 2024-01-13T14:0… | 2 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzXbXo99g2KkE_gmih4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugw8mf3wwNN73_MuxrN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxNCP5-OEvlV-wPNjN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxdmL1offl8CqwRhUF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx5Rn4ZVMXgA1qL-Rl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzxMSHO2DVQlbLrxEt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw09_p8l-gM7DnRt5B4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy2pOxtj1v9oiVrOfx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx1w6eo6PErp5rXuip4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugywj8af_dw2tecl7N54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
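The raw response is a JSON array with one record per comment, each carrying the four coding dimensions. A minimal sketch of how such output might be parsed and validated before storage — note the allowed category sets below are inferred from the values visible in this sample, not the full codebook, so treat them as an assumption:

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# The real codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "mixed"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse the model's JSON array, keeping only records whose
    dimension values all fall inside the allowed sets."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Hypothetical record for illustration (not a real comment ID):
raw = '[{"id":"ytc_example","responsibility":"company","reasoning":"virtue",' \
      '"policy":"regulate","emotion":"outrage"}]'
print(parse_raw_response(raw))
```

Filtering at parse time keeps out-of-schema labels (a common LLM failure mode) from silently entering the coded dataset; rejected records can instead be queued for re-coding.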