Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "YES, People hate AI art BECAUSE its AI and it's completely valid to do so…" (ytc_UgwA_7Sdg…)
- "Prediction: The average idiot will still be abusive, aggressive, use violence on…" (ytc_UgwofGhRQ…)
- "Without fresh water 💦, life (humans, animals, plants) will die. So these idiots …" (ytc_UgxRqHQuS…)
- "Are the people in control becoming more liberal? CEOs seem like they aren't as t…" (rdc_o9piinx)
- "When AI was used for making deep fake voice of PM Modi for songs no one spoke ag…" (ytc_Ugwc-aAKk…)
- "This is just infuriating. This is why I personally think AI is doing more bad th…" (ytc_Ugx0fZqLE…)
- "This understanding of gpt-5 being way more powerful than gpt-3 and 4 is not corr…" (ytr_Ugy9JnZEG…)
- "i used to hate modern and contemporary art because i didn’t understand it. i sti…" (ytc_UgzVNKyPC…)
Comment
I am a 56-year-old, mostly retired analyst, and I quite often surprise people when I talk with them about Alfred Nobel. I can count on one hand the number of people who knew, before we discussed them, which inventions he is famous for and why the Nobel Prizes exist. The godfather of AI is not unique when it comes to regretting helping to create a catastrophic technology.
Nearly every person who created or helped to create a catastrophic technology has come to regret it long before their worst fears were realized. Nobel, Einstein, Teller, and Oppenheimer are among those who have.
youtube
Cross-Cultural
2025-10-03T08:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzYYjy-IF9DP4pdh9N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyouRv8JvyYhArxPip4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwcRkTAVO7E7ZGd6tB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw53mRZTIhdKgL4Led4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwSCqK9Bmj9kbozrER4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyrGEoNOzdFkmnoId54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzn3NO7iPcoxNBoFF54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw9cHdAv6La28bbbkJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzOqhL7vgLtv4yYAvd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwyyQO0lnLZ9281HPJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
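The raw response above is a JSON array of per-comment records, one per coded comment, each carrying the four dimensions shown in the coding-result table (responsibility, reasoning, policy, emotion). A minimal sketch of the "look up by comment ID" step might index such a batch by `id`; the function and variable names here are illustrative assumptions, not the tool's actual code, and the sample record is taken from the response above.

```python
import json

# Illustrative raw batch, shaped like the LLM response above
# (a JSON array of coded records keyed by comment ID).
raw_response = """
[
  {"id": "ytc_Ugw53mRZTIhdKgL4Led4AaABAg",
   "responsibility": "developer", "reasoning": "mixed",
   "policy": "unclear", "emotion": "resignation"}
]
"""

# The four coding dimensions from the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse the JSON array and map comment ID -> coded dimensions."""
    records = json.loads(raw)
    indexed = {}
    for rec in records:
        if "id" not in rec:
            # Skip malformed rows rather than failing the whole batch.
            continue
        # Keep only the expected dimension keys; default missing
        # dimensions to "unclear" (an assumption, not tool behavior).
        indexed[rec["id"]] = {k: rec.get(k, "unclear") for k in DIMENSIONS}
    return indexed

coded = index_by_id(raw_response)
print(coded["ytc_Ugw53mRZTIhdKgL4Led4AaABAg"]["emotion"])  # resignation
```

With the batch indexed this way, inspecting the exact coding for one comment is a single dictionary lookup, which is what the "Look up by comment ID" affordance suggests the page does.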