Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- `ytc_UgyRHGReT…` — "To sum this up they argue that AI won't take all the jobs. That's just stupid, A…"
- `ytc_UgxqtZUY4…` — "And now look at the Stargate project the government is jumpstarting with OpenAI.…"
- `ytc_UgwRmHYhK…` — "AI is the end of humanity. Like all species, we will be extinct as well, and the…"
- `ytc_UgxklF73B…` — "Digital artist here. Your representation of digital art as a medium is quite on-…"
- `ytr_UgyCOL537…` — "Oh, and I forgot to point out, the greatest amount of AI is currently residing i…"
- `ytc_UgzYSvZnV…` — "Driverless cars were demonstrated a lot longer ago than 10 years ago. There's a…"
- `ytr_UgxhcYxcy…` — "Deepfakes are going to be yet another Trojan Horse the government will use to ta…"
- `ytc_UgzuVPJdh…` — "Whose gonna buy if no one works? Then what reason AI is needed? To create what…"
Comment

| Field | Value |
|---|---|
| Text | We ALL know this could happen, but people still made robots because well the consumption of constant consumer needs will never stop, nothing will ever be good enough. People will always need the newest clothes, kitchen wear, make up, cars, AI and more. We are our own worst enemies |
| Platform | youtube |
| Source | AI Harm Incident |
| Published | 2024-05-21T13:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugynn3inHgiOH9z9x494AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx-84Pn8L34VKccSrd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzkfH4QhPu8Sndl2zx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwIKdF93-yxTn7_WjF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzVJkpJMTyOm8wO_a14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxYbCD3_rWo4-Jne6J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwCvhyXAEiKZFCxIe14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyPgv3E_Qh7BHsyJp54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz3oiFM5_IbKHUQYqN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxme13F1wNmS8easO54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
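A raw response like the one above is only usable downstream if every row carries a valid value for each of the four coding dimensions. The following is a minimal Python sketch of such a validation step; the allowed value sets are inferred solely from the sample output shown here (the full codebook may define additional categories), and the function name `parse_codings` is illustrative, not part of any actual pipeline.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# The real codebook may define more categories than appear in this sample.
ALLOWED = {
    "responsibility": {"distributed", "none", "company", "unclear",
                       "ai_itself", "developer", "user"},
    "reasoning": {"virtue", "unclear", "deontological", "consequentialist"},
    "policy": {"none", "unclear", "industry_self"},
    "emotion": {"resignation", "indifference", "fear", "approval", "unclear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Drop anything that is not an object with a comment ID.
        if not isinstance(row, dict) or "id" not in row:
            continue
        # Keep the row only if every dimension has a recognized value.
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"virtue","policy":"none","emotion":"fear"}]')
print(parse_codings(raw))  # one valid row survives
```

Rows with an out-of-vocabulary value (a common LLM failure mode) are silently dropped here; a production pipeline might instead log them for re-coding.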