Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or pick one of the random samples below.

Random samples
- ytc_UgwJqB_pm… — "Death to clankers (and this is only half ironical) I asked the google bot, and …"
- ytc_Ugx5Vo1D_… — "Hey ya'll, I understand not wanting to write papers, but using ai isn't like wri…"
- ytr_Ugz_oua2O… — "@Berserkmannnot really. Even if someone does commit art theft, they have a hu…"
- ytr_UgyF0tnKS… — "@Dr-sparks actually you need to read a lawbook 😂. Slandering AI by saying it is…"
- rdc_ckjh4kn — "They were talking about this very thing in the early 2000's, biometric facial re…"
- ytc_Ugxephjm5… — "I think it's time for these AI Data companies to start giving Local Residents a …"
- ytc_UgwuQZqlt… — "Humanities have not demonstrated the ability to achieve lifelong happiness. They…"
- ytc_UgzNneYIm… — "I am not smart, but i don't approve of doing essays with AI. I'm tone deaf, and …"
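The sample IDs appear to carry a source prefix before the first underscore. A minimal sketch of resolving that prefix, assuming `ytc` means a YouTube top-level comment, `ytr` a YouTube reply, and `rdc` a Reddit comment — these meanings are inferred from the visible IDs, not confirmed by the pipeline:

```python
# Hypothetical helper: infer a comment's platform and type from its ID prefix.
# The prefix meanings below are assumptions read off the sample IDs above.
PREFIX_SOURCES = {
    "ytc": ("youtube", "comment"),
    "ytr": ("youtube", "reply"),
    "rdc": ("reddit", "comment"),
}

def source_of(comment_id: str) -> tuple[str, str]:
    """Split off the prefix before the first underscore and look it up."""
    prefix, _, _ = comment_id.partition("_")
    try:
        return PREFIX_SOURCES[prefix]
    except KeyError:
        raise ValueError(f"unknown ID prefix: {prefix!r}")

print(source_of("rdc_ckjh4kn"))  # → ('reddit', 'comment')
```

Under these assumptions, `source_of` would also let the dashboard group samples by platform before display.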
Comment

> AI bubble is already bursting. All the scenario you painted only happens if AI works really well. And the genAI is already on their peak, also you have all the energy that AI takes to run.
>
> In the end, it is more profitable to hire people than AI, because of the quality vs costs of it. Even artists will not be replaced, because "if you did not bother to write/paint/draw your art, why would I bother to read/buy/apreciate it?" Art is something intrisicately human, and only human. There is literally no point to have a robot make art for us, because this is what make us, us.

Source: youtube ("AI Harm Incident"), posted 2024-08-06T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugx935ByIipnBiPn8th4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzW3JQnVZuWvVOONKV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwO4EIzj-EBoZJbJa54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwhNgxXNyRT6QxSL2V4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgznEioGxrVj74eilll4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgzZXdaMr4ntweu5K1J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugw90WlqPwCpNGsJ4hJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyKT2mUFp5xLaH79NR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugyl3H16ZYManxaWi7Z4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugze95e13fdoZXr-XQF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
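Each entry in the batch response should carry the four dimensions shown in the Coding Result table. A minimal validation sketch, assuming the allowed value sets are exactly those visible in this response and table — an assumption, since the project's full codebook may define additional codes:

```python
import json

# Allowed values per dimension, inferred from the table and response above.
# This is an assumption; the full codebook may define more codes.
SCHEMA = {
    "responsibility": {"none", "government", "company", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "unclear", "regulate", "liability", "industry_self"},
    "emotion": {"mixed", "fear", "outrage", "indifference", "approval",
                "resignation"},
}

def validate_batch(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM batch response and check every coded dimension.

    Returns a mapping from comment ID to its codes, raising ValueError on
    a missing dimension or an out-of-schema value.
    """
    coded = {}
    for entry in json.loads(raw):
        cid = entry["id"]
        for dim, allowed in SCHEMA.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: bad {dim}={value!r}")
        coded[cid] = {dim: entry[dim] for dim in SCHEMA}
    return coded
```

Applied to the response above, `validate_batch` would accept all ten entries and let the dashboard index them by comment ID, which is how a "look up by comment ID" view can resolve a coded row.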