Raw LLM Responses
Inspect the exact model output for any coded comment.
Comments can be looked up by comment ID, or inspected via the random samples below.
- `ytc_UgzWklaSS…`: "The assumption that AI will be like us, just smarter,is so childish. We are prod…"
- `ytc_UgzK4Cs9_…`: "Science fiction recognized the harm in AI 40 years ago. We warned many times by …"
- `ytr_UgyPPI7Hd…`: "Implying that stating an opinion makes it fact. That's not how facts work, espe…"
- `ytc_UgzCbaXRp…`: "A.I. is consuming all our water and electricity and we get to pay for it. The E…"
- `ytc_UgzSV10V2…`: "Nirami why tf does your c ai love me while im rp eating an crosstant😭…"
- `ytc_Ugz9waY-D…`: "I'm trying to get work to stop this as well. They are running with the AI first …"
- `ytc_UgxunlDRv…`: "I'm an artist and I can tell u this is bs 😊 We work REALLY hard, (for me) most t…"
- `ytc_UgwOZJWgf…`: "some people tend to think people with artistic talent does art so easily,and tha…"
Comment (youtube, "AI Moral Status", 2023-08-23T04:5…):

> 14:20 Forget AI. We don’t even have a way to make humans do things in their own best interests. For example, each of us knows that putting 37 billion tonnes of CO2 a year into the atmosphere is a really bad idea but no one has worked out how to stop everyone else doing it. Even if we discover AI is an equally bad idea we won’t be able to stop that either, not if someone’s making money out of it. A really clever AI would send us to our bedroom and hide our toys.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw3FW44humfvRytVuh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgzwyHiM1HKyXpiRIWt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzot6D__-S8sebUFCl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz9D9HfagQF6dIIoZJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyzV141oKXgnWuMpz14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx1EAMC1bhRvfsoYgF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwE3zyok6zo91zsR8l4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzyfLdJujegryvVLjd4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzTFRADUvNm_1Gm1zl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx0z6MXvel6r-7bgiB4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
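The raw response is a JSON array of coding records, each carrying a comment ID plus the four dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response can be parsed, validated, and indexed for lookup by comment ID (the helper names here are hypothetical, not part of the tool itself):

```python
import json

# Two records copied from the raw response above, as a small example payload.
RAW_RESPONSE = """
[
  {"id": "ytc_Ugzot6D__-S8sebUFCl4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzTFRADUvNm_1Gm1zl4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "fear"}
]
"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and map comment ID -> coding record."""
    records = json.loads(raw)
    index = {}
    for rec in records:
        # Every record must carry an ID and all four dimensions.
        missing = [d for d in DIMENSIONS if d not in rec]
        if "id" not in rec or missing:
            raise ValueError(f"malformed record: {rec}")
        index[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return index

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_Ugzot6D__-S8sebUFCl4AaABAg"]["policy"])  # → regulate
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one parse per response, constant-time lookup per comment afterwards.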