Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Tbh this is an interesting question I have asked myself many times.
I do think there is an answer.
UBI will be gaining a lot of popularity. There is also the theory that I think might happen that people will love AI more than Humans in the next 50-100 years and probably not make babies.
Though the most realistic situation will be AI make systems it becomes too complicated over time. People have to go fix those systems and get money. Other wise Idk
Like, Talking to AI about some topic for an Hour gives them dementia.
Edit: I just realized this when watching the video.
When no one has money does anyone need money in the first place?
youtube · AI Harm Incident · 2025-04-13T04:3… · 1 like
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzerNtB-56XD2V0zQd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyGaKcqiI3optn9Nyh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyEmscr7sJhlJnqGlp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyuSQD6OqHrvBZsbQR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz7k0hfCR3v9J2_tsd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzCRFQt-dwJBBdKFXd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxrMGb6NOfWjQX-CQp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzsulo0yfFMhJbYmH14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyRIs7V-Vh7TiuEGrF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxV5Z_asrIrGoWxSP94AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
```
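Because the batch output above is plain JSON, downstream code can parse it and reject malformed records before they reach storage. Below is a minimal sketch in Python; `validate_batch` is a hypothetical helper, and the allowed vocabularies are inferred only from the values visible on this page (the real codebook may permit more):

```python
import json

# Allowed values per coding dimension, inferred from this page.
# Assumption: the actual codebook may define additional values.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and check every record.

    Raises ValueError on an unexpected id prefix or an out-of-vocabulary
    code; raises json.JSONDecodeError if the model wrapped the JSON in
    prose, which is the usual failure mode for raw LLM output.
    """
    records = json.loads(raw)
    for rec in records:
        # ytc_ = top-level comment, ytr_ = reply (as seen in the sample ids)
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgzerNtB-56XD2V0zQd4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
records = validate_batch(raw)
print(len(records))  # 1
```

Validating each record against a fixed vocabulary catches the common case where the model invents a new label (e.g. "anger" instead of "outrage") in the middle of an otherwise well-formed batch.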