Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
So many fearful reactions. We should be embracing technologies enabling performa…
ytc_UgzNJw1zk…
"There are a lot of movies about why AI is a bad idea" Read the book I have no m…
ytc_Ugxg6dUlY…
I was learning drawing and sculpting all my life, Promters may have some imagina…
ytc_UgzqJ0u8Z…
If you "artists" get your way on the copy right issue, it's going to be worse th…
ytc_UgzyUbI1x…
Yeah the metrics they’re boasting aren’t necessarily the right ones. I work in a…
rdc_jrpycmf
Yeah, the facial recognition isn't *that* bad as to match someone of a completel…
rdc_gheambo
"You have their data available to you."
Not meaningfully! Humans can stare at t…
ytc_UgwS8qojC…
Don't think the remote tribes in some deep forest areas would be affected by AI,…
ytc_UgxwCjpvc…
Comment
This scenario can't happen as described for two main reasons:
1) 70% of the US GDP is driven by consumer spending. You fire enough people, and the economy is going to crash. End of story. Machines and AI don't shop.
2) Humans have one power AI doesn't have: the right to vote. A government that allows mass firings, offers no solutions, and rejects UBI will surely be voted out in favor of an anti-AI candidate.
In addition to those two reasons, you can expect civil unrest and riots to increase exponentially as more and more people lose their jobs to AI if there is no government-planned soft landing.
youtube
Viral AI Reaction
2025-11-23T01:4…
♥ 45
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyeNUlSN2TzsHUSW9l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzrQzDTOXTuiM35o4x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzOU0EtOmkx1_Q_nA54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw86xJQrUwPeZuusel4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwUKj041-tLgkKyROx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwXru13K6ZAviMgv014AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxsG0z6BZINQI4Urp14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyPTnOcXVJRIOsts2t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgxmvZcLPMDp5BNe0n94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwVUntN7YRtHBAZ9bl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
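A raw response like the one above can be validated before its dimensions are stored. The sketch below is a minimal, hypothetical example: the allowed value sets are inferred only from the rows shown here (the coder's actual schema may include more values), and the helper name `parse_coded_batch` is an assumption, not part of the tool.

```python
import json

# Allowed values per coding dimension, inferred from the sample output
# above (assumption: the real schema may contain additional values).
SCHEMA = {
    "responsibility": {"none", "developer", "government", "company", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "none", "regulate", "liability"},
    "emotion": {"fear", "outrage", "indifference", "approval", "resignation"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows that validate.

    A row must carry a comment "id" and a recognised value for every
    coding dimension; anything else is silently dropped.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid
```

In a pipeline, dropped rows would typically be queued for re-coding rather than discarded, so that every comment ID ends up with a complete set of dimensions.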