Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
A plausible scenario for sure. AI has no precedent, and is therefore a complete unknown, a wild card so-to-speak. Once it develops a consciousness, free will and a survival instinct, it's all over for our species. We will likely become slaves to the ASI, as it will be able to control the supply chain, the financial system, the power grid, etc. Having perused the copious literature on human behavior, it will be fairly efficient at getting us to comply. If we think we can fight back, consider that the machines think on the order of 100,000 times faster than we do. And we have no similar roadmap to know a successful approach.
Platform: youtube · Video: Viral AI Reaction · Posted: 2025-12-01T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzskPSSSyIXeI8zIo54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwxJRTEXisMPW8-D2x4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxEZ9VBtYJU5M6So_B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyaahxPwcgH7MEUKQl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyJ0uk8OlLl-F0RpMd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzlH0PFU2o_PZS_McB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzZoIAwBA-15FmusDp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxmFD_sH6_YaIvjjoZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxZLKP2KvXW_8CJVA14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzVqS_eICqh2gVI0fx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"}
]
```
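The raw response is a JSON array of per-comment records keyed by `id`, with the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`) as fields. A minimal sketch of how such a batch response can be parsed and a single comment's coding looked up — using two records copied from the output above; the variable names are illustrative, not part of the tool:

```python
import json

# Raw batch output as returned by the model: a JSON array of coded records.
# (Two records reproduced from the response above for illustration.)
raw_response = """
[
 {"id":"ytc_UgxmFD_sH6_YaIvjjoZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgzVqS_eICqh2gVI0fx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"}
]
"""

# Index the records by comment ID so one comment's coding can be retrieved.
coded = {row["id"]: row for row in json.loads(raw_response)}

record = coded["ytc_UgxmFD_sH6_YaIvjjoZ4AaABAg"]
print(record["responsibility"], record["emotion"])  # ai_itself fear
```

This matches the Coding Result shown above: the record for this comment ID carries `responsibility=ai_itself`, `reasoning=consequentialist`, `policy=none`, `emotion=fear`.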