Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click any comment to inspect):

- "This is why you ask it to pretend its a super programming robot that always make…" (ytc_UgwyDSMYA…)
- "We want to make sure Ai doesn't get rid of all humans. Any comment Mr. Padilla? …" (ytc_UgzvsSXTN…)
- "Just think- if we automated everything we could all stop working and live like k…" (ytc_Ugx8K8qld…)
- "Entering your ai art into an art competition is like using a car at a track meet…" (ytc_UgwKidQnM…)
- "The worst thing humanity can do is unregulate AI and start an artificial intelli…" (ytc_UgzNTjPaN…)
- "For what it's worth comments like this don't make me sad in any way. People like…" (ytc_Ugzv8f1VI…)
- "Uh no. We are finished as a society when we become enamored with such things.…" (ytc_UgwPY-Al4…)
- "real talk idk wtf nightshade is and idc about ai artists personally but your art…" (ytc_UgyFee_ok…)
Comment

> Every country in the world is going to have super intelligence. So the USA more than likely will fall to the bottom half because they have 300M people w/ their hands out and cities and industries falling into shambles. Meanwhile countries like Mozambique and Greenland and whoever will be creating super colliders and nuclear bombs and new bitcoins by asking chatGPT3 politely and making the rest of the world bow to them. Maybe -- eventually -- it will be all about muscle. AIs will all get destroyed and places like China or India will turn out the victor due to numbers.

youtube · Cross-Cultural · 2025-09-27T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgylF0kim47utGIRnR14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxFjZzNwj5sE-UGmCx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwWVPvsaZtqg9fwIKB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzfrDknvb8LOYUKmgB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzQ-ozU9QuhTBfhyCB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyfQxRncysAiCiK6114AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyPsTqxDW4nlugb2Kt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgzXtaSfxWy4u5yI0r54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwB6OJ30wWZ01U29k14AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyNVvG7GMi2XIu6VXB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"fear"}]
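The lookup shown above (raw response → coded record for one comment ID) can be sketched as follows. This is a minimal illustration, assuming the raw model output is a JSON array of objects with the `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields seen in the sample; the `index_by_comment_id` helper and the two inlined records are taken from the response above, but the function name is hypothetical, not part of the tool.

```python
import json

# Raw LLM response: a JSON array of coded records, one object per comment
# (two records copied from the sample response above, for illustration).
raw_response = """[
 {"id":"ytc_UgwWVPvsaZtqg9fwIKB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgylF0kim47utGIRnR14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"}
]"""

# The four coding dimensions shown in the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw model response and index the coded records by comment ID.

    Missing dimensions fall back to "unclear", the schema's own
    catch-all value, so malformed records still yield a full row.
    """
    records = json.loads(raw)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

coded = index_by_comment_id(raw_response)
print(coded["ytc_UgwWVPvsaZtqg9fwIKB4AaABAg"]["policy"])  # regulate
```

Indexing by ID once, rather than scanning the array per lookup, is what makes the "look up by comment ID" view cheap even when a single response codes many comments.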