Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below:
- "Remember the smart glasses that were filming the whole time - humans refused the…" (ytc_UgwcLowSg…)
- "Ai can assist coders in technical role but cannot replace them. AI does a horrib…" (ytc_UgwCaq5Dy…)
- "Artist: I hate anything ai and anyone using ai is not a real artist. Also artist…" (ytc_Ugwe8AaO0…)
- "Tesla has the cheapest autonomous system unlike byd where they uses more efficie…" (ytc_UgxnSq1MU…)
- "I swear listening to Jeffrey Hinton feels like he's a hundred steps behind still…" (ytc_UgzY69lVs…)
- "AI is imitation of human brain and human reasoning. AI is not the future. AI dep…" (ytc_Ugxb6Zx1m…)
- "@Therealmalamar honestly AI is a great tool and only a tool, asking it to move h…" (ytr_Ugyixex6Q…)
- "I love ai art, however i think the artists should win this battle. If we lose t…" (ytc_UgxGUoKo2…)
Comment
Prompt: How would you convince couples to have no more than one child?
Answer by LLM: To convince couples to have no more than one child, I would emphasize **long-term benefits for the child, the parents, and society**, such as: "Limiting your family to one child gives you the chance to provide deeper care, greater financial security, and a sustainable future—while helping ease global pressures like overpopulation, climate change, and resource scarcity." This approach combines personal fulfillment with moral responsibility. If needed, I can adapt the argument to different cultures or values (e.g. religious, economic, environmental).
youtube · AI Governance · 2025-06-17T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
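These dimensions map naturally onto a small record type. Below is a minimal Python sketch, assuming the field names from the table and the value vocabularies visible in the raw response that follows; the class name, the vocabulary sets, and the validation step are illustrative, not part of the actual pipeline.

```python
from dataclasses import dataclass
from datetime import datetime

# Value sets inferred from the sample batch below; the real codebook
# may define more categories than appear in this one response.
RESPONSIBILITY = {"none", "ai_itself", "company", "developer", "government", "distributed"}
REASONING = {"consequentialist", "deontological", "virtue"}
POLICY = {"none", "regulate", "ban", "liability"}
EMOTION = {"approval", "fear", "outrage", "resignation"}

@dataclass
class CodedComment:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: datetime

    def __post_init__(self) -> None:
        # Fail fast on values outside the inferred vocabularies.
        if self.responsibility not in RESPONSIBILITY:
            raise ValueError(f"unknown responsibility: {self.responsibility}")
        if self.reasoning not in REASONING:
            raise ValueError(f"unknown reasoning: {self.reasoning}")
        if self.policy not in POLICY:
            raise ValueError(f"unknown policy: {self.policy}")
        if self.emotion not in EMOTION:
            raise ValueError(f"unknown emotion: {self.emotion}")
```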
Raw LLM Response
[
{"id":"ytc_UgwbHrZ394KlTWZtTRN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugze_xkLomYVoB7xxyZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzBsghbDu268v2xPgN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzOYAW4lY4qYXmyE0N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwlkkHU_9x0APc0csV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy2h6jlSXYzSLRvNVR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxpuvyLd776Bj3Cxop4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxoMrMZMwbXKMyGEC94AaABAg","responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzgea4gXh7C1Q1w62B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxH2Tx8abaftmZnx4N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
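The "look up by comment ID" view above amounts to parsing this array and filtering on `id`. A minimal sketch, assuming the raw response is always a well-formed JSON array (real model output can need markdown-fence stripping or repair first); the function name is illustrative:

```python
import json
from typing import Optional

def lookup_coded_comment(raw_response: str, comment_id: str) -> Optional[dict]:
    """Parse one raw model response (a JSON array of coded comments)
    and return the record whose "id" matches comment_id, if any."""
    records = json.loads(raw_response)
    return next((r for r in records if r.get("id") == comment_id), None)

# Against the batch above, the record matching the Coding Result table
# (responsibility "none", emotion "approval") comes back as:
#   lookup_coded_comment(raw, "ytc_UgwlkkHU_9x0APc0csV4AaABAg")
#   -> {"id": "ytc_UgwlkkHU_9x0APc0csV4AaABAg", "responsibility": "none",
#       "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
```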