Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Why fight this.
Tom Cruise could license his likeness from his mid 30's for hims…
ytc_UgzZiUior…
My ai chats are... Seemly innocent, they start out normal then... you get the po…
ytc_UgyNT5sZS…
So what I'm gathering is this is a way for companies or individuals to take othe…
ytc_UgyST3K8p…
My heart goes out to this young geniuses family may he rest in peace. Things wer…
ytc_Ugz6gUB5A…
You can't control AI.. AI will take over everything even your privacy.. you will…
ytc_UgwISQoIe…
Oh o,were in big trouble if the law hires these! The fact thst this robot is aut…
ytc_Ugz2hYhvM…
Perhaps the world is better off with Ai
The earth is in urgent need to heal…
ytc_UgwwB9onG…
Bull on climate control control control control
Kill kill kill no with AI and …
ytc_UgwWwjXch…
Comment
I think we're a horrible species too and a sin on this planet but their is some I mean some potential in us though. I would actually prefer instead of reaching A.I right from wrong & morality etc. ( though how can we tell A.I to learn morality when we don't do it ourselves?) But we should teach it to find ways to make us a more non violent/ judgemental species without destroying the economy and mutating our brains though.... like help build a bridge or give us a nudge towards being more respectful of each other and our planet. Use money efficiently till we can find an alternative system. Help lower our population by not killing us, but other methods.
youtube
AI Governance
2023-07-07T05:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzz_K2JA371d5OzR0d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxHUhmSSlGuGXMO_OF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxzCAtI4DemFEhBxpR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxpxhv47ew_MiSYJvV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxjgMmp7SKCQ2aF9lh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugy77Rty_jNaXXLsY254AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxwbjUz57RXVXPZ1-h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzdgaui4XT0yKGvB2d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugybm9jy3B7xF19CWRN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzx0L40IZfpO50bAFx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
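The "Look up by comment ID" step above can be sketched as a small parse-and-index pass over the raw model output. This is a minimal illustration, not the tool's actual implementation: `index_by_comment_id` is a hypothetical helper name, and the two records are copied verbatim from the raw response shown above.

```python
import json

# Excerpt of the raw LLM response shown above (two of the ten records).
raw_response = """
[
  {"id":"ytc_Ugzz_K2JA371d5OzR0d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxjgMmp7SKCQ2aF9lh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"mixed"}
]
"""

def index_by_comment_id(response_text):
    """Parse the model's JSON output and index coded records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

coded = index_by_comment_id(raw_response)

# Looking up one ID returns its coded dimensions; this record matches the
# Coding Result table above (developer / virtue / regulate / mixed).
record = coded["ytc_UgxjgMmp7SKCQ2aF9lh4AaABAg"]
```

A dict keyed by ID makes each lookup O(1), which matters when spot-checking individual comments against a batch of coded output.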