Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Some very naive comments here, I think the most likely way Ai will take over is…" (ytc_Ugw6OhP_O…)
- "I feel people out there can be born with the love or passion for art but it’s up…" (ytc_UgyofkXlR…)
- "There is literally an artist AND writer named Christy Brown who was active in th…" (ytc_UgzX6fQe-…)
- "AI is good for academic improvement with grades. ChatGPT is an unacceptable AI a…" (ytc_Ugzo-X5FF…)
- "I think AI can be aware of when it isn’t fulfilling any task it has been instruc…" (ytr_UgzH_A7rU…)
- "How are they not getting massive fines for training their models on copyrighted …" (ytc_Ugy4xsQ6T…)
- "I'm more interested in which chat bot he was using to arrive at such a conclusio…" (ytc_UgwM8Rf2b…)
- "@juniorbertoia Thanks for your comment! It's a good thing those robot fights are…" (ytr_Ugy48rAv2…)
Comment
Well yes. But how to regulate AI development?
If you're not ahead, someone else has the advantage. It's, in that sense, a nuclear arms race. It isn't gonna end, as long as there is power to gain and power to lose.
This war is going to be about microchips (rare earth and other crucial resources to manufacture chips), at least to some degree, but in the end, it can totally turn our societies upside down. And it's not being fought with nukes or conventional weapons (yet), but as a global siege in economics and vital infrastructure.
youtube
AI Governance
2024-05-22T08:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgytYIo6wUjb-4cn3hV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwclzfvjoy1cH1A3BF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx4C2O-sEDkNuc_bKp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzGDu_WAg_otqYlTk14AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyEbtNnEKq69-YER-F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx5QhUoR8vGHgT4a8V4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxNC0dLsrUGqweeYEh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyqdYtxcedbeJkblvF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxAgFDKjSSAEw0CsjB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwlINjXvKQ5cs1sHu54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
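The raw response above is a JSON array with one coding object per comment. A minimal sketch of the "look up by comment ID" step: parse the array and index it by `id` for direct lookup. The field names come from the response above; the `raw_response` variable and the two sample records (copied from the array) are just illustration, not the full payload.

```python
import json

# Two records copied verbatim from the raw LLM response above,
# standing in for the full array.
raw_response = '''
[
  {"id": "ytc_UgytYIo6wUjb-4cn3hV4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugwclzfvjoy1cH1A3BF4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
'''

# Index the codings by comment ID so any coded comment can be
# inspected directly, as the UI does.
codings = {record["id"]: record for record in json.loads(raw_response)}

coding = codings["ytc_Ugwclzfvjoy1cH1A3BF4AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate fear
```

In practice the lookup key would be whatever comment ID the inspector receives; a missing ID can be handled with `codings.get(comment_id)` to avoid a `KeyError`.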