Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "If you ask the ai responsible for telling you about this video to tell you about…" (ytc_Ugz6roAIZ…)
- "Ok then. Whats an example - even just one (singe) example - of these millions …" (ytc_Ugy7wvxHI…)
- "I think AI needs to remain as a side tool for doing small stuff, like helping an…" (ytc_UgybmV4tZ…)
- "The authors of those those books want hype for the tech. Super intelligence is n…" (ytc_UgwMswGwJ…)
- "But AI can't do that. It can't understand requirements and it can't make code th…" (rdc_kiuskci)
- "With this rise of AI, I sincerely hope that means that GTA VII wouldn't take too…" (ytc_Ugy35r2T8…)
- "Any developer is fool to think Ai will take their jobs. Even if you compare huma…" (ytc_UgwTCH0_j…)
- "AI will remain at the basic level of animals . . . It will never have or underst…" (ytc_UgyCDHOBp…)
Comment
A lot of people claim that a general AI would be able to do so much that a universal income system would be developed. But it seems more likely that whoever develops the AI first will consolidate more wealth and power than anyone else on earth, and it will always be difficult to convince the majority of politicians that universal income can be paid for by taxing those ultra wealthy who get rich off of all the people who can barely afford basic living expenses. I don't know if the sci-fi examples of the singularity AGI are even possible, and I don't think intelligence can be measured the way many tech CEOs describe it. Anyway, Neil had a physicist on StarTalk just a day or two ago who had a lot to say about this, and it was an excellent episode. I don't remember his name, but his book had a long title that started with More Everything... and it's all about critiquing the rise of AI and other big tech billionaire goals, like going to Mars.
Source: youtube · Video: AI Moral Status · Posted: 2025-12-08T23:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_UgxToSmdUI55Ar7oCyN4AaABAg.AKzeOlz8MfYAL1hz6Eq0gN","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxToSmdUI55Ar7oCyN4AaABAg.AKzeOlz8MfYAL4D4tDlnkC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxCPSoh3LipNk7QAet4AaABAg.AKyWbPbFs_dAKykwDf9bo1","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugxc4B6bCl5g9HaoPgl4AaABAg.AKyKkxQx2txAKyL7qS3Dmr","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgyWjAIrzOVVmRWx1Qh4AaABAg.AKy-ksyCO6wAKz9w5j_mYj","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgyWjAIrzOVVmRWx1Qh4AaABAg.AKy-ksyCO6wAKzC6B_qDAH","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgyWjAIrzOVVmRWx1Qh4AaABAg.AKy-ksyCO6wAKziGO9VFoI","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgyWjAIrzOVVmRWx1Qh4AaABAg.AKy-ksyCO6wAL0I-irX6yR","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgxwELpb3zk4KZ5kEjJ4AaABAg.AKxu9YWA1gNAQUtA5Tkv5p","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgzO4z7a2S9DzFtR3dt4AaABAg.AKxplsvWF7tAKy9M7E7PSU","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
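The raw response above is a JSON array of per-comment codings. A minimal sketch of how such a batch could be parsed and indexed by comment ID, validating each row against the dimension values that actually appear in this data (an assumption — the full codebook may allow more values than are visible here):

```python
import json

# Allowed values per coding dimension, inferred only from the responses
# shown above; the real codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "distributed", "unclear"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed", "resignation"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response (JSON array) and index codings by comment id.

    Rows containing a value outside the known codebook are skipped rather
    than trusted, since the model is not guaranteed to stay on-codebook.
    """
    by_id = {}
    for row in json.loads(raw):
        dims = {k: v for k, v in row.items() if k != "id"}
        if all(v in ALLOWED.get(k, set()) for k, v in dims.items()):
            by_id[row["id"]] = dims
    return by_id
```

With this index, the "Coding Result" table shown above is just a lookup, e.g. `index_codings(raw)["ytr_UgxwELpb…"]["emotion"]` would return `"fear"` for the displayed comment.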