Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `rdc_n7vd4b0`: Shaming people for shaming people for using AI is also part of the problem. How…
- `ytc_Ugw9cm40I…`: If companys have to pay big taxes on AI replacing jobs then that money should pa…
- `ytc_Ugwyor3bq…`: I heavily disagree with the sentiment Pollock bad. His art looks "fine" and has …
- `ytc_UgxeVF3QO…`: Humans cant even decide what our own interests are, how can we expect an AI to d…
- `ytc_UgwoMxGmD…`: How far can they open there mouth? Is it soft inside? Asking for a project.…
- `ytc_UgxVA9S_n…`: I always get so caught up in my annoyance with AI that I use a lot of general st…
- `ytc_UgzYx8QkE…`: I'm all pro-AI and Pro-ChatGPT, but boy, make sure you do your due diligence and…
- `ytc_UgziAP2te…`: This is super vague and he clearly doesnt give any info on these for a reason. A…
Comment
If these stages are indeed real and forecasted, then lets speculate for a moment, i think most people would agree that other intelligent species have existed and currently exist outside of earth somewhere in the cosmos, if this is true then it would be reasonable to assume they too invented AI, and given the timescales, this AI could have been invented millions of years ago, giving it more than enough time for even just one AI system to achieve this god-like status, so if the 1st criteria has been met, unless theres something we missed or don't understand, then undoubtably the last one has already been met probably billions of years ago, and if simulation theory holds any validity, then this universe may actually simply be a giant stage 10 AI experiment
youtube · AI Governance · 2023-12-10T08:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UgxgFiIYUcZXwtF8lEp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwKMiGGH3Vb57M-9hd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyWDe1aSCRV874V44B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxhwFQadfBszhEiYjN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxsoY4nVl5sSJd5o8R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwkHSt2skyP6odqDmh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxRglWg7z8v37FyI0l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw3WcCl3jKYGVeZSKF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzsbx25SO0-33YCT6N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwL8fM065AJIDLp4J54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"})