Raw LLM Responses
Inspect the exact model output for any coded comment; look up by comment ID.
Random samples

- ytr_Ugyc3nEOY… : "the human brain is also an algorithm that senses the world as an input and outpu…"
- ytc_UgxPQBOn2… : "So.... God created man in his image, man creates robots in his image. The robots…"
- ytr_UgxvTW0iz… : "AI isn’t even the actual issue, it’s the larger problem that productivity is bei…"
- ytc_UgxLh8CLO… : "None of these are AI safe … some can be taken over, and a lot of them without pe…"
- ytc_UgzbQYC72… : "I have talked so much to ChatGPT and omg it’s so scary and we will watch a full …"
- ytr_UgwVpy8NM… : "The most Ai expression ever / The very common Ai artstyle / It was a bit ( a bit ) u…"
- ytc_UgzLcBFLr… : "To be absolutely honest, i think we already are too late. / Let's take a senario …"
- ytr_UgzeBclYt… : "How is it closed without AI? No one's stopping anyone from making art, and it's …"
Comment
Are you referring to the "AI 2027" scenario? You realize that was by different people?
And it was by no means a confident prediction; just a best guess intended to spur people into considering things at a good level of detail and maybe coming up with their own scenarios.
youtube · AI Governance · 2025-12-04T11:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_Ugw77CnFaVOSFMNrW2B4AaABAg.AQJ0usMntuFAQJ82nBrB_x","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugwmt92IiPpvgXPl0Ud4AaABAg.AQJ0b_W4DiwAQJ79zvG-9a","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgzsnA2z3rFjeGTtJ-J4AaABAg.AQJ0Hdz5oZHAQJ9GVwFbud","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugyktl1Bl_8gfyKsZ494AaABAg.AQJ-vfJsWKEAQJ9CgrlTb9","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugwh0S6BcudS3hN3OlB4AaABAg.AQJ-BzHB4LWAQJ6z9UmcQI","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytr_Ugy-2rMz5skMOq2ZxNd4AaABAg.AQIzl6PNh8PAQJDUZfQoAS","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugy-2rMz5skMOq2ZxNd4AaABAg.AQIzl6PNh8PAQJDnEFb2CU","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugy-2rMz5skMOq2ZxNd4AaABAg.AQIzl6PNh8PAQJVVigzd_D","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugx54ES73pDjvV5uVvd4AaABAg.AQIzXiCY-ddAQJAFY9uxYa","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgzsogzyxPxJK7pt0sJ4AaABAg.AQIyqsVU6cTAQJ9g3ZqtQH","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
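Each raw LLM response is a JSON array of per-comment codes across the four dimensions shown in the coding-result table (responsibility, reasoning, policy, emotion). A minimal sketch of parsing such a response and indexing it by comment ID, with a loud failure on entries missing a dimension; the single-entry `raw` string below is trimmed from the real response above, and `parse_codes` is an illustrative helper, not part of the tool:

```python
import json

# One entry copied from the raw response above, kept small for illustration.
raw = """[
  {"id": "ytr_Ugw77CnFaVOSFMNrW2B4AaABAg.AQJ0usMntuFAQJ82nBrB_x",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "approval"}
]"""

# The four coding dimensions used in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(text):
    """Parse a model response and index codes by comment ID.

    Raises ValueError if an entry lacks a dimension, so malformed
    responses fail loudly instead of being silently dropped.
    """
    coded = {}
    for entry in json.loads(text):
        missing = [d for d in DIMENSIONS if d not in entry]
        if missing:
            raise ValueError(f"entry {entry.get('id')} missing {missing}")
        coded[entry["id"]] = {d: entry[d] for d in DIMENSIONS}
    return coded

codes = parse_codes(raw)
print(codes["ytr_Ugw77CnFaVOSFMNrW2B4AaABAg.AQJ0usMntuFAQJ82nBrB_x"]["emotion"])
# → approval
```

Indexing by ID is what makes the "look up by comment ID" view possible: a truncated ID shown in the sample list can be matched by prefix against the full IDs in the parsed dictionary.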