Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "hihi so my dad is on the same page as me (kinda? im anti AI) he loves using the …" (ytc_UgyTvwZHR…)
- "The first AI movie was released in 1927 ... nearly 100 years ago ... and several…" (ytc_UgxpbwnKe…)
- "So, in the best occam's razor delivery, the reason for AI becoming ..rasistic.. …" (ytc_UgxORi3nh…)
- "I see the point, however there isn't much difference when comparing it with conc…" (ytc_UgxAd7lq_…)
- "Interesting discussion with economist Anton Korinek on AI automation, potential …" (ytc_Ugyx8b7si…)
- "I have a friend whos been painting since she was 5. Her art did not just start o…" (ytc_UgyvEuFq9…)
- "How about they do a Reality TV Show with all Robot Cast 😂 let's see how this goe…" (ytc_Ugz2lf72w…)
- "I Think the correct option is A. An AI Robot with with Citizenship is doesn't ex…" (ytc_UgyeRrF85…)
Comment
> Nobody ever looks at this in the context of eternal time. Whatever great advances arrive in the next 10 years or 100 years, there are billions or even trillions or trillions of trillions of years ahead.
> What looks like the next great advance will get superseded, repeatedly. It's an ongoing process of supersession.
> What if AI itself gets superseded?
> It probably will, and sooner rather than later.

youtube · AI Governance · 2025-09-04T09:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
{"id":"ytc_UgwErjXqr7IV3balt3J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxMh1uKxQjxumFQZqd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzrrc6o_aV_mRU5L4V4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwYCuJ236aQhX_tVc14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxAfAcbg5tfAvXj1Z14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzw8PY18ESht5hNr2Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgwclleEd3yahbBFhAR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxrOue4_WYejy_iyjV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyTCvvuqSpvu9BSrF94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyQ6ozb-c1ZcakVVQd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
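Because the raw response comes straight from the model, it is worth validating each row before the codes are stored. A Python sketch; the allowed value sets below are inferred only from the codes visible on this page and are likely a subset of the project's full codebook:

```python
import json

# Allowed values per dimension, inferred from the codes seen on this page.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "distributed"},
    "reasoning": {"unclear", "consequentialist", "virtue", "deontological"},
    "policy": {"none", "unclear", "ban", "industry_self", "regulate"},
    "emotion": {"indifference", "fear", "approval", "resignation", "outrage"},
}

def validate_rows(raw_response: str) -> list[str]:
    """Return a list of problems found in a raw LLM coding response."""
    problems = []
    for i, row in enumerate(json.loads(raw_response)):
        if "id" not in row:
            problems.append(f"row {i}: missing id")
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                problems.append(f"row {i}: bad {dim}={value!r}")
    return problems
```

An empty return value means every row carries an ID and in-vocabulary codes; anything else flags the row and dimension to re-code.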