Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "AI art isn't boring, it's just that there are a lot of people who aren't artists…" (ytc_UgxhtAiPs…)
- "there are no driver seat in driverless car and the human inside this car is just…" (ytc_UgzTmrEl8…)
- "I'm sorry but your whole premise about robots not feeling pain without it being …" (ytc_UgzbqgiIj…)
- "It's very, very sad that these people have SO much time on there hands that they…" (ytc_UggiczC3q…)
- "There are no jobs, no sources of income, and no customers. So, who will buy or r…" (ytc_Ugy0AMjK4…)
- "@rosaevee274 I never claimed AI was amazing I actually dont even like it that mu…" (ytr_UgxZ6fDiN…)
- "Late stage capitalism is when society forgets that the purpose of the economy is…" (ytc_UgyBQIEJV…)
- "I've struggled to articulate my feelings about AI for a while, but I fully agree…" (ytc_Ugydv_xPC…)
Comment
@JoeSmith-jd5zg I’m not conflating them: I’m looking at the reality of the 'generation' we are actually living in. Sure, tech moves in S-curves, but you can’t just assume the next curve is going to magically appear the moment this one flattens out. Especially when the current one is hitting massive walls in power, data, and diminishing ROI. Not to mention the general international financial tensions.
You make theoretical assumptions to downplay realistic predictions made by AI experts... Calling it 'exponential' is easy on paper, but in the real world, you eventually run out of physics and money. If the jump from LLMs to the 'next thing' takes another 20 years or $500 billion more than anyone has, then the 'exponential' dream is effectively dead for our generation. Betting on a theoretical future tech to save a current bubble isn't thinking exponentially: it’s just wishful thinking and very unlikely to actually work out.
| Field | Value |
|---|---|
| Platform | youtube |
| Category | AI Governance |
| Posted | 2026-03-17T04:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgxHDZ5KQaR6o2tZdYl4AaABAg.AURbh46GfxbAURi9eHMjMr","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_UgzDOP7WfNhhgSuMDpZ4AaABAg.AURb163ULEUAURdaGEP7QZ","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgysnCfw9iTYf629_hB4AaABAg.AURaRdRfPDLAURewnuLBaA","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgysnCfw9iTYf629_hB4AaABAg.AURaRdRfPDLAUSCTkVGcQp","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyLXcKTii2nSRci5Qd4AaABAg.AUR_cR03P2LAURoLj8tSOy","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgxK66O7aK-fwGrMUsJ4AaABAg.AURZq9WvuUjAURfhqHV2IA","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgxK66O7aK-fwGrMUsJ4AaABAg.AURZq9WvuUjAURgKVrXwpX","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxK66O7aK-fwGrMUsJ4AaABAg.AURZq9WvuUjAURlU5krcsW","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgxK66O7aK-fwGrMUsJ4AaABAg.AURZq9WvuUjAUSZrIoNtoy","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzDR-pPWeTvmW2u4Ix4AaABAg.AURYieobN3QAUtiu_opD4P","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
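A minimal sketch of how such a raw response could be parsed back into per-comment codes. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response above; the allowed-value sets are only those observed on this page, not the full codebook, and the function name `parse_raw_response` is hypothetical.

```python
import json

# Values observed in the raw responses on this page; the actual codebook
# may define more (assumption).
OBSERVED_VALUES = {
    "responsibility": {"company", "ai_itself", "government", "none", "unclear"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "unclear"},
    "emotion": {"fear", "indifference", "resignation", "mixed"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    mapping from comment ID to its coded dimensions, warning on any value
    outside the observed set."""
    coded = {}
    for item in json.loads(raw):
        cid = item["id"]
        dims = {k: v for k, v in item.items() if k != "id"}
        for dim, val in dims.items():
            if val not in OBSERVED_VALUES.get(dim, set()):
                print(f"warning: {cid}: unexpected {dim}={val!r}")
        coded[cid] = dims
    return coded

# Usage with a made-up ID for illustration:
raw = ('[{"id":"ytr_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
result = parse_raw_response(raw)
print(result["ytr_example"]["emotion"])  # prints: fear
```

Keeping the validation as a warning rather than an exception lets a batch of codings load even when the model emits an off-codebook value, which can then be re-queued for recoding.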