Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.

Random samples
- ytc_UgxgCW1h7…: Let's not forget that the companies making these massive AI investments also own…
- ytc_UgzDEveDG…: Wow when ANDREW YANG was telling yall this, yall were focused on “Trump said gra…
- ytc_Ugzbp5dOq…: As a disabled artist: I hate ai “art” and it frustrates me to see people who are…
- ytc_UgyLebQpJ…: When you rely on robots/AI to do your own thinking and possibly reasoning, that …
- ytc_UgyXE2KaN…: Ai trained with human knowledge and made to be human as much as possible thats w…
- ytc_UgwNp0fNr…: These people dont understsnd that robots will be doing most of our work / jobs b…
- ytr_UgyeX3fsE…: The guy doesn't have a high school diploma, doesn't have a degree, and doesn't w…
- ytc_UgxbbdVMU…: I think what some people are forgetting is that art isn’t always about the outco…
Comment
Anyone that has worked with AI (coding etc) knows it can be a great servant, but would be a lousy master..and hallucinations and unpredictability are not improving significantly, but only compounding along with the increasing complexity of the models. Imagine a robot in your home that could hallucinate and take unpredictable actions at any moment. Along with the obvious human distrust, resistance etc, WHY would businesses risk hand over the keys to AI in the time frames mentioned? Of course certain jobs will go and big change is happening, but I think all the tech bros, AI profs and doomsayers are getting way ahead of themselves.
youtube · AI Governance · 2025-09-11T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyHROhXaY6aBa__Pn94AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugywzn5nDALmFQaSzzd4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxiSQTPhXKNs8B5smR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxAwhtiZUZXFRStGtt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw1LOf3P6QPz0h_kmF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwqcKLYlzTlsFZ8mPF4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgxWVgHbbte-rk90V8h4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzSQCibJxPdbyoZXo94AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugwy7xtsdSUR_6aCZwV4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyYr67WZVsRzdAKfSR4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"}
]
```
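To consume a raw batch response like the one above programmatically, one would parse the JSON array and index the records by comment ID so each coding can be looked up directly. A minimal Python sketch, assuming the record shape shown above; the `index_codings` helper and the two-record sample string are illustrative, not part of the tool:

```python
import json

# Illustrative sample in the same shape as the raw LLM response above:
# a JSON array of coding records, one per comment, with the four
# schema dimensions plus the comment ID.
raw_response = '''
[
  {"id": "ytc_Ugwy7xtsdSUR_6aCZwV4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyYr67WZVsRzdAKfSR4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"}
]
'''

# The four coding dimensions from the "Coding Result" table above.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index records by comment ID,
    skipping any record that is missing an expected dimension."""
    records = json.loads(raw)
    return {
        r["id"]: {k: r[k] for k in DIMENSIONS}
        for r in records
        if DIMENSIONS <= r.keys()  # dict views support set comparison
    }

codings = index_codings(raw_response)
print(codings["ytc_Ugwy7xtsdSUR_6aCZwV4AaABAg"]["policy"])  # liability
```

Filtering on `DIMENSIONS <= r.keys()` quietly drops malformed records; in practice one might instead log them, since hallucinated or partial rows are exactly what this inspector exists to catch.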