Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "You rely on a pencil to draw. Boom, we don't have opinions on art because we bot…" (ytr_UgxphrfGJ…)
- "Can I please check out of this timeline. I do not want AI and I do not want huma…" (ytc_UgxcX32z3…)
- "The only thing is work activity and productivity are 2 completely different thin…" (ytr_Ugyr3rHQ2…)
- "Ehh this is scripted you can tell, there just programmed to move there mouth to …" (ytc_UgwtxjW8r…)
- "My theory is the beast system. Millions of AI controled robots drones, sensors. …" (ytc_UgyVdSAGJ…)
- "The school board approves the curriculum. If you want AI to be the teacher, you …" (ytc_Ugz5EDfXH…)
- "The guys who give Human Mind to AI should understand that we dont have a clone a…" (ytc_Ugz_4mMnE…)
- "Americans today are the STUPIDEST they have ever been! If a MACHINE malfunction…" (ytc_Ugy2Iragm…)
Comment
He's severely exaggerating, he's talking about AGI and there is no way that 5 years from now we will have AGI. Especially given where specialist AI is at right now.
They're sinking Billions into AI, every single AI company is running at a loss, so even if they can manage to build a generally smart AI (because most of them right now are not smart, try getting one to reliably add items to a list you've created, they will fail), they won't make a profit for years upon years.
Aside from that, he's being an alarmist, which is fine, because he's in this space and he's supposed to look for problems.
It won't move that fast, especially not within 5 years. Unless they can get general purpose quantum computers in that time.
In which case, we're fucked anyway.
youtube · AI Governance · 2025-09-04T15:0… · ♥ 24
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
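
For orientation, here is a minimal sketch of the record these four dimensions describe, assuming a Python pipeline; `CodedComment` is a hypothetical name, and the example values in the comments are only the codes visible in this sample, not necessarily the full codebook. The "Coded at" timestamp is not part of the model output below and is presumably added when the result is stored.

```python
from typing import TypedDict

class CodedComment(TypedDict):
    """One coded comment, matching the objects in the raw LLM response below."""
    id: str              # comment ID, e.g. "ytr_..." or "ytc_..."
    responsibility: str  # e.g. "company", "distributed", "ai_itself", "none"
    reasoning: str       # e.g. "consequentialist", "deontological", "unclear"
    policy: str          # e.g. "regulate", "liability", "industry_self", "none"
    emotion: str         # e.g. "fear", "outrage", "resignation", "approval"
```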
Raw LLM Response
```json
[
  {"id":"ytr_UgwKoc8oSipcWEkJT2N4AaABAg.AMePnfpHAcZAMeRquWE1g8","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwKoc8oSipcWEkJT2N4AaABAg.AMePnfpHAcZAMeRqzEvBdm","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugx648dZmTQY8maUild4AaABAg.AMePVTb1q_aAMe_J1E40ay","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgyYey8cYBPEn5C0UAp4AaABAg.AMePJXhI-zCAMeYqnnzeno","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyYey8cYBPEn5C0UAp4AaABAg.AMePJXhI-zCAMec-C50JXP","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugzh-LBospwSKIpgOnl4AaABAg.AMeP5r3eHs8ANVelSthQoI","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_Ugx_uPh_ZdGt_nNLZw54AaABAg.AMeOCxmmV94AMexQostayR","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgwhelrLUprqtP3Gm7V4AaABAg.AMeJfKq75xBAMeNihB2l6I","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzDC2vTb7NbAV-7qI54AaABAg.AMeJ0dcOYhrAMeeqtaNmws","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugxp-6NxprW7K8A505l4AaABAg.AMeGvsK6-pgAMeKamyjaJ1","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"}
]
```
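
As a minimal sketch of the comment-ID lookup described at the top of this page, assuming Python: `index_by_id` and `raw_llm_response` are hypothetical names, and only one entry from the response above is inlined so the snippet runs on its own.

```python
import json

# A raw LLM response is a JSON array of coding objects; one entry from the
# response above (the coding shown in the table) is inlined here.
raw_llm_response = """[
  {"id":"ytr_Ugxp-6NxprW7K8A505l4AaABAg.AMeGvsK6-pgAMeKamyjaJ1",
   "responsibility":"company","reasoning":"consequentialist",
   "policy":"industry_self","emotion":"resignation"}
]"""

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse the raw response and key each coding object by its comment ID."""
    return {item["id"]: item for item in json.loads(raw)}

codings = index_by_id(raw_llm_response)
record = codings["ytr_Ugxp-6NxprW7K8A505l4AaABAg.AMeGvsK6-pgAMeKamyjaJ1"]
print(record["responsibility"], record["policy"], record["emotion"])
# -> company industry_self resignation
```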