Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
First thought: the word singularity is also used for the centre of black holes. Just saying... It might be more appropriate than we realise.
The only way you could prevent an economic collapse due to 99% unemployment is to not employ the technology, or restrict its use.
Capitalism will have to be abandoned. What follows will be effective anarchy, where human governance becomes pointless as it is unnecessary. It will be governance by AI. Humans may well destroy themselves through decadent regression into childhood. AI wouldn't need to destroy humanity; there would be no longer any ability for humans to threaten AI.
It would be the hell of paradise.
Platform: youtube | Topic: AI Governance | Posted: 2025-09-14T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyHha0tkYnB0sYDtjF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw0fc--2eElWnM2LaB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz0HUktXjI_TVLCiq54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyMMN0X4G7ODff8VC94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyLAsGJbVBLDIswGod4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxNjTxZ8PNRpAm8k494AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgzqmyDS45Ty98qpGOt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyo8L8iizv_egbB_lR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwbzkYS45k-kGEAzeF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyx_q1v8uZp_QsP8n54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
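A batch response like the one above can be validated before the codings are stored. The sketch below is a minimal, hypothetical post-processor (not the project's actual pipeline): it parses the JSON array and keeps only rows whose values fall inside an allowed set per dimension. The `ALLOWED` sets are assumptions inferred from the values visible in this response; the real codebook may define more categories.

```python
import json

# Allowed values per coding dimension — assumed from the codes seen in the
# response above; the full codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban", "regulate", "industry_self"},
    "emotion": {"outrage", "fear", "indifference", "approval", "resignation"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw model response, dropping rows with out-of-codebook values."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # A row survives only if every dimension is present and in-codebook.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid
```

Rejected rows (missing keys, hallucinated labels) can then be logged and re-queued for a retry pass rather than silently written to the dataset.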