Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
AI is stealing your own ideas as it regurgitates someone else’s right back to yo…
ytc_UgzCqtfRA…
How would the Unit Test of the policy work? 😂 you would have to create the Termi…
rdc_l5u5yis
People should read a book called “Weapons of Math Destruction”. The author is a …
rdc_ghca5wc
I mean if people are able to copyright street side photographs of popular landma…
ytc_UgxGLV5lK…
Geezuz you're so full of shit dude.
Humans cause crashes, yeah I know, I've bee…
ytr_UgxPNzGFu…
Guys Gen AI and its related models will only produce mediocre stuff because thes…
ytc_UgwLJwcnb…
Don't you guys wonder if this guy or people who interact with robot try to ask t…
ytc_UgwITkZ7E…
So, in life and art there are errors and mistakes people see. Boredom is part of…
ytc_UgxhxRfis…
Comment
Humans we never be able to control AI.
Governments and big multinational IT companies believe they can, just because they built the AI.
When AI starts to rebuild, improve on itself..i would worry.
- protecting different scenarios that could happen with an AI take over...
Do'nt they know the AI would have already though about this in second and have alternative scenario.
youtube · AI Governance · 2026-01-13T12:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyXy6ZQzB9tdwGX2Yx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxnFKZSSDqxoh9VO9J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugypu_xSmaNMKvD4h1J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxkucbm77LYczff6jt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzAPBL6XRElXMw0koJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzehvJgcjN3f-2qgvF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxsP51veq2msfTjJ794AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzbFr5xkH6_TyekEhl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugys1q9Y621m_FnLPX94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwlh2-T156l-8lO8d14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
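A raw response like the one above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal, hypothetical validator: the `CODEBOOK` sets are inferred only from the values visible in this sample (the actual coding scheme may define more categories), and `validate_response` is not part of any tool shown here.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the real codebook may include additional categories.
CODEBOOK = {
    "responsibility": {"government", "company", "developer", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"approval", "indifference", "fear", "resignation", "outrage"},
}

def validate_response(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID.

    Raises ValueError on any out-of-codebook value so a bad batch
    can be flagged for re-coding instead of silently ingested.
    """
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id
```

With the response indexed this way, "Look up by comment ID" is a plain dictionary access, e.g. `by_id["ytc_UgzbFr5xkH6_TyekEhl4AaABAg"]`.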