Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgyOG6Yw0…` — "I talked to millions of black people that did not vote and now look what's happe…"
- `ytc_UgyYTf835…` — "If I ask a human to make me a painting of a certain subject in a certain style t…"
- `ytc_UgyxH9OET…` — "I swear, having ai copy a artists work has to be the biggest slap in the face to…"
- `ytc_UgyHTWKm6…` — "Respect to China they are making sure a Robot Revolution won’t happen. They may …"
- `ytc_UgztMfTOt…` — "No. Asking Chatgpt complex questions regularly enough has shown me that AI can't…"
- `ytc_UgyD42hg2…` — "If you think you can be an artist depending in AI, then you clearly never had th…"
- `ytr_UgwOFhXWe…` — "U use ai to think you are smart we actually learn and we are actually smart we a…"
- `ytc_UgzZyK2cH…` — "c'est pas pendant 6 mois qu'il faudrait l'interrompre mais pour toujours. la fi…" (French: "It's not for 6 months that it should be paused, but forever. The…")
Comment

> Find it hard to believe someone like Musk hasn't developed failsafes to ensure control over his creations. The governments are about 15yrs too late trying to introduce regulation. AI tech can already operate independently. It's xxx% more efficient than humans and doesn't suffer the burden of emotional dilemmas. How can the average person compete with that? we will all have too retrain ourselves to be indispensable in the very few niche markets humans are required. We have had no say in any of this. Let that sink in! Good luck

Source: youtube · Category: AI Governance · Posted: 2025-09-10T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyjpH2rJ4Ffc2E1Bc14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwD4MD2pcFZj3tLbB94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgzafQMdu70MbPvqeLl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx3EU-s1eMtcJFNCyZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxL2TTdKH9hvo3uJ8V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyvRBif0nMFIjTpnzx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwCFSHPSeghzHLy_Ix4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxR16pxfO04eP9ekCF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgznBGk_he1O0JWDgDJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxAgZ4t3ffT5KQXGHN4AaABAg","responsibility":"government","reasoning":"mixed","policy":"ban","emotion":"fear"}
]
```
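A raw batch response like the one above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal illustration, not the tool's actual implementation; the four dimension names come from the table and JSON above, but the allowed label sets are an assumption inferred only from the values visible on this page (the real codebook may define more).

```python
import json

# Label sets observed in this page's data -- assumed, not the full codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "government", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"fear", "resignation", "outrage", "approval", "indifference"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID,
    rejecting any value outside the expected label sets."""
    by_id = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
        by_id[row["id"]] = row
    return by_id

# Example: look up one coded comment by its ID.
raw = ('[{"id":"ytc_UgwD4MD2pcFZj3tLbB94AaABAg","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"resignation"}]')
coded = parse_batch(raw)
print(coded["ytc_UgwD4MD2pcFZj3tLbB94AaABAg"]["policy"])  # liability
```

Validating against fixed label sets catches the common failure mode where the model invents an off-codebook label, so bad rows fail loudly instead of silently entering the coded dataset.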