Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a coding by its comment ID.
Random samples

- "AI is already eating up jobs. From ghost writers/content writers to customer ser…" (`ytc_UgzAB9lZJ…`)
- "Right. I am just learning this. I only used ChatGPT to help me with some scrip…" (`ytc_UgzNhgx68…`)
- "This is the best tl;dr I could make, [original](https://apnews.com/ef25debd7ab74…" (`rdc_er9kwzf`)
- "Thank you for sharing your thoughts! The intersection of consciousness and techn…" (`ytr_UgzSwDXwB…`)
- "Killing others requires an emotional motive. Computers don't lust for control be…" (`ytc_Ugz_ybguf…`)
- "Not surprising. While the human brain is an amazing thing a supercomputer can ru…" (`rdc_gd7s8g9`)
- "What a bunch of sensationalist garbage. Full Self Driving still is statistically…" (`ytc_UgwtA6aHd…`)
- "LLMs hallucinates 60% of the time. lol yall making this such a big deal 😂…" (`ytc_Ugx_s4muJ…`)
Comment

> @richardmcbroom102 would you really trust your consciousness to be uploaded to the cloud?
> what happens when an AI goes rouge on you?
> what if a rouge AI was uploaded into a human?

youtube · AI Governance · 2025-07-04T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_Ugyjbppvf8pSpnRI1VR4AaABAg.AK8Zj5NvcmiAK8aCuWLrbr","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytr_UgxZySuRoAkw3fo1wk54AaABAg.AK7pnNhnqg8AK7uaE-1OQ2","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_UgwY5Dn5TROwmPlpPVF4AaABAg.AK7JHT-s1uGAK8SIJ1kXv9","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytr_UgxnyTbpP2Z6NTcmCpJ4AaABAg.AK4XaqvhtWWAK4bWFFDxC2","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgxnyTbpP2Z6NTcmCpJ4AaABAg.AK4XaqvhtWWAK5d0vJuBIk","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugx_TH5akSV7o4a19Rx4AaABAg.AK43LIHf3SWAK47JhlRSAb","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_UgxxQqWMYlWyRYMYgn14AaABAg.AK3OdIDC0g8AK49PtdAao1","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytr_UgysWvjzmRCvu7LkNAh4AaABAg.AK135c_xvozAK1bXxI25zM","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugw0hRSAVfEcDM9VbRd4AaABAg.AK0bJwCMsbdAK4C3WGz6Xr","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_UgzyqXBQ-NbLTiDvCdR4AaABAg.AK0HPPuE3T9AK0UzFrw5j6","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
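A raw response like the one above is a plain JSON array with one object per coded comment, so looking a coding up by comment ID (as this inspector does) is a short parse-and-index step. A minimal sketch in Python: the field names match the JSON shown, but the value sets used for the sanity check are only those observed in this sample and are likely an incomplete picture of the full codebook.

```python
import json

# Raw LLM response: a JSON array with one object per coded comment.
# (Two rows copied from the sample response above.)
raw = """[
  {"id": "ytr_UgwY5Dn5TROwmPlpPVF4AaABAg.AK7JHT-s1uGAK8SIJ1kXv9",
   "responsibility": "ai_itself", "reasoning": "deontological",
   "policy": "liability", "emotion": "fear"},
  {"id": "ytr_Ugx_TH5akSV7o4a19Rx4AaABAg.AK43LIHf3SWAK47JhlRSAb",
   "responsibility": "none", "reasoning": "virtue",
   "policy": "none", "emotion": "approval"}
]"""

# Value sets observed in this sample only; the real codebook may define more.
OBSERVED = {
    "responsibility": {"ai_itself", "company", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"liability", "regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "indifference", "mixed", "approval", "resignation"},
}

# Index rows by comment ID, flagging any value outside the observed sets.
codes = {}
for row in json.loads(raw):
    for dim, allowed in OBSERVED.items():
        if row[dim] not in allowed:
            print(f"unexpected {dim}={row[dim]!r} in {row['id']}")
    codes[row["id"]] = row

# Look up one comment's coding by its ID.
row = codes["ytr_UgwY5Dn5TROwmPlpPVF4AaABAg.AK7JHT-s1uGAK8SIJ1kXv9"]
print(row["policy"], row["emotion"])  # liability fear
```

Indexing by `id` rather than scanning the list keeps lookups O(1), which matters once a batch response contains hundreds of coded comments.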