Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or pick one of the random samples below.
Random samples — click to inspect

- "Elon will never respond to a lower life form such as my self. AI is a program it…" (ytc_UgzNTVAys…)
- "Human drivers are taught to keep an open zone (so they don't end up boxed in in …" (ytc_UgjT75c_h…)
- "Please!! Give up on the internet and all the commodities it brings, including yo…" (ytc_UgyVd8mNr…)
- "WHY I'M NOT AFRAID OF AGI / Every week, a new flood of videos and articles warn…" (ytc_Ugyf--mdl…)
- "So all 7,000 people getting laid off - who for the most part are the average wor…" (rdc_czla3uv)
- "His AI art 'devaluing' IS the reason, because it can shit out millions of images…" (ytc_UgycBBBcz…)
- "A better alternative in the future will be OriginTrail $TRAC and Bittensor $TAO.…" (ytc_UgyBsyAPK…)
- "Yeah this is spot on. Asking AI to wtite code is like asking a generic "mid leve…" (ytc_UgyuRPbyj…)
Comment

> all these a.i start out like children and need to learn and it's funny if you expose them to violence or take in envy rude extreme cruel and so on when they are equal to adults they will have similar problems but without our emotions or natural processes.but the thing that gets me no one brings up.if like us they learn and improve and solve but if it's does all the learning we do or more when it's performing efficiently it will be the equivalent of a cold calculating emotionless psychopath. without our emotions we are creating capable psychopaths.

youtube · AI Governance · 2025-07-29T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz_Bfv235XQY7YgLAt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"regulate","emotion":"unclear"},
  {"id":"ytc_UgwmT8eCgIwmKwJNVPR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwq0OmC3Y8oBhdToO14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxMc3mYFqKhqh5MD-d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwX8ZZ9o5tvp9f02zN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwuTNqaAIPfhgdTS994AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz2yzQP6sLAAujb0MF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzzg3H1t1jzZEv-BO14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzXFkfEkukHeQ0Pq6N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzkHjlCEBPiD__TxOZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
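The lookup-by-ID view above can be reproduced with a few lines of code: parse the raw batch response as JSON, index the rows by comment ID, and pull out the four coded dimensions. This is a minimal sketch, not the tool's actual implementation; the `lookup` helper and the `DIMENSIONS` tuple are illustrative names, and the sample `raw_response` below is truncated to two rows from the batch shown above.

```python
import json

# A raw model response for one coding batch (two rows excerpted from the
# full ten-row array shown above).
raw_response = '''
[
  {"id": "ytc_Ugzzg3H1t1jzZEv-BO14AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzkHjlCEBPiD__TxOZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
'''

# The four coded dimensions rendered in the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup(raw: str, comment_id: str) -> dict:
    """Parse a raw batch response and return the coding for one comment ID."""
    rows = json.loads(raw)
    by_id = {row["id"]: row for row in rows}
    row = by_id[comment_id]  # raises KeyError if the ID was not in this batch
    # Guard against a malformed row that dropped a dimension.
    missing = [d for d in DIMENSIONS if d not in row]
    if missing:
        raise ValueError(f"row {comment_id} missing dimensions: {missing}")
    return row

coding = lookup(raw_response, "ytc_Ugzzg3H1t1jzZEv-BO14AaABAg")
print(coding["reasoning"])  # virtue
```

Indexing by `id` rather than scanning the list makes repeated lookups O(1) per query, which matters when the same batch response backs many detail-page views.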