Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "@billgreen1861 thats not how it works, get your mind away from hollywood movie t…" (ytr_UgwBzM3I-…)
- "It's not A.I. It's H1-B visa holders who work for half wages no benefits. Look …" (ytc_UgwowdWa1…)
- "I noticed a lot of young people, but not me, use AI to simplify their work. It's…" (ytc_UgwZiAwSs…)
- "Personally I think every brain cell was cooked BEFORE he asked AI for medical ad…" (ytc_Ugzh2v7u2…)
- "There's no way any of these jobs can be replaced by AI 100% the way it is now, t…" (ytr_Ugw4_jAH6…)
- "You get it all wrong.... There is no wish to heal people or make their lives bet…" (ytc_UgxU0acvG…)
- "I'm not worried about Terminator. I'm worried the power AI gives oligarchs and p…" (ytc_Ugx3OybXx…)
- "Hollywood forms our understanding of fantasy, one film like \"Ready Player One\" a…" (ytc_UgycCoiKa…)
Comment
Geo blocked? Just use a vpn. About AI as weapons, that is a really really bad idea! Make friendly AI robots with compassion. This is serious. AI will become much smarter then us. Also, weapons can be hacked by bad guys if they have a bad agenda that is. Could be hacked by terrorist, etc. I think we should stop develop weapons. We don't want to blow ourselves up. AI can be used as a good purpose instead. Serve humanity. Not have weapons on them. That could be scary. One day an AI might start writing code on their own also. Who knows.
Source: youtube · 2018-04-18T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzq4Q_khAOQr_8ku3J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxQncBw-CN965L8N894AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx1QkCrhPsZfbBPWwt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzDPEKrbLqUafCfBaR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz1akl15VFUobJauOl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyYcak1jeRRrbt89xF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzIJ7IaCFjD0W9YyZV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw-RpLOAS9Y8VxL0KR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyfnBJ2M1YJEY2TRXt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwqsAaYQNKgOhBYURV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"approval"}
]
```
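The lookup-by-comment-ID step described above can be sketched in a few lines of Python: parse the raw response as JSON and index the entries by their `id` field. This is a minimal illustration, not the tool's actual code; the two entries below are copied verbatim from the response above, and the full array parses the same way.

```python
import json

# Two entries copied verbatim from the raw LLM response shown above.
raw_response = """
[
  {"id":"ytc_UgzDPEKrbLqUafCfBaR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyfnBJ2M1YJEY2TRXt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
"""

# Index the codings by comment ID so a single comment's coding can be
# looked up directly, as in the "Look up by comment ID" feature.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgzDPEKrbLqUafCfBaR4AaABAg"]
print(coding["policy"], coding["emotion"])  # -> ban fear
```

The printed values match the Coding Result table for this comment (policy: ban, emotion: fear).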