Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> The thing about ai is ai doesn’t get a strange feeling unlike humans
> This sixth sense we get is probably way more safer than an ai who only sees data
> And tbh I’d rather humanity get wiped out by humans than by ai
> It just seems way more natural to die like that
> If one person messed up and we all die that’s honestly way more comfortable than an empty non living collection of data determining our fate
Source: youtube · Topic: AI Governance · Posted: 2023-07-07T19:0… · ♥ 14
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgxaKlX6vKwzW1AYkbJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy64p3829WCbPu6RGx4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgyXjgvm0pQ6HBtls6N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz-fntdkApdFUzj4cZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzFwnITx6hAr8NBdKB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwvigxNvI-EDQKMAbZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyGWsM6yCUvbRPn5VV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgzSDnKPGDw_p17pOE94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgztWTvn69-fCAsUwQ94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzAZGiLH2JamuufDVJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
```