Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Comment
I can’t help but think of that alien prequel movie with Michael Fassbender playing a robot. I don’t remember much about it but I remember what the movie made me philosophise about. Like what if ‘god’ or ‘gods’ created us in their image, but we were superior and essentially took over. And then if we create robots in our image but they are superior in many ways to us and whether intentionally or not, they end up taking over from us. What if this is a cycle of life that is unavoidable? Pretty outlandish I know, but just an interesting theory I had.
Platform: youtube · Video: AI Moral Status · Posted: 2022-11-06T01:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyMLClJZr9zziKHsOB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxfeNQ7ZiqlrXiNjOJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyXDZhmUcG3ORGUc_14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgymI4NBwGgCBJWs5F54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw7mv8CDpMRt7CmxtN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyncpjAvuqq57oWImt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugzc0qa4fBHo0x-a_bV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyiEFunAJlCgRezqd14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxfi61DfDQRWlGFpfd4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzHiXcIxl18ntyTF054AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
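Since the raw LLM response is a JSON array of per-comment codings, looking up a comment by ID reduces to parsing the array and indexing it. The sketch below is a minimal, hypothetical illustration (the variable names and the two-row sample are assumptions; only the field structure comes from the example above):

```python
import json

# Hypothetical raw LLM response: a JSON array of per-comment codings,
# with the same fields as in the example above (id, responsibility,
# reasoning, policy, emotion). Two rows shown for brevity.
raw_response = """
[
  {"id": "ytc_UgyMLClJZr9zziKHsOB4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxfeNQ7ZiqlrXiNjOJ4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "indifference"}
]
"""

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's coding by its ID.
coding = codings["ytc_UgyMLClJZr9zziKHsOB4AaABAg"]
print(coding["emotion"])  # fear
```

In practice the response string would come from the stored batch output rather than a literal, but the lookup step is the same.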