Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Copyright forbids you to redistribute the literal content without license. That …" (ytc_UgwCQ_j72…)
- "oh you poor thing. See, the average person *doesnt need a personal chef in their …" (ytr_UgyVBYVJQ…)
- "Will the advancement of AI make some people lose their jobs? Absolutely. Does th…" (ytc_UgwEaj_AM…)
- "Think about it this way: We built these things to emulate humans. A very basic h…" (ytc_UgyGcDUkj…)
- "The video phone comment is just not accurate, I had video capability way before …" (ytc_UgwBDBlYK…)
- "That was like one of the 1st rules for AI safety was not hooking the AI up to th…" (ytr_UgyfvrRWe…)
- "Me knowing one day there will be all the smart vehicles self-driving cars but th…" (ytc_UgyL5uVwy…)
- "Exactly… especially if you work in media. Doesn't matter if it's television or r…" (ytr_Ugxvb38kK…)
Comment
The thing is with humans is that we find satisfaction difficult to hold on to, and sooner or later want to change up that thing which we were previously satisfied with. for example, we get a new car or girlfriend etc etc. we are excited and fascinated for a while, but sooner or later start to see faults in it as opposed to appreciating it. then we tend to want a change somehow, an upgrade . im sure the AI companies are currently finding all of this very exciting at the moment. BUT even in the best case scenario, which is unlikely, it is going to cause tremendous problems, even for the super rich, as they will be expendable too! it seems like stopping or getting rid of AI will be impossible. A bit like a psychotic girlfriend which will not go away x1000
Platform: youtube
Category: Cross-Cultural
Posted: 2026-03-29T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyfangPtyYac6VUM9d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw-9Ja1xx-se_RjHdt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyydkX0SkBjiy6yZOR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz518_SLFWX8tCObsV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyIiaudTaOGVkdBtLx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwJlTdvjBaFwypxKUF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyl3ixaFlCAR18pGQJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxGzs5EqvCKKMbLFF54AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyP7cyRxd1abS6_0SB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw9pDJCO6HTcVqsTZB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
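A batch response in this shape can be validated before the labels are written back to the coding table. The sketch below is a minimal example, not the project's actual pipeline code; the allowed label sets are inferred only from the values visible in this dump and the real codebook may define more categories.

```python
import json

# Allowed label sets per coding dimension. These are inferred from the
# values seen in this dump (assumption: the codebook may include more).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "user", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "none", "regulate", "ban"},
    "emotion": {"indifference", "approval", "fear", "resignation", "outrage"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes}.

    Raises ValueError on a malformed comment id or an unknown label,
    so bad model output fails loudly instead of polluting the table.
    """
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id", "")
        # ids in this dump start with ytc_ (comments) or ytr_ (replies)
        if not cid.startswith(("ytc_", "ytr_")):
            raise ValueError(f"bad comment id: {cid!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

With the JSON above as input, `validate_batch` returns a dict keyed by comment id, and any entry whose dimension value falls outside the known sets raises immediately.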