Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Karen Choi, huh???
Trump said he wants to gain the AI technology first before Ch…
ytc_UgxAZGM1q…
This video features an in-depth conversation between Steven Bartlett and Dr. Rom…
ytc_Ugw0xK6x7…
There might be a bubble, but it’ll be different than the internet one. The basic…
ytc_Ugw0p9W65…
The danger of AI is misalignment, that's all; it's ridiculous to cry…
ytc_UgyHXIj_h…
This is a joke, only protecting the government, how about protecting all the une…
ytc_UgyTkyEzo…
J T This is a theoretical philosophy where you can only make 1 choice and all ch…
ytr_UgzhU9chL…
Naw. Fuck AI. I tell it to eat shit all the time. Why? I tried the nice way. I t…
ytc_Ugzl9htK7…
It’s like the hate and ai is winning for once… no we can’t let this happen…
ytc_Ugw7C9Pf_…
Comment
Isn't it the same as what happened with every technology? When it comes out we do not see its boundaries, and we think it will rule everything. Then time passes and we see the actual capability of that technology and the actual impact it had.
AI might get to that point, but assuming an AGI is possible is a big step that takes a lot of things for granted. However, I love to see these assumptions about the future because they give a lot of room to think about our lives and goals: what if there is something that is just better than you in every way? What is left then? I think it is a very interesting and unpredictable topic. Still, I feel we are ignoring a lot of the boundaries technologies have. Is it possible? Yes. Will it happen? Well, I at least doubt it. Let's see how it evolves over a five-year time frame :)
youtube
2024-11-24T10:3…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyGsAPrWCjFa4ysmmZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyTu7eL7HGQZXLWU_x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw5ecbIy6ZLnRdllB14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzaay4ZQIyX-iZlU5h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyLe751UL631IUVUI54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzQ2DFD7pYN_GGTJKl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxu778Ha5DpBbXY4Bh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyIvrgj95sPKKEHSO14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzfrMKHUPbqSskB0Jp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwflMQd_5ouw4MYcsB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
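A raw batch response like the one above can be parsed into a per-comment lookup and validated against the code book before the results are stored. This is a minimal sketch, assuming the four dimensions shown in the table (`responsibility`, `reasoning`, `policy`, `emotion`); the allowed values listed here are only those observed in the samples above, not the full code book.

```python
import json

# Dimensions from the coding table; value sets are inferred from the
# sample output above and are an assumption, not the full code book.
DIMENSIONS = {
    "responsibility": {"none", "ai_itself", "distributed", "government"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "indifference", "outrage", "approval", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coded_dims}.

    Raises ValueError if an item lacks a dimension or uses a value
    outside the (assumed) code book, so bad codings fail loudly
    instead of being silently stored.
    """
    coded = {}
    for item in json.loads(raw):
        cid = item.get("id")
        for dim, allowed in DIMENSIONS.items():
            if item.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim}={item.get(dim)!r}")
        coded[cid] = {dim: item[dim] for dim in DIMENSIONS}
    return coded
```

Usage: `parse_batch(raw)["ytc_UgyGsAPrWCjFa4ysmmZ4AaABAg"]["emotion"]` returns `"fear"` for the first item in the response above.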