Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgzaVA6_H… · Robot to Maker: "Your hat makes me want to activate my self destruct sequence" 😆…
- rdc_n4egkbd · Are they taking into account the cost of lawsuits and having to comply with what…
- ytc_Ugy9UJgll… · Movie but we quite literally can make this in real life we actually have minus t…
- ytr_UgxYzzo7Q… · @SytanOfficialthat’s not the issue people have with training AI off of artists’ …
- ytc_Ugxu1QQ29… · I hope this features my favorite autonomous weapon of all time: the landmine Ed…
- ytc_UgzjVGRIS… · therapists and artists are kind of safe. art is always going to be there no matt…
- ytc_UgyvJ-7lr… · Am doing product packaging design as i listen, with a number of redesigning fai…
- ytc_UgyKl2-SY… · I know I’m late to this, like a month late for this video, but as for the refere…
Comment
I think any even near human intelligent AI has to be open source, or at least make their code public and live as it updates. Not only so we can make it as non-idealogical as possible, but also because there’s no wayyyy we should trust something that will affect all of humanity to a few suspicious corporations and governments
youtube · AI Governance · 2022-06-23T15:0… · ♥ 38
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxw8cxjxhx6DRboGfd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw1gUInOOj21EM_BQd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx43Lyn4y4N_yGqmrR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx1UbHH2rA72niwJcB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwH66oJMffUMgdEykF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwtCczcJTf_3doxjE54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxZvnf9eLj7lOd8jwx4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxAAN__ZIjE5BfoXyp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugw39q2HwqWRSlXOTZ54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz00t8hYj8-LDZEq_t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
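Since the model returns one JSON array per batch, each row keyed by comment ID with the four coding dimensions, a downstream step can parse and sanity-check it before storing. The sketch below is a minimal, hypothetical validator: the `SCHEMA` value sets are inferred only from the labels visible in the response above (the real codebook may define more categories), and `validate_batch` is an assumed helper name, not part of any actual pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the sample response above.
# Hypothetical: the real codebook may contain additional categories.
SCHEMA = {
    "responsibility": {"company", "government", "developer",
                       "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological",
                  "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation",
                "indifference", "mixed", "approval"},
}

# Comment-ID prefixes seen in the samples (YouTube comments/replies, Reddit).
ID_PREFIXES = ("ytc_", "ytr_", "rdc_")

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response; keep only rows with a known ID prefix
    and an allowed value in every coding dimension."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not str(row.get("id", "")).startswith(ID_PREFIXES):
            continue  # unrecognised comment-ID prefix
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"contractualist","policy":"regulate","emotion":"fear"}]')
print(len(validate_batch(raw)))  # → 1
```

Rows that fail validation are dropped here for brevity; in practice they would typically be queued for re-coding, since a malformed row means the model's output drifted from the requested schema.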