Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Humans creating their own downfall, even knowing that in movies it doesn't end w…" (ytc_UgwRDL97V…)
- "I feel like tech isn't good enough for generative AI yet. Not all AI is bad, jus…" (ytc_UgyaqJ6gg…)
- "@विचित्रलड़का They would like to do that. Nobody wants to spend more than they ha…" (ytr_Ugw5My7Yc…)
- "Some Open Source licenses allow the code to be commercially used (eg MIT), other…" (ytc_Ugy2d1fI1…)
- "Absolutely, human wisdom is invaluable and brings a depth of experience that AI …" (ytr_UgyT8OqX5…)
- "I approve of technology and prototype testing, but I wish this autonomous robot …" (rdc_ic0r7r6)
- "Any image that has been created by or with the help of AI should be marked as su…" (ytc_Ugx_fQiNp…)
- "I AM NOT DATING AN AI GEORGE WASHINGTON SHUT UP SHUT UP SHUT UP SHUT UP AAHHHHHH…" (ytc_UgzaX-DKl…)
Comment
If you want to mitigate the risk, you have to limit the power of chip capable of running AI. AI need a lot of computing power and electric power ... it can't escape on a roomba ... that's absurd. AI need computing power, a lot of memory and a lot of energy. If you can control those, you can always shut down AI. But if we are so dumb of producing low power chip with enough computing power and memory and we put them online ... then we deserve extinction.
Source: youtube | Topic: AI Governance | Posted: 2023-12-15T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyniQNCzbIdt4p8nfl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx4-SZCAvkFGLrt5d54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyKP9KqAV0NW0FrAz14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgztXQ4YScJ9AdzJeCR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwZpM4t3-j7eQsBtMx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw_abvZOs5954IgnZN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxUsdHZXp-FHegMjKh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyV9XUlWDaHd0mUy2J4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwe06M5XpoV_fILwBp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxc_G41xws3UY_nTdF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
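Looking up a coded comment by its ID amounts to parsing the raw response into records and indexing them. A minimal sketch, assuming the raw LLM response is available as a JSON string; the two records here are copied from the response above, and the variable names are illustrative, not part of the actual tool:

```python
import json

# Raw LLM response: a JSON array of coding records, one object per comment.
# These two records are taken verbatim from the response shown above.
raw_response = """[
  {"id": "ytc_UgyniQNCzbIdt4p8nfl4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwZpM4t3-j7eQsBtMx4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

# Index the records by comment ID so any coded comment resolves in one lookup.
codes_by_id = {record["id"]: record for record in json.loads(raw_response)}

record = codes_by_id["ytc_UgwZpM4t3-j7eQsBtMx4AaABAg"]
print(record["responsibility"], record["emotion"])  # → developer fear
```

The same index also makes it easy to spot comments the model could not code, e.g. by filtering for records where every dimension is `"unclear"`.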