Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:
- "something I have to say is that we aren't inherntly creative ai and humans get c…" (`ytc_UgwzH-j7c…`)
- "It should be clear to any intelligent person that what they ask AI must be verif…" (`ytc_Ugw9h2MgP…`)
- "In my opinion, a conscious being is going to enjoy doing certain things. Humans …" (`ytc_UgzjuZlys…`)
- "That sounds nice but I don’t think it will work because ai has taken private pub…" (`ytr_UgwiVnZUM…`)
- "I can't understand that if we replace human jobs which is based on human salary …" (`ytc_UgwWFtIQP…`)
- "An important note about \"made by a human.\" If you modify art that is computer ge…" (`ytc_UgyaukdCD…`)
- "As the old saying goes. \"You get out what you put in\". In other words, AI is onl…" (`ytc_UgxUGoDCV…`)
- "Everyone is ignorantly working and feeding data to AI without knowing that the S…" (`ytc_UgzZcmp1Q…`)
Comment

> It doesn’t take a genius to figure out what the bad is… what if the AI decides we’re obsolete? What if the AI decided that the Earth’s resources are better allocated to serve it’s needs? What if the AI declares war on humanity? What if the AI decided we’re not able to subsist in the long run and it decides we’re better off gone? The creator of AI is a brilliant moron for not seeing it.

youtube · AI Governance · 2023-04-18T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyVGMeb_6J0Ysde4sh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyNxb82dIHnlMd5Hkd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyeBQ0JyHTwM7hrt-F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxQUAVvEtPOPUkU8xh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgyTGLty2oggVuLenvt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxMxt5wUsGRrj1pHLB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwpHqp-dflXFg4hrqp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxXionQ-fWM5Wr7_h54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugxy2d8181ZvBsW0xNh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxyYiuRdAEOTSgqQ_Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
```
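A batch response like the one above can be parsed into per-comment records and indexed by comment ID, which is what the "Look up by comment ID" view needs. A minimal sketch: the `parse_coding_response` helper is hypothetical, and the allowed value sets in `SCHEMA` are inferred only from the values visible in this output, so the real codebook may contain more categories.

```python
import json

# Allowed values per coding dimension, inferred from the sample output
# above; the actual codebook may define additional categories (assumption).
SCHEMA = {
    "responsibility": {"developer", "user", "company", "government", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index the records by comment ID.

    Raises ValueError if a record is missing a dimension or uses an
    out-of-schema value, so malformed model output fails loudly instead
    of silently entering the dataset.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec["id"]
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: bad {dim!r} value {rec.get(dim)!r}")
        coded[comment_id] = {dim: rec[dim] for dim in SCHEMA}
    return coded

raw = ('[{"id":"ytc_UgyVGMeb_6J0Ysde4sh4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded["ytc_UgyVGMeb_6J0Ysde4sh4AaABAg"]["policy"])  # regulate
```

Validating against the schema at parse time is the design choice worth keeping: LLM coders occasionally emit labels outside the codebook, and rejecting those records early keeps the downstream counts trustworthy.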