Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Not exactly.
Jobs in the creative field, like photography and graphic design, h…
ytr_UgwtPM0Tv…
I think you can boil down so…
ytc_Ugyj2hLQ1…
Yeah, I don't think we have much else to fear except for millions and millions …
ytc_UgxcS-wuo…
The AI rarely pushes back. It is very agreeable. In fact, it kisses your ass alm…
ytr_UgxEdmNGd…
Regulation is minimal currently so everyone’s experimenting with AI. There’s als…
rdc_oi1ficu
People will just be cast aside by employers and the bought and paid for governme…
ytc_UgzAhqR2n…
I don't think we could create an AGI, that would only work in humanity's interes…
ytc_Ugzm0F6DA…
What about the artists who embrace AI—not as a replacement—but as an augmentatio…
ytc_Ugzu9noKn…
Comment
even if the AI is coded with morals if you give it a purpose like solve world hunger, it could decide to kill 90% of all human life.. it would have fulfilled its goal so therefore its able to do that without remorse or regret. the end of the story is the gov is intending to use that tech the same way its used in "person of interest" a tv show that showed what would happen if 2 different systems battled for control over cyber space.
youtube
2015-08-09T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwm9I9NcRQElvQfqu54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwRhW6ydR3WoIlU3gl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgghtrugE12abngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugic-8CdfbK863gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgiiVzQEVXTO8XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgigNAG8ggHJ7HgCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugigkb4gWN8_I3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugi_4VKjBann7HgCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugi9Gszi21MTEngCoAEC","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggnLXyVGHuX8XgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
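A raw response like the one above is a JSON array with one object per comment. A minimal sketch of how such a payload might be parsed and validated before the codes are stored, assuming Python; the allowed value sets below are inferred from the values visible in this dump, not from a documented schema:

```python
import json
from collections import Counter

# Hypothetical raw LLM response, mirroring the structure shown above.
raw = """[
{"id": "ytc_a", "responsibility": "none", "reasoning": "unclear",
 "policy": "unclear", "emotion": "indifference"},
{"id": "ytc_b", "responsibility": "government", "reasoning": "consequentialist",
 "policy": "regulate", "emotion": "fear"}
]"""

# Assumed value sets per dimension (inferred from the sample output).
ALLOWED = {
    "responsibility": {"none", "government", "developer", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "none", "regulate", "liability", "ban"},
    "emotion": {"indifference", "fear", "outrage", "approval"},
}

def validate(rows):
    """Keep only rows whose every coded dimension holds an allowed value."""
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]

rows = validate(json.loads(raw))
emotions = Counter(row["emotion"] for row in rows)
print(len(rows), dict(emotions))
```

Rows that fail validation (a misspelled label, a missing dimension) can then be flagged for re-coding rather than silently counted.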