Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "First off AI Safety Research is a real international thing. And governments can’…" (ytc_UgyiWrJAj…)
- "That's probably because you're using a model which doesn't have real-time search…" (ytr_Ugwmr8t2C…)
- "In the future we will destroy ourselves, but a few will survive by uploading the…" (ytc_Ugy4wG5Rr…)
- "I understand this work is necessary; however, just like a factory, offer safegua…" (ytc_UgxJHrr9N…)
- "Would like to point out. Artists art is included in the dataset, which is not il…" (ytc_UgyynapqR…)
- "I am AuDHD, and though i don't draw much, i do a lot of world building and runni…" (ytc_UgzInN3_j…)
- "It all started so simply and playfully. Then, the real nightmare began. Physical…" (ytc_UgxDuNziw…)
- "...? Really? How is this person literally the least intelligent human ever creat…" (ytr_UgjeKkhTi…)
Comment

> @bst857 Yep, it would seem they did not think of putting these limits from the beginning. Would have been easier to work under a specific frame (meaning not worrying about anything apocalyptic) than let it all hang out. Makes it harder to do it now that AI has overtaken all their expectations.

Source: youtube · Topic: AI Governance · Posted: 2023-07-07T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgxNeMogGNDTjq9Azxx4AaABAg.9rr0aEqYbe89rrO3ODrhtS","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxX3iA2497iA-QFEdF4AaABAg.9rqz_4QKsyX9rqzryBbd-b","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgzG_m65cCPmF8Btsp94AaABAg.9rqzKBPP_c59rqzfyPW55W","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugy5xF1d0vIdgo7yJid4AaABAg.9rqxKJJooi59rr3PEbJlap","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugy5xF1d0vIdgo7yJid4AaABAg.9rqxKJJooi59rr6W17aRYa","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugy5xF1d0vIdgo7yJid4AaABAg.9rqxKJJooi59rr97XsLMaA","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugzrlpz7hAgwe7m767B4AaABAg.9rqwy8HyUA09rr4nj5s0Pr","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytr_Ugzrlpz7hAgwe7m767B4AaABAg.9rqwy8HyUA09rrE4CP-MwT","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytr_Ugzrlpz7hAgwe7m767B4AaABAg.9rqwy8HyUA09rrEW22C8hA","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"resignation"},
{"id":"ytr_Ugzrlpz7hAgwe7m767B4AaABAg.9rqwy8HyUA09rrHA3dya_9","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
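The Coding Result shown for a single comment is extracted from a raw array like the one above, where each row carries an `id` plus one value per coding dimension. A minimal sketch of parsing and validating such a response, assuming the allowed category values are only those that appear in the examples on this page (the actual codebook may define more):

```python
import json

# Allowed values per coding dimension (assumed from the examples above;
# the real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"approval", "fear", "outrage", "resignation",
                "indifference", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed rows.

    Drops any row that is not a dict, lacks an "id", or uses a
    category value outside the assumed codebook.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical one-row response for illustration.
raw = ('[{"id":"ytr_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"fear"}]')
print(len(validate_codings(raw)))  # 1
```

Validating against a fixed value set before storing results is one way to catch the off-schema outputs LLMs occasionally produce; rows that fail could be queued for re-coding rather than silently kept.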