Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Ai can take any art job as long as its legally allowed to take all data in 2025,…
ytc_UgwmUfskL…
I'm most cynical about the quality of code that's fed into GPT. The only way GPT…
ytc_UgwR60PDn…
the only reason why people want to use ai for "reference" is because they simply…
ytc_UgxbIIGd3…
I abuse to chatgpt when it doesn’t gave me the respose i want
It's over for me…
ytc_UgylKxzJl…
Or A.I just observes reality without any political correct bias and just takes t…
ytc_UgxFcJPYy…
@StormEyes1991 and here I thought humans could never be stupid enough to call AI…
ytr_UgxXZiB8u…
I use this to get AI to tell me AI CEO plans for product launches.…
ytc_UgxZsYDy0…
*Planet of Robots.
*Rise of the planet of Robots.
*Dawn of the planet of Robots.…
ytc_Ugxy_r9tK…
Comment
Humans using ai for bad is the second existential risk per individual imo. The first is the coming societal upheaval to put it mildly. Nano bugs may occur eventually but you may not have the ability to worry by then
youtube
AI Governance
2024-04-09T04:5…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_Ugx7Jr6d49aeSh6-Lkp4AaABAg.A1ykHpb2sgaA2-IWHFGU9w","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzjrcWNhoFWh1WnfIt4AaABAg.AVkpye_ROPGAVkqlrlXdq6","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgxmTbPs44TvAsPPh4h4AaABAg.AVgi9oHtEjHAVgiQTcEKhE","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgxbsO6vuGAclXXNcBp4AaABAg.AVgW_PWopp-AVgWqEJ9HvP","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"indifference"},
{"id":"ytr_UgwXFhCZfo4t2aTG-gN4AaABAg.AVKSBwm3IH4AVmYX7ghL_g","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwXFhCZfo4t2aTG-gN4AaABAg.AVKSBwm3IH4AVmZ5PYP7to","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_Ugwpn5FESsuuZZC4vz54AaABAg.AVGkMBT2n4nAViImbkqp19","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugy49lcWlbfh8O48bpB4AaABAg.AV9lRjxePGjAVi3QoR1s7p","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugwki3zaa2wXDgi-TY14AaABAg.AV6oWY1mgk4AVnpqn2aPKQ","responsibility":"government","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytr_UgyE2kNK_2Uwepn9BFt4AaABAg.AUwmwg77TZDAVPq9zTH4cu","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
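The raw response above is a JSON array of coded records keyed by comment ID. A minimal sketch of the look-up step, assuming the coding dimensions and their vocabularies are exactly those seen in the sample output (the function name, the `ALLOWED` sets, and the example ID are illustrative assumptions, not the tool's actual API):

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# These vocabularies are assumptions; the real codebook may differ.
ALLOWED = {
    "responsibility": {"distributed", "company", "ai_itself", "developer",
                       "government", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed",
                  "unclear"},
    "policy": {"none", "liability", "regulate", "ban", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference",
                "approval", "mixed"},
}

def index_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded records) into an
    id -> record map, rejecting records with unknown dimension values."""
    records = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        records[rec["id"]] = rec
    return records

# Usage with a hypothetical single-record response:
raw = ('[{"id":"ytr_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"liability",'
       '"emotion":"outrage"}]')
coded = index_response(raw)
print(coded["ytr_example"]["emotion"])  # outrage
```

Validating against a fixed vocabulary at parse time catches malformed model output (a frequent failure mode for JSON-producing prompts) before it reaches the coding table.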