Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
My experience is completely different than yours with copilot x and I think what…
rdc_jprdwi2
The analogy i use. You should use ai like master chief and cortana
Use it to …
ytc_Ugy2PZtE7…
One aspect that you ignored but might cover later are the physical mechanics of …
ytc_Ugwvaud5s…
you too have off switches the ai might try to flick if you beco e dangerous to i…
ytr_UgwxQRt_E…
I love this. There needs to be more open discourse on this in a safe environment…
ytc_UgzRyTaWe…
Dont know how I feel about AI. I restore old stone buildings in london. Everythi…
ytc_UgwsNG_bn…
Seems like "robot talk" experiment started with our "cell phones" when we can sp…
ytc_UgxsbARIm…
>I understand they have to die so we can eat
You can also just not eat anima…
rdc_gyzivsc
Comment
Having worked in customer service and technical support fields for over 2 decades, where's the video that shows "AI" handling the call when the first thing a customer says is "FUCK YOU AND GIVE ME YOUR MANAGER. I HOPE YOU'RE A KIKE AND DIE IN THE NEXT HOLOCAUST"
Yes. That happened.
"AI" in it's current form is nothing more than a tool. You don't replace a roofer with a pneumatic hammer. You give him the hammer to make his job easier. Anyone who has to actually work with this crap knows just how limited it is. Sure, the "humanized speech" is impressive, but the only people falling for this being "AI" are regular folks.
youtube
Viral AI Reaction
2024-05-23T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugy2IAcorDr2LqEwWlx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwGXlD1fuRvgKRmgDJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyUeKqQnMKP1buSBuV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxwu8Q9rOO699oQjPx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw7htykiDPrL5YmVWh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx_1CJP7cwVioE3uLN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgytD9AgHSMnyo_ebIN4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugzg5pg7ViprgBEToKF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyz9v8jcLv3E77CrPF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy2ADyjSghbYxyYQZd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
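A raw response like the one above can be parsed and sanity-checked before its entries are stored as coding results. A minimal sketch in Python, assuming the value sets visible in these responses are the full codebook (the real codebook may define more categories):

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# above -- an assumption, not the authoritative codebook.
SCHEMA = {
    "responsibility": {"none", "user", "developer", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"indifference", "approval", "outrage", "fear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID.

    Raises ValueError if an entry is missing a dimension or uses a
    value outside the (assumed) codebook.
    """
    by_id = {}
    for entry in json.loads(raw):
        cid = entry["id"]
        for dim, allowed in SCHEMA.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {value!r}")
        by_id[cid] = {dim: entry[dim] for dim in SCHEMA}
    return by_id

raw = ('[{"id":"ytc_Ugy2IAcorDr2LqEwWlx4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')
codings = parse_codings(raw)
print(codings["ytc_Ugy2IAcorDr2LqEwWlx4AaABAg"]["emotion"])  # indifference
```

Indexing by ID this way also supports the "Look up by comment ID" view: a coded record is retrieved in O(1) from the dictionary rather than rescanned from the raw response.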