Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "@stektirade you'll never know if it has any autonomy or not, what if it's alread…" (ytr_UgybzcPms…)
- "No! The people that buy stocks based on two words AI an layoff are the issue. Th…" (ytr_UgwV_k7v6…)
- "And even when you ask it to judge, it's pretty reasonable in its assessment. And…" (ytr_UgxNlhT7e…)
- "Taiwan makes all the ai chips and who is close to Taiwan Yes they say they will…" (ytc_Ugy0c8Nir…)
- "> Basically if you show an AI pictures of dogs, then a picture of a cat it wi…" (rdc_e7iwp81)
- "No no no I robot taught you this never give a robot a weapon never NEVER WERE DO…" (ytc_Ugyjd_txP…)
- "I know that it's ai and I don't support the heavy use of ai art but the original…" (ytc_UgzI0D823…)
- "I cannot fucking wait until this abomination of a human being, who has caused un…" (rdc_gbibkl7)
Comment
AI as a tool like any other, if it gets out of hands it's on HUMANS ✌️ we have the control to program our future... or was that AI?
HUMANITY might be scared of what's UNKNOWN. If we take the unknown away then all that's left is we are, at the moment, the ones in control. We are programming, allowing it to think and to advance with our command. Frankenstein taught us flipping the switch is just as guilty as the monsters' destruction. The outcome is left to what happens now, good or bad, if it gets out of hand it's from one of two things: one, we as humans unanimously deciding or two, we stayed silent and let others decide. Keep questioning, do your own research, then question who wrote it. Everything is scary at some point until you decide to understand it and seek knowledge vs waiting for others to tell you what is done, then its already too late. Enjoy today my friends till next time
youtube · AI Jobs · 2025-06-14T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzWmXwclnkrwkNIyH54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx3oqRilTF9znbVsDF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwNEJGeUhXJcTpldLh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxe5Mo4g2KuLRPtDXt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxSq4K4FhyKHziAYJB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugyh5_cOS67WDovPsbJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyfrBcEXy3y-S77yqN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyhzV1KuHerB65oL5F4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy785q0_0a8BNmdvyR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzWG-CENy9KJs8Ry-F4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
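The lookup-by-ID step this view performs can be sketched in Python. This is a minimal illustration, not the application's actual code: the `lookup` helper is hypothetical, and `raw` excerpts a single row from the response above.

```python
import json

# One row excerpted from the raw model response shown above (illustrative sample).
raw = """[
  {"id": "ytc_UgyhzV1KuHerB65oL5F4AaABAg",
   "responsibility": "user", "reasoning": "virtue",
   "policy": "none", "emotion": "mixed"}
]"""

def lookup(raw_json: str, comment_id: str):
    """Return the coding dict for one comment ID, or None if absent."""
    for row in json.loads(raw_json):
        if row["id"] == comment_id:
            return row
    return None

coding = lookup(raw, "ytc_UgyhzV1KuHerB65oL5F4AaABAg")
print(coding["responsibility"], coding["emotion"])  # -> user mixed
```

Matching the row against its ID this way is what lets the "Coding Result" table (responsibility: user, reasoning: virtue, policy: none, emotion: mixed) be rendered from the raw array.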