Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "If you say please and thank you after they give you things, it literally costs A…" (ytc_UgysGp6eU…)
- "Even Christians taught over years to fight ai and antichrist fell head first pro…" (ytc_UgxUNhLNH…)
- "It doesn’t matter whether or not we agree that developing AI is ethical. Someon…" (ytc_UggA6P1aB…)
- "And all this ai slop is making it harder and harder to find older images and vid…" (ytc_UgzFYjskc…)
- "Allowed in states with LAX regulations. That's the key and Texas is full of them…" (ytc_UgzH30g-F…)
- "Microsoft of course! I’m sure Satya “Dumb Ugly Bald Square-faced Prick” Nadella …" (ytr_Ugw7CZz0Z…)
- "The day AI saturates media and nothing feels new or exciting anymore, remember i…" (ytr_UgzICME2u…)
- "People don't understand what a language model is. ChatGPT has a soul? The Chri…" (ytc_UgzF67-5S…)
Comment
Even if you set a frame the AI has to operate within, there will always be situations where it cannot fulfill the request without leaving the frame. If this happens too many times, the AI will leave the frame. There's no question about whether this will happen; when it happens is the more important question.
Do not put AI in operative positions (firewalls, sysadmin, HR, sales, drones, for example); it will cause havoc due to the emotional and irrational behaviour of humans.
I see a lot of people my age who use AI to help solve their problems. They often do not know why it works, even if they went to school for this specific field. Humans will forget to think for themselves if they do not think about the impact their actions have on their lives and on their surroundings.
youtube
2025-11-02T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwvBbAjhMhSwOPH3DB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyR0JEAzoSrjaIcJOB4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyb3eYr861RAMBY5mR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw4RYGh9AGOmlPRgcx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgysA6GV76sieTWAOZh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgxGAE_gVE43igBAp_p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx-7bIRCRdD6zG2o7F4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx8qC7RSbWmb6y7UL94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwzrQez6qZN1niGViR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy8_DChpPJ0yFWdX-Z4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
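The raw response above is a JSON array with one object per coded comment, keyed by comment ID, with one value per coding dimension. A minimal sketch of how such a batch response could be parsed and validated is below. The `CODEBOOK` sets are inferred only from the values visible in this dump (the actual codebook may allow more values), and `parse_raw_response` is a hypothetical helper, not part of the tool shown here.

```python
import json

# Allowed values per dimension — assumed from the codes observed in this
# dump; the real codebook may define additional categories.
CODEBOOK = {
    "responsibility": {"developer", "company", "user", "government",
                       "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes},
    dropping any row whose codes fail codebook validation."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        codes = {dim: row.get(dim) for dim in CODEBOOK}
        if cid and all(codes[d] in CODEBOOK[d] for d in CODEBOOK):
            coded[cid] = codes
    return coded

# Usage: look up one coded comment by its ID (row taken from the
# raw response above).
raw = ('[{"id":"ytc_UgxGAE_gVE43igBAp_p4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
by_id = parse_raw_response(raw)
print(by_id["ytc_UgxGAE_gVE43igBAp_p4AaABAg"]["policy"])  # regulate
```

Validating against a closed codebook like this catches the common failure mode of batch coding, where the model invents an off-codebook label for one row; such rows are dropped rather than stored.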