Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Even “legit” ai detectors are total crap. People are just gonna have to accept t…" (ytc_UgzB_Jse3…)
- "Never believe any of these companies saying they will be water positive. They ca…" (ytc_UgxkR-4yI…)
- "God, I was just listening to this in bed and dozed off. I had a dream that I was…" (ytc_UgxgabcdI…)
- "AI is the beginning of the end of capitalism. Capitalism cant work when 99% of t…" (ytc_Ugy_NySsF…)
- "Half of AI videos on YouTube are about how CEOs screwed up and are hiring humans…" (ytc_Ugy5I5lAk…)
- "The BS here is that we are so comfortable and will become even more comfortable …" (ytc_Ugy1N1kv6…)
- "I love how if we wanted to this could all be stopped and just ignore AI but nope…" (ytc_UgyM6EceE…)
- "I tried the same strawberries question to my chatgpt and it gave me the answer 0…" (ytc_Ugw916lDK…)
Comment
I remember seeing a heated debate on the BBC with two scholars arguing if you can drink soda/pop/fizzy drinks.
One said: No, never! It will kill you!
The other said: yes, maybe once a month or twice a year. It's fine.
This conversation resembles that. Yes maybe you can use some LLMs at work - but it can hurt you and you need to be careful.
Both experts agree that there are dangers here.
youtube · 2026-01-24T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyB9rz6xkzYQSwajaR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxraqAPh2ViVwhlePZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwTPMPTwZtj3LJaNyB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz1a1G_Gr2nwAwe1tt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwE-N-GggP7SAm44NB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzUTE5N6GNC9TX7oCd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx_f6HQUoi8hZWagOl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugygjtk-TH_FGGpEIU94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugznca-T4OfMS58qtqd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyoqtd54EKazhh6U4p4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
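The raw response above is a JSON array, one object per comment, keyed by `id` with four coding dimensions. A minimal sketch of the lookup-by-ID step might parse that array and validate each dimension against the value sets that actually appear in this dump (the real codebook may allow more values; the comment ID and `raw` payload below are hypothetical, not taken from the data):

```python
import json

# Allowed values as observed in the response above; the actual codebook may differ.
ALLOWED = {
    "responsibility": {"user", "developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def index_codings(raw_json: str) -> dict:
    """Parse a raw model response and return {comment_id: coding}, validating values."""
    codings = {}
    for row in json.loads(raw_json):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
        codings[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return codings

# Hypothetical single-row payload in the same shape as the dump above.
raw = '[{"id":"ytc_example123","responsibility":"user",' \
      '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]'

by_id = index_codings(raw)
print(by_id["ytc_example123"]["emotion"])  # fear
```

Validating before indexing matters here because LLM output is not guaranteed to stay inside the codebook; a bad value fails loudly instead of silently entering the coded dataset.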