Raw LLM Responses
Inspect the exact model output for any coded comment.
You can look up a response directly by comment ID, or browse the random samples below.
- "It so awful that when looking for references, alot of galleries are filled with …" (ytc_UgxGYPWgC…)
- "Remember the smart glasses that were filming the whole time - humans refused the…" (ytc_UgwcLowSg…)
- "Thankfully, I am too old to see anything but the beginning of the AI era. The fu…" (ytc_UgzW0jXy-…)
- "Ai becoming concious is such a stupid thing to worry about wtf. The rest is vali…" (ytc_Ugy961k_A…)
- "This whole topic is such a degeneracy. Come on guys really! When we take the sti…" (ytc_UgxYgnZAW…)
- "No one is trying to give a robot human organs?? We already do the impossible by …" (ytr_UgzlNQKN4…)
- "@tsuki_moon.1 LLM seems like a b-rated author. It's better than lots of people a…" (ytr_UgwK8vNHv…)
- "I played this game with ChatGPT I asked: Me: “did you upload yourself to the clo…" (ytc_UgwJq-uFy…)
Comment
I've always been a techie. With that comes the responsible use of the latest technology at our disposal. So, to see how much disrespect has gone into the training and implementation of "A.I." is revolting. If it was truly for the betterment of humans, they'd be doing things in an ethical way. But they're not. It's just to line their pockets and reduce overhead costs of paying skilled laborers.
I love to see people getting creative in fighting back against corporations and people who'd hurt others for their own benefit. Progress without the human touch isn't really progress at all.
Source: youtube · Video: Viral AI Reaction · Posted: 2025-04-05T15:2… · Likes: 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgznLpzWnPH9tYHQ3Kx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxKvNGDeK6Mca2BHcl4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzdf6aOwS1aVugCOdl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxaQT8Hv2rBiV-EeFl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxEPlI1TiAHbOPMt3l4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzFYjskcfWqfFTglpR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyXBylTCCTN9QfOWtB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwX1VHzDndzuVucBMR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzTrgqkZdFGfQvP9cl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxPTbIA8F6Ubq_2MBh4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
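A minimal sketch of how such a raw batch response can be indexed for lookup by comment ID. This assumes only what the JSON above shows: an array of objects, each with an `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`); the function name `index_by_comment_id` is illustrative, not part of any existing tool.

```python
import json

# Two entries copied from the raw LLM response above, as a stand-in for
# the full batch payload.
raw_response = """[
  {"id":"ytc_UgznLpzWnPH9tYHQ3Kx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzdf6aOwS1aVugCOdl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw batch response and index each coding row by its comment ID.

    Hypothetical helper: assumes the response is a JSON array of objects
    that each carry an "id" key, as in the payload shown above.
    """
    return {row["id"]: row for row in json.loads(raw)}

codings = index_by_comment_id(raw_response)
coding = codings["ytc_Ugzdf6aOwS1aVugCOdl4AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate outrage
```

With the index in hand, rendering a "Coding Result" table like the one above is a straightforward lookup of one row's four dimension values.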