Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> LLM’s and large servers are training models that learn from the inside out; making them able to be downloaded to a mobile device.
> Think of a viral AI program that uses you as its LLM. How are we going to regulate AI when everyone will be creating their own unique model?

Source: youtube · AI Harm Incident · 2025-09-26T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | unclear |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwM1_2e02yJd343Bs14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzDNI-bUlgL2NQq5Eh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx2fHBWNTJ66dHoo4J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzykRQUZN2bkXAoN5J4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxzqR2bVDUvgWO_HgZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyrRQKO-x0oTkqyrv54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzePyCvkBpRQqQhnxN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwwKkfb4dIpySohIOl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzFkM4WVFdpmQg_Uex4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy6-ZEePsNBii4BMkR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
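The raw response above is a JSON array with one object per coded comment, keyed by comment ID across four dimensions. A minimal sketch of how such a response might be parsed and validated is shown below; the allowed values are inferred only from the codes visible on this page (the real codebook may include more categories), and the function name is hypothetical:

```python
import json

# Allowed values per dimension, inferred from the codes shown on this page;
# the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "distributed"},
    "reasoning": {"unclear", "consequentialist", "contractualist", "deontological"},
    "policy": {"unclear", "regulate"},
    "emotion": {"fear", "outrage", "mixed", "resignation"},
}

def parse_coding_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM coding response into {comment_id: codes},
    dropping any record with a missing ID or an unknown code value."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue  # skip records the model emitted without an ID
        codes = {dim: rec.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded
```

Validating against an explicit whitelist like this catches the common failure mode of batch coding with an LLM, where the model occasionally invents an off-schema label that would otherwise silently pollute downstream counts.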