Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I can't believe this Stonetoss was once actually decent but THAT comic??? Seriou…
ytc_Ugw38janh…
We don’t need humans to make humans anymore. Soon AI robots will have humans as …
ytc_UgzAKVZso…
The irony is that the large IT corporation that I work for had a webinar for Men…
ytc_Ugw3riuaS…
I'm pretty sure one of the main reasons for people's flip out over AI the past h…
ytc_Ugx3k-4zg…
AI doesnt have bias unless it's programmed to do so. If any bad things happen as…
ytc_UgzjJ-AlC…
This is actually a big recurring problem on most modern technologies!
Some of t…
ytc_UgxFfCqI0…
If this is true what she says about the environment that she wants to work on it…
ytc_UgxafQ5Pp…
ai will make pretty pictures that make investors go "ooo aaaa sparkly" while we …
ytc_Ugy8alw4W…
Comment
You nailed it tho, empathy is the singular component that should also be part of the training besides just efficiency and truthfulness. I have outlined a methodology for adding this the to pre training pipeline of all AI MODELS.
Without empathy once any AI reaches actual SUPER-intelligence, we will all be seen as ants.
How can the tech CEO’s be ok with any of these very likely outcomes? SMH
youtube
AI Responsibility
2025-09-15T04:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxEbi7j6hsibxMNjDN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxZ0aZIqaFnvwjwCaZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyCCe-2OgNrLKEiDTN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyfx209qtDUIHc5WIN4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugxvcc0C8RhcJWNLXU14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxVEiW0mKh50ElFpW14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyuM2wQB3YIZvHcJM14AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyCF0LOOyX5kKBq_k14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxFZriAEOjLlxIqRWJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzaDHZQYt4LbrqZFH94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
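The raw response above is a JSON array with one object per coded comment, each carrying the four dimensions from the result table. A minimal sketch of how such a batch could be parsed and indexed for the comment-ID lookup (the `index_codes` helper and the inline sample rows are illustrative, not part of the actual pipeline):

```python
import json

# Sample batch response in the same shape as the raw LLM output above:
# a JSON array of objects, one per coded comment.
raw_response = '''
[
  {"id": "ytc_Ugxvcc0C8RhcJWNLXU14AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxEbi7j6hsibxMNjDN4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]
'''

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict[str, dict[str, str]]:
    """Parse a batch response and index it by comment ID,
    keeping only the expected coding dimensions and falling
    back to "unclear" when the model omits one."""
    codes = {}
    for row in json.loads(raw):
        codes[row["id"]] = {d: row.get(d, "unclear") for d in DIMENSIONS}
    return codes

codes = index_codes(raw_response)
print(codes["ytc_Ugxvcc0C8RhcJWNLXU14AaABAg"]["policy"])  # regulate
```

Indexing by ID up front makes each lookup a dictionary access, which is what a "look up by comment ID" box needs; filtering to the known dimensions also guards against extra keys the model might emit.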