# Raw LLM Responses

Inspect the exact model output for any coded comment. Look up a comment by its ID.
## Comment

> So if I get this right, the problem with AI is that it is not AI, it is not actually self aware, is this correct?
>
> I like llm if it was a personal drive that i own, that writes like me, a personal ai tha i personally train on what i want, what i like, and it can produce results similar to my style, this is speaking of writing, this personal ai can even produce as i sleep or after i die, it can have conversations with my great great grandkids, it would be an heirloom. This is the best use of ai imo and the most ethical.

Source: youtube · 2025-12-28T18:5…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response

```json
[
  {"id":"ytc_UgyrzCYQ_xfGOZjdETh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwNR0pu_6IxR3-PeXt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwoSI3nf3NnnEhrfo54AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugx9V45DbAOfZ6mTYQB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyZJZ177NJNbJEJ23V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxEuUPMgBbxGnYBQBB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw1Hh7h3dme67ZJcst4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzs8Dq5lZcO2ecfpFF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyECy1JrJdYgzogrut4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwYFNNcFnPBkw08JHx4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"indifference"}
]
```
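A raw response like the one above can be validated before its rows are stored as coding results. The sketch below is a minimal, hypothetical validator: the allowed code sets are inferred only from the values visible in this response and in the coding-result table, not from the project's actual codebook, and the function name is an assumption.

```python
import json

# Allowed codes per dimension, inferred from the values observed above.
# The real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "mixed"},
    "policy": {"unclear", "liability", "ban", "none", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "fear"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose ID looks like a
    YouTube comment ID and whose four dimensions all carry known codes."""
    valid = []
    for rec in json.loads(raw):
        if not rec.get("id", "").startswith("ytc_"):
            continue  # drop records without a recognizable comment ID
        if all(rec.get(dim) in codes for dim, codes in ALLOWED.items()):
            valid.append(rec)
    return valid

# One record copied from the raw response above: every dimension is known.
raw = ('[{"id":"ytc_UgwYFNNcFnPBkw08JHx4AaABAg","responsibility":"developer",'
       '"reasoning":"mixed","policy":"regulate","emotion":"indifference"}]')
print(len(validate_coding(raw)))  # 1 valid record
```

Records with an unknown code in any dimension are silently dropped here; in practice they would more likely be flagged for re-coding than discarded.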