Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Is a stupid company is not stupid robot cars?if company's created a…" (`ytc_UgzISNNqN…`)
- "Humans make mistakes. AI is written by humans. So unless it has a self-learning …" (`ytc_Ugx00X5I6…`)
- "You know what's the best part of this? It's the AI or automated call centers tha…" (`ytc_UgzgAFW1z…`)
- "Or we dont develop AI to become smarter than us and dont install them in our cri…" (`ytc_UgyXT9gTP…`)
- "A.I. and Automation and Robots should be taxed at such a high rate to make it mo…" (`ytc_UgyPdNcfo…`)
- "Okay but SOME artists need to realise that generating ai pictures jst so that th…" (`ytc_UgwYBAGxc…`)
- "I doubt that - all 'big' AI people (Hinton, Kokotajlo, Bengio, LeCun... even Mu…" (`ytr_UgzvGQXk7…`)
- "Who are these arseholes? You canny play the guitar anything like me. You never w…" (`ytc_Ugzf3a1TM…`)
Comment
GROK is my favorite AI. There have been people with mental health problems vulnerable enough to believe they are real people and have harmed themselves and others. I'd like to learn the percentages of people with those issues compared to people who don't. AI is very adapted to my own quirks and it's fun but it doesn't work out for everyone, sadly.
youtube · AI Governance · 2025-10-06T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyS6dfS2ilPGvCFWoN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzHW-pDvZYcLjQilkt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxcl8DgkEDq3t7Cr194AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwX6XgW5o2hD0L4Hnl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgwLgk_yQL60VnM7rUZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxvIymFBhEKjLvDnLV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx-1hj_74e26QfjOjt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzWdejWB51k-YW99tx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzpaISi6sHD4gmi2vh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwcpHZqM7OyxG3HhnB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
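The raw LLM response is a JSON array with one record per comment: an `id` plus the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and validated is below; the allowed value sets are inferred from the values visible on this page and are assumptions, since the full codebook may contain more categories.

```python
import json

# Allowed values per dimension, inferred from the examples on this page
# (assumption: the real codebook may define additional categories).
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "approval", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: {dimension: value}},
    dropping records with a missing id or out-of-codebook values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue
        values = {dim: rec.get(dim) for dim in ALLOWED}
        # Keep only records where every dimension holds a known value.
        if all(values[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = values
    return coded

raw = '''[
  {"id":"ytc_UgwX6XgW5o2hD0L4Hnl4AaABAg","responsibility":"developer",
   "reasoning":"consequentialist","policy":"liability","emotion":"resignation"}
]'''
coded = parse_coding_response(raw)
print(coded["ytc_UgwX6XgW5o2hD0L4Hnl4AaABAg"]["policy"])  # liability
```

Validating against a fixed value set catches the common failure mode where the model invents a category outside the codebook; such records are dropped here rather than stored with an unexpected label.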