Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID, or inspect one of the random samples below.

Random samples:
- "Her skin texture, lips and her expressions in a way with those eye... all was en…" (ytc_UgyYivTZ7…)
- "Jesus man the amount of jobs that ai is going to replace is fkn scary, the world…" (ytc_Ugy5ExneJ…)
- "How many more dark prophets will come to ensure that the negative future scenari…" (ytc_UgyHeMIse…)
- "The scarcity of weed must be ever so great when an American hobo decides to inve…" (ytc_UgynLewJa…)
- "My one and really only real requirement for a self-driving car is that it won't …" (ytc_UgzlSHi2G…)
- "The speed with which ChatGPT processes info is part of the problem. If this boy…" (ytc_UgzpayGEJ…)
- "Politically correctness has no place in ai. Let them say the hard R if they want…" (ytc_UgzKbvrnM…)
- "Local LLMs that can be run more cheapily have surpassed GPT-3.5 under multiple t…" (ytc_Ugx2TCCNM…)
Comment
No, but people tried regulating social media and failed miserably. I personally think that if you don't regulate AI, it self regulates. In the sense that companies become responsible for the product and no longer ask the state for help when people complain. I find it quite disturbing when I have an issue with a product and when I go to the producer he tells me not my problem, state regulated, go there. Like... AI is regulated. By those laws protecting the customer. It doesn't need more regulation, people just need to be able to ask for those laws to be applied. I never understood the idea of having special laws for the internet in general. The legislation regarding what you can or can not do already existed. Is it just me or is it considered illegal to mail (thru the post office, as in a letter) someone a product that would harm his health, like Antrax?
youtube
2025-11-23T05:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz0nLL63ELpdUGOagZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzZuUefkdlRugPkfXR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw0-ts6uZ96wBuwxQJ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxkobctKG7w6XxtUsB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyBfNTPIXdMZ5jVv_R4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzLbnITqG9h5BYKu794AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwP1-yVpV5v4G-XvOR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy7WTjSTGdEVyjYo214AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzmqd_aaglGc0NVqVd4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgwxMWpLX2W7fSt9B5d4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
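The raw response above is a JSON array of coding records, one object per comment, keyed by `id`. A minimal sketch of how a lookup-by-ID over such output could work (the `lookup_coding` helper is hypothetical, not part of this tool; the record shown is copied from the array above):

```python
import json

# One record from the raw model output above, kept as the raw JSON string
# a model would return (an array of coding objects keyed by "id").
raw_response = """[
  {"id": "ytc_Ugzmqd_aaglGc0NVqVd4AaABAg", "responsibility": "company",
   "reasoning": "contractualist", "policy": "industry_self", "emotion": "mixed"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw LLM response and return the coding record for one comment ID,
    or None if the model did not code that comment."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_Ugzmqd_aaglGc0NVqVd4AaABAg")
print(coding["policy"])  # → industry_self
```

Parsing once and filtering by `id` mirrors what the "Look up by comment ID" view does; a record of `None` signals a comment the model skipped, which is worth surfacing rather than silently ignoring.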