Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
If it's inevitable then why don't they find out a way to source the images they use to train the models ethically? Make it so you can opt out and that if you do opt in you'll get paid for them using your work. Like it almost sounds as though these people are trying to say "it is inevitable [that artists will be f*cked over]" instead of "[AI] is inevitable".
youtube · Viral AI Reaction · 2025-04-07T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxZhHvmK0jCVbfWLnl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgygpmCP0YyOdsTmqQJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgypVGTGoHZ977FUpTZ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxUvNeTYppj-bbPgn14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugybn_y0eA_I9SMmrd54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyWyxpN1x5McT1GnWp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwmOn4ijELgxYnDj3l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyyPe4R7sZ45fyCm854AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzI4_iv-gI-PNkSa1J4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxLWvWbH3KwDR6RUtN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
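The Coding Result table above corresponds to one entry in this raw response (the `ytc_UgzI4_iv…` object). A minimal sketch of how such a response can be parsed and looked up by comment ID — the `index_by_id` helper is illustrative, not part of the tool, and the two-entry sample is trimmed from the response shown above; only the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the source:

```python
import json

# Trimmed sample of a raw LLM coding response: a JSON array with one
# object per comment, carrying four coded dimensions plus the comment ID.
RAW_RESPONSE = """[
  {"id":"ytc_UgzI4_iv-gI-PNkSa1J4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxZhHvmK0jCVbfWLnl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"}
]"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse a raw coding response and index the entries by comment ID,
    skipping any entry that is missing one of the coded dimensions."""
    entries = json.loads(raw)
    return {
        e["id"]: {d: e[d] for d in DIMENSIONS}
        for e in entries
        if all(d in e for d in DIMENSIONS)
    }

coded = index_by_id(RAW_RESPONSE)
print(coded["ytc_UgzI4_iv-gI-PNkSa1J4AaABAg"]["policy"])   # → regulate
print(coded["ytc_UgzI4_iv-gI-PNkSa1J4AaABAg"]["emotion"])  # → outrage
```

Indexing by ID rather than list position makes the lookup robust to the model reordering or dropping entries, which is why skipping incomplete objects (instead of raising) is a reasonable default here.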