Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "AI is like google on steroids, you still need to read what it gives you, otherwi…" (ytc_UgzylKRYT…)
- "florianschneider3982 AI is NOT hard work!! You are typing prompts into an AI gen…" (ytr_UgxdFV7E9…)
- "It’s two years later. A lot of companies who have tried to replace devs with AI …" (ytc_UgxgseIe-…)
- "The best way to shut down ChatGPT is ask why arab Ottomans suddenly changed thei…" (ytc_UgzAQzvXg…)
- "A friend of mine for some reason wants to know how durable that mouth is.…" (ytc_UgyMNxhHU…)
- "it’s pretty telling how mad people get about ai artists when there’s a long hist…" (ytc_UgzlH7xf5…)
- "Didnt the ai artist say later that hes glad he was able to connect everyone toge…" (ytc_UgxLKRAtK…)
- "Same logic as \"You drive a car, isn't that killing the horse breeding industry?\"…" (ytc_UgybeLuef…)
Comment
Between this video and the one that prompted the responses, you make fantastic points all that I agree with. But for me the biggest most important point of all is regardless of my opinion of AI, I as an artist should be allowd to opt out of my art being used to train AI and as a consumer/user should be able to opt out of AI features. But these companies dont give us that option, they didnt say "hey do you want to participate in this new technology?". I think if they had, people's perceptions of AI would have been different. Not to mention it sets a precedent/is an alarm bell that companies can do whatever they want and if the users say "i dont want this, i dont want to participate in this" they get hit with "thats too damn bad". Its malicious and unsettling.
Source: youtube · Video: Viral AI Reaction · Posted: 2025-05-13T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugx4gHjWQK9yBHvizmV4AaABAg", "responsibility": "none",    "reasoning": "consequentialist", "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_UgzKUL-I3kHnpPiWN_h4AaABAg", "responsibility": "none",    "reasoning": "deontological",    "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgwOjGb4PL9lopI8py94AaABAg", "responsibility": "company", "reasoning": "deontological",    "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugw0rqm6o8KVlLY-Awd4AaABAg", "responsibility": "none",    "reasoning": "consequentialist", "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_Ugz5q13j8Mie9MwvIDp4AaABAg", "responsibility": "user",    "reasoning": "virtue",           "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_UgxcVAEFbmdyexygSNp4AaABAg", "responsibility": "none",    "reasoning": "consequentialist", "policy": "none",     "emotion": "resignation"},
  {"id": "ytc_UgzChLXQxeQIroVz-Wd4AaABAg", "responsibility": "user",    "reasoning": "deontological",    "policy": "none",     "emotion": "resignation"},
  {"id": "ytc_Ugyl3A1Qw-kBA8OEel14AaABAg", "responsibility": "user",    "reasoning": "virtue",           "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_UgygoFr_SA2ANMHY4zV4AaABAg", "responsibility": "none",    "reasoning": "consequentialist", "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgxYKi73aKDRaVmnm0F4AaABAg", "responsibility": "user",    "reasoning": "virtue",           "policy": "none",     "emotion": "outrage"}
]
```
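The lookup-by-comment-ID view above boils down to parsing one raw batch response and indexing its entries by `id`. A minimal sketch of that step, assuming only the field names visible in the response (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `index_codings` helper name is illustrative, not part of the tool:

```python
import json

# A raw LLM response in the format shown above (truncated to two entries
# here for brevity; the real batch contains ten).
raw_response = """[
  {"id": "ytc_Ugx4gHjWQK9yBHvizmV4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwOjGb4PL9lopI8py94AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

def index_codings(raw: str) -> dict:
    """Parse one raw batch response and index the codings by comment ID."""
    codings = json.loads(raw)
    return {item["id"]: item for item in codings}

by_id = index_codings(raw_response)

# Look up one coded comment by its ID, as the page's search box does.
coding = by_id["ytc_UgwOjGb4PL9lopI8py94AaABAg"]
print(coding["policy"])  # -> regulate
```

Keeping the raw string alongside the parsed index makes it easy to verify that the table rendered for a comment (e.g. `policy: regulate` above) matches what the model actually emitted.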