Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples

- Why would "he" take responsibility? Tesla is a company. Thousands of people ma… (ytr_UgzfUS0xt…)
- Uptake will be low, the top 1% can spew out all this agi and ai junk all they wa… (ytc_UgxUAUMyD…)
- Robot dont complain no health insurance no vacation no pay company save a lot of… (ytc_Ugxs_KwBQ…)
- Yeah but also that jobs already have tools that work 99.999% of the time and are… (ytr_UgzZm8pwY…)
- why? you ask .. laziness and idiocy. Driving is a skill.. to those who think.. … (ytc_UgyNGPWd6…)
- 10X cheaper to get an Optimus robot from Tesla and glue some implants on it.… (ytc_Ugw8bAW6j…)
- I wholeheartedly agree. Me and my friends joke about what happens in the ai chat… (ytc_UgzYnRsjC…)
- @olmecsunshine868The parents ignored him they probably didn’t care hence why the… (ytr_Ugw6mEvQS…)
Comment

> My opinion how AI Tools should be approached: you know all those training data it has? Owners should charge royalties for it, or better yet subscription unless otherwise negotiated. If we have to pay subscriptions to programs, those programs should pay for the data they use.

youtube · Viral AI Reaction · 2024-10-11T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgwEEmeOA5iG6xCy71l4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzLQTaa1HSeQN662Vh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwFzD9vMZ2fGD2JqRJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxBjxhncmApaXva7v94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgwXYu7Av27W46j2biN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgziMBGk0wNV7MyndCx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw32CNca9BQg7aEZg94AaABAg","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwGf27ImYt9k-ipBqN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgyOSe8cmU5dDANi3Yt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxfE3uRq2pUZQ6WJsN4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"})
```
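One plausible reason every dimension in the coding result above reads "unclear" is that this raw response is not valid JSON: the array closes with `)` instead of `]`, so a strict `json.loads` call fails. A minimal sketch of a tolerant parser that attempts this one repair before falling back; the function name `parse_codes` and the empty-list fallback are assumptions for illustration, not the tool's actual code:

```python
import json

def parse_codes(raw: str) -> list[dict]:
    """Parse the model's JSON array of per-comment codes.

    If strict parsing fails, try repairing a stray closing ")"
    (as seen in the response above) before giving up. An empty
    list signals the caller to record every dimension as
    "unclear" (assumed fallback behavior).
    """
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        repaired = raw.rstrip()
        if repaired.endswith(")"):
            repaired = repaired[:-1] + "]"
        try:
            return json.loads(repaired)
        except json.JSONDecodeError:
            return []

# The stray ")" terminator from the response above, in miniature:
codes = parse_codes('[{"id":"ytc_x","emotion":"outrage"})')
```

A repair this narrow only rescues the exact failure mode shown here; any other malformation still yields the empty list, which keeps the "unclear" fallback honest rather than guessing at the model's intent.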