Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Don't act like AI is the problem, it didn't create itself, it doesn't own itself…" (ytc_UgwG8lfbI…)
- "What if I dropped AT&T. Who will AI talk too…they had to pay for that system, so…" (ytc_UgxPrCjoX…)
- "This is exactly the type of lazy person who would support ai I unfortunately. Th…" (ytc_Ugwfo2xQO…)
- "Something that can be used for good within medicine, teaching, etc. can’t be tru…" (ytc_UgxjpPgtn…)
- ">Cybersecurity researcher Jeremiah Fowler discovered a non-password-protected…" (rdc_lm6fa3m)
- "I would honestly support heavy regulation or even an all out ban of generative A…" (ytc_Ugw-me9GT…)
- "Stop with the AI production we as humans can have children who in turn will fill…" (ytc_UgyRg9U3D…)
- "A genious once said: Computers are cruel, they do what you tell them to, not wha…" (ytr_Ugw1FATcI…)
Comment
@disorderandregression9278 AI doesn’t get stopped by not caring about a lot of people who are indifferent to it. What’s the point of all of these videos besides spreading more awareness of how harmful AIs are by robbing people of opportunities in favor of a soulless machine.
| Field | Value |
|---|---|
| Platform | youtube |
| Source | Viral AI Reaction |
| Timestamp | 2025-09-23T15:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_Ugyqsi80wd8kle6FwOZ4AaABAg.ANIyg5czziSANSjxcKH47M","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugyqsi80wd8kle6FwOZ4AaABAg.ANIyg5czziSANwLEfoO1zo","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytr_Ugyqsi80wd8kle6FwOZ4AaABAg.ANIyg5czziSANwSxcb6ZWw","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzcaXCtspXPvz7jhC94AaABAg.ANFKOZsC1iWANNuZUw4QM6","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_Ugx6qF--Y4FNb91xEwt4AaABAg.ANA-sEp-7wOANOWXzyx-AY","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugx6qF--Y4FNb91xEwt4AaABAg.ANA-sEp-7wOANPmntWtzoj","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyPvxTcebozQbLNnCt4AaABAg.AMtyRgislKlANQJygQUbOO","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytr_UgyPvxTcebozQbLNnCt4AaABAg.AMtyRgislKlANQKK8ChdCP","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugw1H1qOJQnCm9xQQ1B4AaABAg.AMqHRpjoKfnANwD1qcn9PA","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyGhT1ZqTtkizWLopJ4AaABAg.AMeMkG4dy_QANwM_0nKXni","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"}
]
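The raw model response above is a JSON array with one object per comment, each carrying the same four coded dimensions shown in the table (responsibility, reasoning, policy, emotion). As a minimal sketch of the lookup-by-ID flow, the array can be parsed and indexed like this — the `raw_response` string and `index_by_id` helper are illustrative, and the IDs below are shortened placeholders, not real comment IDs:

```python
import json

# Illustrative raw model output: a JSON array of per-comment codes.
# The IDs here are placeholders standing in for the long ytr_/ytc_ IDs.
raw_response = """
[
  {"id": "ytr_example1", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytr_example2", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse the raw LLM response and index each code record by comment ID."""
    codes = json.loads(raw)
    return {row["id"]: row for row in codes}

codes = index_by_id(raw_response)
print(codes["ytr_example1"]["policy"])   # -> ban
print(codes["ytr_example2"]["emotion"])  # -> fear
```

In practice the parse step would also want to tolerate malformed model output (e.g. wrap `json.loads` in a `try`/`except`), since the response is generated text rather than guaranteed-valid JSON.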