Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I can’t see people wanting counselling and other services that require human emp…
ytc_UgwH4Vvth…
What they call AI is the automatic compilation of paragraphs from the internet b…
ytc_UgyxOVAwb…
You don't take into account how much resources it will take to automate most of …
ytc_UgxGaea-b…
I love how these AI people have all the answers. You have a question, they will …
ytc_UgzDfsapx…
What about the whole IRobot idea of programming rules for the AI? Isn’t morality…
ytc_UgyzHDcw1…
Sorry, we don't have to wait for 20 years to see if this is successful. Listen …
ytc_UgwUGAlRh…
Sometimes I really wondered. do we really know what we are doing?? What would th…
ytc_Ugw0jQn-k…
Im 43 minutes in and this is painful. Graduly, slowly, controlled not really how…
ytc_UgzI_8D5K…
Comment
Maybe if they focused on shit like medical industry applications for AI, it'd be much better around the board.
Much easier to make sources transparent, much easier to build at scale and not make a giant ass bubble for everyone to get their dick in a knot over.
But y'know, the numbers don't look as good, and since when has good business been anything but number right?
We live in hell.
youtube
Viral AI Reaction
2026-03-10T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzTEdckes_jvFyoDq14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzd8B5MLK3G5Ba3ph14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgznuA_gmSMLwkBu0Sp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy1qPPm2W8pRLCBPlp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwR6CugiqJ0KiAIG-B4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwWyPrET6mIifCmnkp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz1ytCOZY1TpFHO8lh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzq4kMxpse7OYZ3IRV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgysVUPPhq52GqDuGAl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwiGdIT5OwY5UpxddJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
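A raw response like the one above can be parsed and validated before the coded dimensions are stored. The sketch below is one minimal way to do that; the allowed value sets are inferred only from the values visible in these samples, and the full codebook may define additional categories.

```python
import json

# Allowed values per dimension, inferred from the samples shown above
# (assumption: the real codebook may include more categories).
ALLOWED = {
    "responsibility": {"company", "ai_itself", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "approval", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: dimensions},
    dropping records with a missing id or out-of-vocabulary values."""
    coded = {}
    for record in json.loads(raw):
        cid = record.get("id")
        if not cid:
            continue
        dims = {k: record.get(k) for k in ALLOWED}
        # Keep the record only if every dimension has a recognised value.
        if all(dims[k] in ALLOWED[k] for k in ALLOWED):
            coded[cid] = dims
    return coded

raw = ('[{"id":"ytc_x","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate",'
       '"emotion":"outrage"}]')
print(parse_coding_response(raw))
```

Rejecting out-of-vocabulary records, rather than silently storing them, makes it easy to spot batches where the model drifted from the requested label set.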