Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Been using AI in the last 6 months to create apps, and I would never call it art…" (ytc_Ugwdtl2WG…)
- "AI should replace CEOs and the bigger jobs that people just sit around doing not…" (ytc_UgzumCch-…)
- "The vast majority of software companies don't measure their productivity so they…" (ytc_UgwB3Hhut…)
- "fuck yeah i'd give rights to my robot if it was advanced enough on the ai depart…" (ytc_UgwWz7BZy…)
- "Brian Gibson when his brakes failed on a steep grade took his truck over an em…" (ytc_UgyIEfezo…)
- "is it fine if I use ai to help me with pose ideas or what colors to use for my c…" (ytc_UgwijGK5n…)
- "Me every time I try to use AI: Me: Hey, can I do X in Y? AI: Of course, here is …" (ytc_UgxIaEc2g…)
- "Shad sure has some audacity to talk about "feeling threatened", when his entire …" (ytc_Ugxady3QN…)
Comment
I hate when these two discuss AI. They don't have any real background in technology or what these things can actually do. Krystal is convinced that companies want to use it to replace all workers. Nonsense. If that happens then there are NO CONSUMERS to buy your products. Elon constantly overestimates how close things are. Full Self Driving isn't as far as he promised years later. We're not going to Mars in the next decade. A robot plumber able to navigate under my sink or climb into my attic and fix pipes on its own? That's not something anyone will see in the next 50 years. These things take much longer than anyone thinks. Terminator scenario is just the most ridiculous nonsense I've ever heard.
Platform: youtube · Video: Viral AI Reaction · Posted: 2025-11-04T20:3… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyG3SJTxFrJaKTxRrd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwp_SvxNwza7gUy9z14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugze9wkbN5kWYTliKOh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy8oSy741h9-XUsNlZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxuTArInSimDwKdSsZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxeYiDRG-tMX3HunIl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgydApB1wgTxsIGAt014AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgyWzNL4__1qYlSZHSp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy9weXUCfH7FQE6xup4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxmPM2FdwVJSoFOcOB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
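The raw response above is a flat JSON array, so the "look up by comment ID" step reduces to parsing it and indexing on the `id` field. A minimal sketch in Python, assuming exactly that response shape (the helper name `lookup_coding` is hypothetical; the two entries are excerpted from the response above):

```python
import json

# Raw LLM batch response: a JSON array of per-comment codings across four
# dimensions (responsibility, reasoning, policy, emotion). Two entries are
# excerpted here from the full ten-item response.
raw_response = """[
  {"id":"ytc_UgyG3SJTxFrJaKTxRrd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgydApB1wgTxsIGAt014AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw batch response and return the coding dict for one comment ID,
    or None if the model did not code that comment."""
    by_id = {item["id"]: item for item in json.loads(raw)}
    return by_id.get(comment_id)

coding = lookup_coding(raw_response, "ytc_UgydApB1wgTxsIGAt014AaABAg")
```

For the second excerpted entry this returns the same dimensions shown in the Coding Result table (company / consequentialist / industry_self / outrage); an unknown ID returns None rather than raising.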