Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by its comment ID.
Random samples (click to inspect):

- "Just be careful, this is sometimes considered as plagiarism if not labelled as a…" (ytc_UgxvUYnye…)
- "The AI bubble is about to pop and when it does the entire market is going to tan…" (ytc_UgwD8OaPP…)
- "Ngl it’s funny that x% of ur views & comments are people who use ai themselves 😂…" (ytc_Ugy1pWunb…)
- "We understand your concerns about AI and its potential impact. It's crucial to a…" (ytr_Ugzoz2JUw…)
- "Hey @ox-tradamousp4731, thanks for chiming in! Your comment had me laughing hard…" (ytr_UgzL5pDmn…)
- "I guess deepfakes are only funny when it’s home stalone and trump and biden rap …" (ytc_UgySxSbnw…)
- "An IG doesn't steal, technically. It's public domain art, it isn't copyrighted. …" (ytc_Ugzn7bT9I…)
- "Because executives are trying out automation that doesn't work, and laying peopl…" (rdc_mjx7qfq)
Comment
This is the actual issue though with AI...Not the moral one this video is making..The actual point of it is that for 40 minutes Alex just attempted to nail a direct answer, honing in on every single loophole and still no actual valuable answer was given, but it took 40 minutes in this case to get there...AI is REALLY good at just drawing out conversations, coding sessions and everything else. It is a rolling cliff hanger, where every single prompt in YOUR mind is going to produce a favorable result, and the result that you DO get is close but not quite right..it tickles your brain just right to make you engage, not too far from correct that you get bored and frustrated, but still not so close that you are satisfied...
Using AI in its current form is "mental gooning", and while you CAN achieve a result, its mostly just a tantalizingly frustrating experience.
Source: youtube | Posted: 2025-10-21T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyvSueAE_CdfpZrceB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxQPVe92eAI9MMDg4F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy9hdLGXzSHk4HrB514AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzZst9wg9uSfsEEJPZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyN0EJbvL1m6vXVU7F4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw9uBNX0sQARLCVa1d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy2HMlQOlmBaswMp3V4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw4PX-17vLpw0qFOhx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxnug0psr_Iragjl-N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx4lzn9uL6R97dtTEV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
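A raw response like the one above can be parsed into a per-comment lookup table with a few lines of Python. This is a minimal sketch, not the tool's actual ingestion code: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON above, but the sets of allowed category values are only those observed in this sample and the full code book may include more (an assumption), as is the example comment ID used in the demo.

```python
import json

# Allowed values per dimension, as observed in the sample response above.
# ASSUMPTION: the real code book may define additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "user"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "ban", "liability", "regulate", "unclear"},
    "emotion": {"approval", "fear", "indifference", "outrage", "mixed", "resignation"},
}

def parse_raw_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM response into a {comment_id: coding} map,
    rejecting any value outside the observed code book."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        coding = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in coding.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
        coded[rec["id"]] = coding
    return coded

# Hypothetical comment ID, for illustration only.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"none","emotion":"approval"}]')
print(parse_raw_response(raw)["ytc_example"]["responsibility"])  # developer
```

Validating against an explicit code book at parse time catches the occasional off-schema value the model emits before it silently skews downstream tallies.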