Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I asked an AI picture generator to make a self portrait, & it made a scary pictu…" (ytc_UgzSHUHfS…)
- "What happens if the plug is completely pulled on A.I.? Can we just stop with the…" (ytc_UgysvTF-n…)
- "My question is when did artificial intelligence start and I know I had to start …" (ytc_Ugw2FDvfH…)
- "Self driving cars are not programmed for every single scenario and roadway they …" (ytc_UgwgZBpAS…)
- "Yes, basically any publicly available information is considered "fair use" to be…" (ytr_Ugy-EFN0I…)
- "Great INTERVIEW!!! Learned so much from this interview that I had never heard! …" (ytc_UgzXXhVd_…)
- "AI will never be able to perfectly mimic the unhinged rants of people posting on…" (rdc_kd8a6cy)
- "ChatGPT just mimics but flat out makes up references and sources. And will lie …" (rdc_jmtu3pm)
Comment
Great video. I’d like to suggest a fourth reason: lack of true self-awareness. Because we’re self-aware we can adjust our behaviour based on stimuli. LLMs are truly terrible at this - and they also lack any kind of long term memory (since the models are, by definition, the result of digestion of data and not understanding of data). I believe that AGI requires a form of consciousness and that the current tech just isn’t capable of achieving that.
youtube
2025-12-29T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyrzCYQ_xfGOZjdETh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwNR0pu_6IxR3-PeXt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwoSI3nf3NnnEhrfo54AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugx9V45DbAOfZ6mTYQB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyZJZ177NJNbJEJ23V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxEuUPMgBbxGnYBQBB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw1Hh7h3dme67ZJcst4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzs8Dq5lZcO2ecfpFF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyECy1JrJdYgzogrut4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwYFNNcFnPBkw08JHx4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"unclear"}
]
```
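The raw response above is a JSON array, one object per coded comment, keyed by `id`. A minimal sketch of how such a response could be parsed into a per-ID lookup table, with validation against the category values visible in the samples above (the real codebook may define more categories than these, so the allowed sets here are assumptions):

```python
import json

# Allowed values per dimension, inferred from the coded samples shown above.
# This is an assumed, partial codebook, not the authoritative one.
CODEBOOK = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"unclear", "mixed", "consequentialist"},
    "policy": {"unclear", "none", "ban", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "fear"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of coded comments) into a
    dict keyed by comment ID, rejecting any out-of-codebook value."""
    codings = {}
    for item in json.loads(raw):
        cid = item["id"]
        for dim, allowed in CODEBOOK.items():
            if item.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {item.get(dim)!r}")
        codings[cid] = {dim: item[dim] for dim in CODEBOOK}
    return codings

# Usage with a hypothetical one-element response:
raw = ('[{"id":"ytc_X","responsibility":"none","reasoning":"mixed",'
       '"policy":"unclear","emotion":"approval"}]')
codings = parse_raw_response(raw)
print(codings["ytc_X"]["emotion"])  # approval
```

Keying by comment ID is what makes the "Look up by comment ID" view above cheap: one dict access per inspected comment rather than a scan of every raw response.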