Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Faster and efficient is good tho. That's what we have worked for for so many yea…
ytr_Ugxhl0Rn_…
With regards to the question of why nobody is protesting or screaming about it i…
ytc_UgxXI1M4a…
I’m less concerned about the limitations of AI, as I am about the intentions of …
ytc_Ugy60iPR7…
We get it, oscar AI is creative, you're just pushing buttons, you made the promp…
ytc_UgwHJMBCn…
Translation:
* Burn rate is higher than revenue
* We need to reduce cost whi…
rdc_m29xhut
Just media after 2h need to stop for a cup of coffee or recharge, but everything…
ytc_UgweMvXm8…
I gotta be honest, I draw things sometimes and if somebody told me “hey we’re go…
ytc_UgzGv8dyY…
I'm glad AI got people to care about the insanely high professional artist turno…
ytc_UgzvAL6Vz…
Comment
On the other hand, Simon predicted that AI would be a grand master in chess in 10 years. That was in the sixties. I remember showing Simon a book that had these diagrams for numerical pattern matching, and he told me that he had forgotten mostly about all that stuff. (I was still fairly young and scared at that time.) So, they went down the garden path of pure symbolic reasoning. AI has bumped along garden paths. They used to like conceptual dependency. Nicely symbolic, not very much like a neural net. Now we build grand power sucking buildings full of GPUs for NLP. Is that another garden path? (They used to like to talk about garden path sentences, when talking about NLU language parsers.)
But, here's a question. If most humans use an area of the brain that processes emotions in order to retrieve the meaning of words, can we say that AI understands natural language when it has yet to be able to experience emotion? Have we put the cart before the horse and also thrown huge amounts of money at it?
youtube
AI Responsibility
2025-10-10T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_Ugx5OQ0O8QRbuGHa8ah4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxKi4aBJY_nYllzuhl4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgywNy9Xp1JDRTBuVSZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugxq_0hgQqdj90Kwk_x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx_jxM_Ft0vg1l1Zy54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzY-fPpL0IhJF3JvOZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyCULL9e8icF3lcu9t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyBpfBI9tHoMGClmCJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugyir6Nufrutl-JtmJJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwn9GkPqsmooflF_Jx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}]
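The "Look up by comment ID" step above can be sketched as follows: parse the raw JSON array the model returns, check that each record carries all four coding dimensions, and index records by comment ID. This is a minimal sketch, not the dashboard's actual implementation; the field names come from the JSON shown above, while `index_by_id` and `DIMENSIONS` are illustrative names, and the two-entry `raw` string is a truncated stand-in for the full response.

```python
import json

# Raw batch response as emitted by the coding model (truncated to two
# entries here; the real response covers every sampled comment).
raw = """[
 {"id":"ytc_Ugx5OQ0O8QRbuGHa8ah4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgxKi4aBJY_nYllzuhl4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"}
]"""

# The four coding dimensions shown in the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict:
    """Parse the model output and key each coded record by comment ID."""
    records = json.loads(raw_json)
    indexed = {}
    for rec in records:
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{rec.get('id')}: missing dimensions {missing}")
        indexed[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return indexed

codes = index_by_id(raw)
print(codes["ytc_UgxKi4aBJY_nYllzuhl4AaABAg"]["policy"])  # industry_self
```

A `ValueError` on a missing dimension makes a malformed model response fail loudly at parse time rather than surfacing later as a blank cell in the results table.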