Raw LLM Responses
Inspect the exact model output for any coded comment: look one up by comment ID, or browse the random samples below.
- `ytc_UgzQBAXwG…`: "Tesla's Full Self-Driving is now called Full Self-Driving Supervised. This syste…"
- `ytc_UgxkG6xp7…`: "*_Where are all of the EBT recipients going to get jobs???_* 🤔🤔🤔🤔 *_Production i…"
- `ytr_UgzPezXkK…`: "We appreciate your humor! While the robot in the video might not be into lace fr…"
- `ytc_UgybEjy0O…`: "AI won't be the thing that ends us. Our own stupidity will be. Climate change, c…"
- `ytc_Ugx9PKpdT…`: "Some people don't need talent to use AI, but I've spent hours and days on my mus…"
- `ytr_UgzuhiM4y…`: "If you ask llm a leading question it will give you the answer it 'thinks' you w…"
- `ytc_Ugyu5_oY_…`: "Sorry guys design is dead, ai is here to change everything better faster and alm…"
- `ytr_UgxUfU-x6…`: "@BrendanDell It can, in fact I'd say automating C-suite task are easier than a l…"
Comment (youtube, 2025-06-07T19:3…)

> My one question is alway this: these AI systems definitely need good amount of energy. So right now VC's are pumping their own money because they feel this is gonna change the world but will it be as cheap as it is today always. Would'nt there be a day where we will run out of energy or the cost of keeping this humongous AI system will be so much that these are not sustainable anymore. I feel economics of keeping these AI systems is not discussed enough
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugz-WNKxz5yxZO5ufzh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyn_r3e1GTRp6akSR54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxpxXhOw8Gbz1XfTlJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzMZjYqmrY3TtzZFth4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwFSMbAHybT81g8TcF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxq4inAcuj1NxsHCCJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyOAJdWka7a56YHTFV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzi0acRg3dn_r6AgQd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyiefXddm_bKng38_d4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwQaXN-Yr6ec3LbMbF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}]
```
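The Coding Result table is just one element of this JSON array selected by its comment ID. A minimal sketch of that lookup, assuming the model returns well-formed JSON; the `raw_response` excerpt and `lookup` helper are illustrative, not the dashboard's actual code:

```python
import json

# Excerpt of a raw model response like the one above: a JSON array
# with one coding object per comment.
raw_response = '''[
  {"id": "ytc_UgzMZjYqmrY3TtzZFth4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwQaXN-Yr6ec3LbMbF4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

# Index the codings by comment ID so any comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment ID."""
    return codings[comment_id]

coded = lookup("ytc_UgzMZjYqmrY3TtzZFth4AaABAg")
print(coded["emotion"])  # fear
```

In practice a real pipeline would also handle malformed model output (truncated arrays, stray prose around the JSON) before indexing.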