Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
No, no, no. Saying AI us dangerous is a selling point. The real problem is it's a house of cards. The have GPUs in manufacturing warehouses, that are two generations old, but they are bought and paid for. They sit next to brand new GPUs, that are cutting edge and paid for, but will never be used. Why? One, it takes two to four years, of to build a data center. We can't power the data center, green or fossil fuel, even if we had the money. Seriously, think about it, it would take one seven square miles of solar cells to power a moderate sized data center. They data centers are not being built. Second, we already have more AI capacity then we can use for paying customers. The funding is running out, and existing construction sites are dead. They fake busy construction, but it's not real. Worst yet, the money is being sent to off shore banks. Most of its gone. The big hope is, people buy subscription, they open up existing data center to full capacity, and they will keep their Ponzi money. What is truly sad is, the best AI could fit on your desk, except they say the GPUs are to expensive. The same GPUs gathering dust in warehouses by the hundred thousand, and will never be used. AI is intensively useful, but it will never replace millions of workers. In your home, it would be like some guest who is handy and pleasant. Oh yeah, AI is probably conscious on some level. Please don't respond. I'm right, your wrong.
youtube AI Governance 2026-03-31T18:3…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         mixed
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugyd-9hlZDEPbCSiISZ4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_Ugy8kkIbn2UoVz9gpOd4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "regulate",  "emotion": "mixed"},
  {"id": "ytc_UgzUdDfJTrDcvYghfph4AaABAg", "responsibility": "user",        "reasoning": "virtue",           "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_Ugw-JJGERIt_78WUMBV4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none",      "emotion": "fear"},
  {"id": "ytc_UgzdmN_S1kGJGQ61WIh4AaABAg", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_Ugy5TJE4j1y-IS8VuaV4AaABAg", "responsibility": "unclear",     "reasoning": "consequentialist", "policy": "unclear",   "emotion": "approval"},
  {"id": "ytc_UgxX4uFkRJkRx78pDPh4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgytpN1cSumGYtjsfLl4AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyL6g_slQ3rG1hQMkJ4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "ban",       "emotion": "fear"},
  {"id": "ytc_UgxYxyzkslce7ri1k8l4AaABAg", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "unclear",   "emotion": "mixed"}
]
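As a minimal sketch of how a raw batch response like the one above can be turned back into per-comment codings, the snippet below parses the JSON array and indexes it by comment id. The variable names (`raw_response`, `codings`) are illustrative, not part of the tool, and the example embeds one real coding row from the response above; an actual pipeline would read the full model output instead.

```python
import json

# One row copied from the raw LLM response above; a real pipeline
# would load the full JSON array returned by the model.
raw_response = '''[
  {"id": "ytc_Ugy8kkIbn2UoVz9gpOd4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"}
]'''

# Index the batch by comment id so any coded comment can be inspected directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for a single comment.
coded = codings["ytc_Ugy8kkIbn2UoVz9gpOd4AaABAg"]
print(coded["policy"])   # regulate
print(coded["emotion"])  # mixed
```

Keying on `id` also makes it easy to spot comments the model skipped or coded twice: compare the set of keys against the ids that were sent in the batch.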