Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
35-40 years ago, I stumbled across a book called something along the lines of "How to build a self-programming robot using C". It talked about stuff I didn't understand - neural nets etc - and I was fascinated. My cousin's husband was a programmer and scoffed at the idea, asserting that computers only do what you program them to do. A couple years later, I became a programmer (still am). I learned a bit about neural nets and came to the realization that he was kinda right, but not how he meant it: program it to learn and it will learn. The only limits to this kind of system are storage, availability of input data, and compute power. The AI giants have given these systems virtually unlimited amounts of each. They don't understand what's going on under the hood because the system isn't just code-driven, it's also data-driven and they're feeding it the sum of all human knowledge and understanding. Geniuses. I'm not an alarmist and never bought into the HAL, Skynet etc doomsday scenarios because those systems didn't just have intelligence, they had ambition and "feelings." I didn't think we could ever programmatically define those attributes. After viewing this video, I gotta say I'm getting a bit alarmed...
youtube AI Governance 2023-07-08T01:2… ♥ 1
Coding Result
Dimension        Value
Responsibility   none
Reasoning        unclear
Policy           none
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzxGmLDmHmoSGigIyZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyyztTtxit5rLb0J9J4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugw2aOChroYy1-Wanut4AaABAg", "responsibility": "government", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugw1hdxVznJTjzNpSet4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugz77d-6iTneEraeoA94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy_ZpDmgXIKaXOGk554AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgyFGIA82P_jbN2m9Yp4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzWbmXFqZrgin3Eh-N4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgykBNNmNYHksbCslsl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzWZ5ceHijMN7QiU1N4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "mixed"}
]
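A minimal sketch of how a response like the one above could be parsed and indexed by comment id. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the JSON shown; the `parse_codes` helper and the required-field check are illustrative assumptions, not part of the actual pipeline, and the two entries in `raw` are copied from the sample response.

```python
import json

# Two entries copied from the raw response above; in practice `raw`
# would be the model's full output string.
raw = '''[
  {"id": "ytc_UgzxGmLDmHmoSGigIyZ4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyyztTtxit5rLb0J9J4AaABAg",
   "responsibility": "user", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"}
]'''

# Every coded entry is expected to carry these four dimensions plus an id.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(text):
    """Parse a raw LLM coding response and return a dict keyed by comment id,
    dropping any entry that is missing a required field."""
    entries = json.loads(text)
    valid = [e for e in entries if REQUIRED_FIELDS <= e.keys()]
    return {e["id"]: e for e in valid}

codes = parse_codes(raw)
print(codes["ytc_UgyyztTtxit5rLb0J9J4AaABAg"]["emotion"])  # fear
```

Keying by `id` makes it easy to join the coded dimensions back onto the original comments, and the required-field filter guards against partially malformed model output.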