Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Random samples (select any entry to inspect):

- "Then again, a person who's not being told they're sacred will not develop to a c…" (ytc_UgwLkExtZ…)
- "@Summatradb if there is no difference why pretend you drew it when you didn't? …" (ytr_UgyUPmmO2…)
- "First off that's not fair the weight might be the same but the robot literally t…" (ytc_UgwtSagS0…)
- "Remember though, execs don't see the difference between an AI and a sloppy emplo…" (ytc_UgxnhQ1jq…)
- "If people have no work, they can't buy anything, and that would destroy the econ…" (ytc_UgzhkGC6i…)
- "Elon Musk is wrong for the first time in forever. He is wrong. I actually don’t …" (ytc_UgzyR5S0Z…)
- "Fantastic follow up. Your original video bugged me a bit, but i saw plenty of co…" (ytc_UgwPZJUbZ…)
- "I like that sharp-elbowed capatlist moxy of Falcon-78, that A.I is going places …" (ytc_Ugyq94my2…)
Comment
When I was at university about 10 years ago (I studied Biology with a focus on Agricultural Science) we were told regularly that the next world war would very likely be over food and water. The food production system is on the point of collapse, and that's without considering climate change that increases the probability for failed harvestts each passing year. At any one point, we only have 6 months worth of food in reserve, all it takes is ONE global failed harvest to obliterate civilisation as we know it. AI is very scary, but I think our physical needs for food and its multiple stressers (climate change, antibiotic resistance, soil degradation/desertfication, stressed water supplies, and biodiversity loss) are very often ignored as the most serious threat to our species. Without food, we'd all be dead very quickly. It's frustrating that we're inventing problems like AI when the two biggest threats to mankind, starvation and disease, are for the most part totally ignored.
| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Governance |
| Posted | 2025-09-09T07:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwHAWWt603wMsz49P14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxKCcgwg0crUzvPf9V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwnhYx7T9SB5nO9jIx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwGPOgrhad2JKd0WxB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxuvj5CHm9sKFN1UPd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxHUqdnVdX-m4vffYh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz3cBBudNf9kK9lgPd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyfOx7V3EgteZJJax54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwJq5mRNcIkLHrZoZd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwLRxvOdNAa82UVd2x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
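A raw response of this shape (a JSON array of per-comment codings) can be indexed by comment ID for lookup. This is a minimal sketch, not the tool's actual implementation; the `index_codings` helper and the single-entry sample string are hypothetical, though the field names match the coding dimensions shown above.

```python
import json

# Hypothetical single-entry sample in the same shape as the raw response above.
raw_response = """
[
  {"id": "ytc_UgwJq5mRNcIkLHrZoZd4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]
"""

# The four coding dimensions from the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID,
    skipping any entry that is missing a coding dimension."""
    entries = json.loads(raw)
    return {
        e["id"]: {dim: e[dim] for dim in DIMENSIONS}
        for e in entries
        if all(dim in e for dim in DIMENSIONS)
    }

codings = index_codings(raw_response)
print(codings["ytc_UgwJq5mRNcIkLHrZoZd4AaABAg"]["emotion"])  # fear
```

Validating that every expected dimension is present before indexing guards against partially formed model output, which is a common failure mode when an LLM emits JSON.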