Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:
- "I think it’s really funny how all the ai art stans post their ‘art’ online and m…" (ytc_UgyF99wSX…)
- "I'm only an hour into the lesson, but I'm clueless as to how this could be relat…" (ytr_Ugx284WJq…)
- "Sophia hanson robot genovese boss in jamaica wants your skill to develop company…" (ytc_UgzP314xB…)
- "😂 so lets put a guy who do t know a single shit about a job and he will just pro…" (ytc_UgyWolRet…)
- "AI: Give me control of your nuclear weapons and I will repond first to any attac…" (ytr_Ugxyk5h9M…)
- "when the term clanker first popped up i genuinely thought it was hilarious and e…" (ytc_Ugzy-mWip…)
- "Very apt sir. What is micro conditioning in AI? Is that a term ? Heard somewhere…" (ytc_UgyqZU8PP…)
- "This isnt a telegram problem, this is an AI problem. AI such as this should neve…" (ytc_UgxPCaAiw…)
Comment

> We are already spending more money on AI than what it would take to help end lack of food, housing, and healthcare, for the homeless and very poor. The developers are willing to spend more to house, maintain and sustain energy to more and more data centers that make AL possible, it looks like humans are already not as improtant than the rush to achieve super AI. It is already important to find a balance between techological advances and empathy for humans. I don't know where the future will go but unless we start taking care of each other in a balanced system, it won't matter.

youtube · AI Governance · 2025-08-24T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgyIyX6moK1lnoMJ6p94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy8emJK3wAA1zncbFp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwMRJiSrFMhAYOS7t54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwCo9_agtb3HUCVkpZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzkJktRzFMEeEiNNXN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugya5wbf7dVNBsQc6aN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxywmazJsBNYVyxQkR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgztYIoiVHMNXTwmZuF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwzptrOKFlGHrHAgdd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwgfLDos1LSfj7eOql4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
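A raw response like the one above can be parsed into an id-keyed index, which is what a "look up by comment ID" view needs. The sketch below is a minimal Python example, not the tool's actual implementation; the allowed values per dimension are inferred only from the responses shown on this page, and the real codebook may define more categories.

```python
import json

# Category sets inferred from the responses on this page (assumption:
# the real codebook may include additional values).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"liability", "regulate", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "indifference", "resignation"},
}

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded records) into an
    id -> coding mapping, rejecting records with unknown dimension values."""
    index = {}
    for record in json.loads(raw_response):
        comment_id = record["id"]
        for dim, allowed in ALLOWED.items():
            if record.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim}={record.get(dim)!r}")
        index[comment_id] = {dim: record[dim] for dim in ALLOWED}
    return index

# Hypothetical one-record response for illustration.
raw = ('[{"id":"ytc_X","responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"outrage"}]')
codings = index_codings(raw)
print(codings["ytc_X"]["emotion"])  # outrage
```

Validating against a fixed category set at parse time catches the common failure mode where the model invents an off-codebook label, so bad records surface immediately instead of silently skewing downstream counts.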