Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "Not at all, we already have full self driving cars that even pull over for polic…" (ytr_Ugx0yEpS-…)
- "I dont know if you ever read "Player Piano", but it was written in the 1950's an…" (ytc_Ugy-wXMOK…)
- "also ive started losing friendships over chatgpt. people feed every perceived sl…" (ytc_UgxJHMTTn…)
- "The fact that google doesn't label AI images is truly stupid if for no other rea…" (ytc_UgzMq9tm4…)
- "This specific problem could be fixed by running an LLM offline, local LLM. Still…" (ytc_UgxkC8mlk…)
- "Is there someone in this world who continues to ignore that Harvard is not the l…" (ytc_UgzM41L-S…)
- "WTF... you're in a specialize field - AI is not going to be able to perform dent…" (ytr_UgzHqHhSh…)
- "Interesting that there is no mention of the important difference between Autopil…" (ytc_UgwqmTPbR…)
Comment
Posing a question....will there be enough water on this earth and electricity to run a Super AI? The Almighty saw the Tower of Babel "singularity" and then made it a race to the bottom for those who tried.
Amazing what an intelligent, properly functioning " brain " of an individual can do with a relatively small amount of water and ATP powered energy.
Data Centers show a striking comparison in stark detail. How many billions of gallons of water, miles of structures and nuclear powered energy edifices are necessary for this feat of "genius" to occur?
Source: youtube · Topic: AI Governance · Posted: 2025-09-20T23:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz75BRy43mxZOrXEu54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx3HQwFppCl1j1Zha14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgweCBjWGgPoWFNaOyR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugye7H2e2tZZ8uxGzx54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx2sBiRspZRqpAvUMp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwoX5GrLveiF9UZPPt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxgcE43MvydC4pYifV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw22vRnyjYiGy9ZDGp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx3yjrnX4DHTwySuV14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgweAu6DIW3Xa7xP5QZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
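A raw response like the one above can be checked before it is accepted into the coding table. The sketch below is a minimal validator, assuming the per-dimension vocabularies are exactly the values seen in the table and response above (the tool's real schema may allow more); the function name `validate_codes` is ours, not the tool's.

```python
import json

# Allowed values per coding dimension — inferred from the sample output
# above; treat this as an assumption, not the tool's authoritative schema.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "mixed", "approval", "indifference", "outrage"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject any row with an unknown value."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

# Hypothetical one-row response, same shape as the output above.
raw = '[{"id":"ytc_x","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
print(len(validate_codes(raw)))  # → 1
```

Rejecting malformed rows at parse time keeps hallucinated labels out of the aggregated counts rather than silently coercing them.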