Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
One day I'll make a robot to kill myself, and then it will be so depressed about…
ytc_Uggl3gbXV…
Polygon should redo this video with the ai image generators we have today :D it …
ytc_UgxlSS-Tj…
chatgpt still wont admit it. it said it's the earlier version of the model that …
ytc_Ugx0czxZz…
Some counterpoints:
1. AI and robotics will make goods and services much cheaper…
ytc_UgzVSwECQ…
The thing is that artist haven't seen the potential in AI, I am a software engin…
ytc_UgzofsZ-E…
lol you would never give your toaster that lvl of ai. you give the robot the lv…
ytc_UggNgQ_Hy…
sorry but john "dishwasher" ai would prefer spending more time doing dishes and …
ytr_UgwWkbi7a…
There are more than a dozen reasons why Neil DeGrasse Tyson is overrated - but h…
ytc_Ugydq_5tI…
Comment
AGi deploys and the economy changes.
The people receive guaranteed basic income and a fake Capitalist economy continues to govern business. Inflation will push up prices, but it can be controlled by the Ai.
After a while, people start to enjoy being on permanent vacation. Humans start to study art and music because the AGi has all of the job stuff handled. It might not be so bad. We will struggle to find new things to do to occupy our time.
I think the ones who are worried are the billionaires. How will their status be protected if AGi is calling all the shots?
youtube
AI Responsibility
2025-11-09T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UgxbWvRGfkZatVs3qH54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzqYMRSr0zUaKSl6pd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwHM5XPogJyxrpEqDV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy7rTRDYCUWbNhEErB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwxjPmYLy8SKwuFYr94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxd0CfnFQ0Ckkdclm54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzHYm8leLXD42JlzSh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugzs57Z2aBPm0zevect4AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxXBdK2R_h4xkc7o914AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwTLfOiAxNwJ0LgH6Z4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}]
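The raw response is a JSON array of per-comment code objects, so "look up by comment ID" reduces to parsing the array and building an ID-keyed index. A minimal sketch in Python, using two entries copied from the response above (the field names — responsibility, reasoning, policy, emotion — are taken from that sample and may differ under other coding schemes):

```python
import json

# Two entries excerpted from the raw coding response shown above.
raw = '''[
 {"id": "ytc_UgxbWvRGfkZatVs3qH54AaABAg", "responsibility": "distributed",
  "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
 {"id": "ytc_UgwTLfOiAxNwJ0LgH6Z4AaABAg", "responsibility": "ai_itself",
  "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]'''

codes = json.loads(raw)              # list of per-comment code dicts
by_id = {c["id"]: c for c in codes}  # index for lookup by comment ID

print(by_id["ytc_UgwTLfOiAxNwJ0LgH6Z4AaABAg"]["emotion"])  # -> mixed
```

If the model's output is not guaranteed to be valid JSON (truncation or a stray closing character are common failure modes), wrapping `json.loads` in a `try`/`except` and flagging the batch for re-coding is safer than trusting the parse.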