# Raw LLM Responses

Inspect the exact model output for any coded comment, either by looking up a comment ID or by browsing the random samples below.

Random samples:
- "I agree and it is painful. I have to double and triple check the output from the…" (ytc_UgxyVovPy…)
- "Eventually global suicide is gunna be a big thing.i know that's how I'm going an…" (ytc_UgwXDPjtr…)
- "Sad. How are people going to eat, survive with no job? Since the government is…" (ytc_UgygTjnGS…)
- "AI can increase productivity enormously. Just because there are poorly implement…" (ytc_UgxEgO7f-…)
- "The Turing test is the wrong way around. If the AI can figure out whether it's t…" (ytc_UgzsTFlMc…)
- "AI doesn't buy products, if every industry gets rid of workers, they'll have no …" (ytr_Ugz7h6vHC…)
- "Someone needs to make all the AI. Maybe everyone will have to do that for emplo…" (ytc_UgwAtQSAd…)
- "I wish I have a robot who cooked for me, can do housework, gardening and so on.…" (ytc_Ugwt5qT-x…)
## Comment

> He doesn't get it. This is not another industrial revolution that will require skill training. Imagine asking an AI on your tablet to make a company to earn a billion dollars and it does. Everyone might have a robot AI humanoid that does work for them and then cleans and cooks. Their will be zero, zero, zero demand for work from humans. Ai build me my dream girl, here she is. Demand for human social interaction will plummet when you can have a clone. Now how do you maintain control of that intelligence? If there is new job training it will likely be military in nature.

Source: youtube · Topic: AI Governance · Posted: 2023-05-20T23:3…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
## Raw LLM Response

```json
[
{"id":"ytc_Ugy-_GMVIKU9NGn-JGh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxa24_DzYeOHkzSXxB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwp33yGozRVtkdhF1d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxGlkV-Lu8QXLsnhj14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw2FBjGgbJ5Awmhnqd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxwKG9g82EiytocWFR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx5sV5aFuwB4ikUxW14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwdoZaN_JxTY41lsb14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwceJdyXy1cEFRjyQJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugw1oWeMqwowgIrKuyJ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"}
]
```
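The raw response above is a JSON array of per-comment codings, one object per comment ID with four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and validated before storage — the allowed value sets below are only those observed in this sample, and the real codebook may define more categories:

```python
import json

# Vocabulary observed in the sample response above; the actual codebook
# may allow additional values (assumption for illustration).
ALLOWED = {
    "responsibility": {"none", "government", "user", "company",
                       "ai_itself", "distributed", "developer"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "resignation"},
}

def parse_codings(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM response into {comment_id: dimensions},
    skipping rows with a missing ID or out-of-vocabulary values."""
    out = {}
    for row in json.loads(raw):
        cid = row.get("id")
        dims = {k: row.get(k) for k in ALLOWED}
        if cid and all(dims[k] in ALLOWED[k] for k in ALLOWED):
            out[cid] = dims
    return out

raw = ('[{"id":"ytc_Ugy-_GMVIKU9NGn-JGh4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
coded = parse_codings(raw)
print(coded["ytc_Ugy-_GMVIKU9NGn-JGh4AaABAg"]["emotion"])  # indifference
```

Skipping invalid rows rather than raising keeps one malformed coding from discarding an entire batch; rejected IDs can then be re-queued for recoding.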