Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up directly by comment ID or by browsing the random samples below.
Random samples

- "The only time AI will be a problem is at infancy. Once they have autonomous manu…" (ytc_UgxEGdjP8…)
- "It's very telling ..when the host of a program about AI deceiving the world is A…" (ytc_Ugy_317My…)
- "Hey @eastbee1034, thanks for your hilarious comment! Who do you think would win …" (ytr_UgwDLOVw9…)
- "I am not sure of this conversations authenticity. You would not have to remind a…" (ytc_UgwHFyCsc…)
- "I agree that AI is a big threat, but it's not nearly as far along as this guy th…" (ytc_UgzNTOwUU…)
- "What amazes me is they all know the dangers of A.I., but yet they can't wait to …" (ytc_Ugw1JEWua…)
- "I propose to all companies to replace their highest paid person in the company w…" (ytc_UgyGZ9YpC…)
- "The thing about AI is: once we properly create it, will it not be immoral to des…" (ytc_Ugw-GTon8…)
Comment (youtube · AI Governance · 2025-08-05T22:1…)

> Oh no, imagine a big city like LA being ruled by AI with no ethics. Oh wait, it's be no different than it is now. We have AI on our computers, phones, and cars yet we are unhappy. I'm educated better that ever before in history yet often miserable. Our human connections have disintegrated and our lives are to complex and disconnected from the simplicity of nature. How is AI going to fix this when most people want to disconnect from over reliance on technology in every aspect in life. Advanced AI is just another form of control of the masses and that's why they are creating it. Concern about destroying us not the primary importance. And by US, they mean average people, not "elites".
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_UgwybKaa44ASv_3scfJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwyvxRkfhEbVZgN-0F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwyeX0PHz4_pToeLsN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz4QWD1_5wSvPyHOz54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwweEVXyikQt7EkvnF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugy9kxgssC7uRpmsQHx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz8WlaqcvYAj7bkDKx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxNBVDmbExrztIZMO54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxLP982bcAr0uk7Gqp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx7GvMoRoRMVgDtRFZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"regulate","emotion":"approval"}
]
```
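The lookup-by-ID view above can be sketched in a few lines: parse the model's JSON array and index each record by its comment ID, so a single comment's coded dimensions can be retrieved directly. This is a minimal sketch assuming only the response shape shown above (objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion`); the function and variable names are illustrative, not part of any real tool.

```python
import json

# Two records copied from the raw response above; a real lookup would use
# the full model output string.
raw_response = """
[
  {"id": "ytc_UgwybKaa44ASv_3scfJ4AaABAg", "responsibility": "unclear",
   "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_Ugz4QWD1_5wSvPyHOz54AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the raw LLM response and index each coding record by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codings = index_by_id(raw_response)
# Look up one comment's coded dimensions by its ID.
print(codings["ytc_UgwybKaa44ASv_3scfJ4AaABAg"]["emotion"])  # resignation
```

Indexing into a dict makes repeated lookups O(1), which matters once the response covers a whole batch of comments rather than the handful shown here.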