Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "The Godfather of AI that led the team at Google and recently stepped down said t…" — ytc_UgwYs9whn…
- "Intelligence is more a of a subconscious level factor, if an ai develops it, I d…" — ytr_UgxXOJFDC…
- "@laurentiuvladutmanea sure thing if you really believe that, the fact that you a…" — ytr_UgzRN21e5…
- "This mechanical robot seems really encouraged to know what is out there and of c…" — ytc_UghKg5Rji…
- "Ez fix Jimmy, the car automatically calls you to come to the vehicle to grab it …" — ytc_UgisYzGRb…
- "Air gap AI from robots until we figure this out. This is analogous to the testin…" — ytc_UgyCfe2Gb…
- "They're about to send a quarter trillion to Israel...... Whatever piggy bank you…" — ytc_UgxsQ0Xhs…
- "Btw, I heard you can add \":before 2022\" to limit the search to the time before A…" — ytr_Ugw9DFZ1-…
Comment
This does not have to happen at all it would be good to help companies and not kill every job in the world. AI is just one thing then there's robots millions of them. Elon Musk was asked a question how about how many robots are going to kill humans. The answer to his question was 15 to 20% of them. In my opinion that is not a good thing out of millions and millions of robots. If you do the math of 1 million robots that would be 200,000 to 150,000. You have to watch some of Elon Musk videos I think elan is about the only one in this gentleman telling the truth about AI everybody else is gaslighting everybody and lying to us
youtube
AI Governance
2025-12-16T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
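A coding result like the one above can be sanity-checked against the dimension vocabularies. The sketch below is illustrative only: the allowed-value sets are assumptions inferred from the codes visible in this view, not the project's full codebook, and `validate` is a hypothetical helper, not part of the app.

```python
# Hypothetical sketch: check one coding result against assumed dimension
# vocabularies (inferred from the values displayed in this view).
ALLOWED = {
    "responsibility": {"none", "company", "government", "user", "elite liberals"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}

def validate(row: dict) -> list:
    """Return the dimension names whose value falls outside the vocabulary."""
    return [dim for dim, allowed in ALLOWED.items() if row.get(dim) not in allowed]

result = {"responsibility": "company", "reasoning": "consequentialist",
          "policy": "regulate", "emotion": "fear"}
print(validate(result))  # [] — all four dimensions are in-vocabulary
```

An empty list means every coded dimension is recognized; unknown or missing values surface by name.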
Raw LLM Response
```json
[
  {"id":"ytc_Ugyb-d7j-KtOn1TDBfV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxMq4QZJ_uQWpPRvjF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxHNnsXGOMTU37Uva94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxEtmYX5eY_W0WydlZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxo81r5WxHxJSXWOgN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxDQOu83CoaHbZk64d4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwEjtOVA3d8Db72DWV4AaABAg","responsibility":"elite liberals","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy7wFxa5ko3nGK3HGR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw2OETra_yyzNzM9Vd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxgJicFXG-XHXRpkA54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"}
]
```
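The lookup-by-comment-ID feature described at the top of this view can be sketched as follows. This is a minimal sketch, not the app's actual code: the raw response format and field names match the JSON above, but the `lookup` helper and the two-row sample payload are illustrative.

```python
import json

# Hypothetical sketch: index a raw LLM response (a JSON array of per-comment
# codes, as shown above) by comment ID so one coded comment can be inspected.
raw_response = """
[
 {"id":"ytc_UgxEtmYX5eY_W0WydlZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgxDQOu83CoaHbZk64d4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
"""

codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment; raises KeyError if absent."""
    return codes_by_id[comment_id]

print(lookup("ytc_UgxEtmYX5eY_W0WydlZ4AaABAg")["emotion"])  # fear
```

Building the index once and doing dictionary lookups keeps per-comment inspection O(1) even for large coding batches.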