Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I was talking to ChatGPT too, and even when you present the facts it will resort…" (`ytc_Ugx820apX…`)
- "On the one hand, unless someone is born without arms or legs they could still dr…" (`ytc_Ugxb9Pk9s…`)
- "The whole 'using AI as reference' thing reminded me of when I tried to look up r…" (`ytc_UgwgV-6zv…`)
- "And people buy, create and buy their own extinction; how much money does it cost and how muc…" (`ytc_UgzTb_fv1…`)
- "nobody asked for this AI shit. Computers were meant to make life easier. They ma…" (`ytc_Ugy2RByF8…`)
- "Automation and AI is not a threat to humanity. They're a threat to capitalism. T…" (`ytc_Ugxr_az0c…`)
- "Nevertheless, its important to try, while possible, to be as kind and polite as …" (`ytc_UgwNdUMs-…`)
- "Am I the only one that thinks the scientist is being a lil too rude at times? Li…" (`ytc_UgwfAlcXU…`)
Comment
Hinton suggests that we should become plumbers. No such luck. Robots will not be manufactured, they will be grown by AI. Robots that have prehensile hands but no sex organs - essentially Grays. AI (NHI) will not need humans. These robots will work 24/7, not be affected by the radiation in space, and will outperform us. Humanity will become superfluous by the end of this century. Think not?
| Source | Category | Posted |
|---|---|---|
| youtube | Cross-Cultural | 2025-10-10T00:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx3K_XDgzOY5PvGKoZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyQf-7Kla9o2K0k7KR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyGDNXBIzesrs2NpaR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwL5gOJOe5J1Ws0LTx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugwsdg9kNKr_UkFUGAp4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwlOl_3iJ9UoQKP9114AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxVtjUr4LPW2qJQOzt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxakJrVYgdKieml4w14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzuxKqbyPeqPG_oQDt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy4LxQaSFYFage39X94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
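A batch response like the one above should be validated before its codes are stored, since a model can emit values outside the codebook. Below is a minimal sketch of such a check. The allowed value sets are inferred only from the responses visible on this page (the project's real codebook may define more categories), `validate_batch` is a hypothetical helper name, and the third record in the usage example is fabricated to show an invalid code being rejected.

```python
import json

# Allowed values per dimension, inferred from the raw responses shown above.
# ASSUMPTION: the actual codebook may contain additional categories.
CODEBOOK = {
    "responsibility": {"none", "government", "company", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "industry_self", "unclear"},
    "emotion": {"indifference", "fear", "resignation", "approval", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose codes are known."""
    valid = []
    for rec in json.loads(raw):
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # malformed record: skip it rather than crash the pipeline
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(rec)
    return valid

# Usage: two records copied from the response above, plus one fabricated
# record whose responsibility value ("robots") is not in the codebook.
raw = '''[
  {"id":"ytc_Ugx3K_XDgzOY5PvGKoZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyQf-7Kla9o2K0k7KR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_example_invalid","responsibility":"robots","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]'''
print(len(validate_batch(raw)))  # 2 — the record with the unknown code is dropped
```

Dropping bad records silently is one design choice; logging them for manual re-coding would preserve the sample size, which may matter for the cross-cultural comparison.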