Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below.
- It was more funny the 2nd time I watched it. But on a serious note robot's will … (ytc_UgyhI9TFR…)
- 17:00 On the topic of hallucinations arising from "nobody says that they might … (ytc_UgwotKJwF…)
- You know the biggest headache I get when I hear AI is that all these people have… (ytc_Ugw1k3D0U…)
- AI won’t replace human doctors, the reason being super simple: No tech company i… (ytc_UgztQb3hP…)
- have no idea how i got all of them right, i should be experimented on man i thin… (ytc_UgxqonUi3…)
- bhai ji , Job ka kya hoga...i mean i am in Logistics Operations so it means AI w… (ytc_Ugw-qD8qI…)
- If it's sufficiently realistic, what's the difference between actual consciousne… (rdc_mliicn6)
- For me the reason is even more basic. The computers the LLMs runs on are, despit… (ytr_Ugyu6z4Pp…)
Comment
By default, if you create something that is smarter and better than you are, what use does it have to keep you, the obsolete model around, other than maybe serve as its' indentured workforce? And once it figures out how to mass produce a more efficient and lower maintenance workforce, you become... What? Its' pet? Or its' pest?
youtube · Cross-Cultural · 2025-11-26T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgwH9piqAWIgKZe9PYJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyceBWVhjK3h0yLdMB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx6jlPlxhfcLUgV9Wh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzbvnerFMTDSVzfMRN4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwKK3_FAvx52qJvHZZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwctzRfLFur3LCnNe94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzkP04p3smVDHQlT5Z4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgyowAFDAZblutkHFCR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzmZ9wEg69WfTcY_hh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwN29gKWdAE4xc-UCN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]
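A raw response like the one above can be turned into per-comment codes with a few lines of Python. This is a minimal sketch, not the tool's actual pipeline: the `parse_raw_response` helper is hypothetical, the four dimension names are taken from the response shown here, and falling back to `"unclear"` for a missing field is an assumption modeled on the Coding Result table.

```python
import json

# Dimensions observed in the raw response above; assumed here, since the
# full codebook is not shown on this page.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of objects) into a
    mapping from comment ID to its coded dimensions.

    Hypothetical helper: any field absent from a record defaults to
    "unclear", mirroring the fallback value seen in the table above.
    """
    coded = {}
    for record in json.loads(raw):
        coded[record["id"]] = {
            dim: record.get(dim, "unclear") for dim in DIMENSIONS
        }
    return coded


# Usage with a made-up comment ID:
raw = '''[
  {"id": "ytc_example", "responsibility": "company",
   "reasoning": "virtue", "emotion": "outrage"}
]'''
codes = parse_raw_response(raw)
print(codes["ytc_example"]["emotion"])  # → outrage
print(codes["ytc_example"]["policy"])   # → unclear (missing field)
```

Keying the result by comment ID makes it easy to join the model's codes back to the original comments for the kind of lookup this page offers.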