Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `rdc_gtcu11a`: That's an awesome video, but I think it makes it all pretty clear. At a minimum …
- `ytc_UgwoBcvqY…`: Part of how Tesla claims minimal crashes is because they internally define a "cr…
- `ytc_Ugwhw34_a…`: If AI reaches that level without us realizing that.. this will be the problem. I…
- `ytc_UgyqHpsPB…`: well if AI analise USA and Isreal there is 90% chance that it will a…
- `ytc_UgxKiXfRU…`: AI is coded by humans, learning from and modeling after humans, ending up amoral…
- `ytc_UgyMAThoA…`: A Big part of the problem are that liberal leftists have their hands on the codi…
- `ytc_Ugy9D1PkV…`: 1:29:17 what if we find out that we can't co-exist happily with AI? Why don't yo…
- `ytc_UgyXUFRkY…`: I think that I robot it’s the perfect reason of why we shouldn’t let robots lear…
Comment
We are the ones who will make this profitable by being complicit.
So, think long and hard before you use any AI.
Alexa, Siri, Grok, Chatgtp etc.
Ask automated systems on phones impossible questions so you get to a human. Don't use the AI chat box that companies offer.
Stop using Google for your searchs.
Source: youtube
2025-09-04T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy_wPFIiI6VuN_ufqJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyBz-G5bGgMfwO31kh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwUEWzB-rYdxfNyFuZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwYSc9SXqM9yCwv5XB4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwWaKiQ80BgJIh4_pB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwPLjlfH5_TPjd3hOl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxxg33bJq3O4VyrniV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx4s3MGasu2KgSFNkJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzPvquSZ9vqNScg_VN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzpyYecQJvHOyjm_xx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
```
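The raw response above is a JSON array of per-comment codings across the four dimensions in the table (responsibility, reasoning, policy, emotion). A minimal Python sketch of parsing and validating such a batch is shown below; the allowed category values are assumptions inferred from the sample output, not a confirmed codebook, and `parse_coding` is a hypothetical helper name.

```python
import json

# Assumed category vocabularies, inferred from the sample rows above
# (not an official codebook for this tool).
ALLOWED = {
    "responsibility": {"user", "developer", "company", "government",
                       "ai_itself", "unclear"},
    "reasoning": {"virtue", "consequentialist", "deontological",
                  "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference",
                "resignation"},
}

def parse_coding(raw: str) -> dict:
    """Map comment ID -> coded dimensions, keeping only rows whose
    values all fall inside the assumed vocabularies."""
    coded = {}
    for row in json.loads(raw):
        dims = {k: row.get(k) for k in ALLOWED}
        if all(dims[k] in ALLOWED[k] for k in ALLOWED):
            coded[row["id"]] = dims
    return coded

# Hypothetical single-row example in the same shape as the raw response.
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"virtue","policy":"none","emotion":"outrage"}]')
result = parse_coding(raw)
```

Dropping (rather than raising on) out-of-vocabulary rows is one possible design choice here; a stricter pipeline might log them for manual review instead.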