Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "1:17:28 this is utterly ridiculous. Dean has the crazy opinion that there's only…" (ytc_UgwV5WUhE…)
- "OK. It's bad enough adding fluoride to tap water and having bromine compounds i…" (ytc_UgxDvUzvf…)
- "As an AI engineer I will say now. AI wether you like it or not DOES have creativ…" (ytc_Ugz2Ueeie…)
- "Ai though can replace thinking itself. Some technology and automation is good, …" (ytr_Ugz3T1Tjk…)
- "We already couldn't regulate the internet / We already couldn't regulate the horrible [social] networks…" (translated from French) (ytc_UgzS0YA_M…)
- "Do you really think the greedy AI folk will share the wealth that AI will genera…" (ytc_UgwlBwnFW…)
- "Can you speak on the rumors of using AI poison (watermarks) are negatively effec…" (ytc_UgxSFqMEq…)
- "Waited in vain for the part where they chased driverless trucks in Texas and saw…" (ytc_UgxqV8F1i…)
Comment

> Guys, we need AGI to build humanoid robots ASAP so they can help us set up colonies on the Moon and Mars. And there’s really nothing to worry about since governments already control the internet and can limit traffic speed, block certain servers or shut it down whenever they want..

youtube · Cross-Cultural · 2025-11-16T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz0yCImuESpQP4x6Bd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw0LAZksbgEzOtbTFl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyuM3iWucntHYBW94d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugzn7Skx_5AVIwrgXXN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxrJsGh0tdnYE4Xxb94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxVd1QYCLuIx5V2sO94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxqLRasuGHpovpgZ6N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyJpcBZoFwbVk9q2z54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzdFlqWte21A8J13Ut4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzzeyMnkCBScBi5gLR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
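The raw response is a JSON array of per-comment codes across the four dimensions shown in the table above. A minimal sketch of parsing and validating such a response, assuming the allowed values per dimension are those seen in the examples on this page (the real codebook may include more categories):

```python
import json

# Allowed values per dimension — inferred from the samples on this page,
# not a confirmed schema.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"approval", "outrage", "fear", "resignation", "indifference"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, dropping rows
    with missing IDs or values outside the allowed sets."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not isinstance(cid, str) or not cid:
            continue
        codes = {dim: row.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded

raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]'
print(parse_codes(raw)["ytc_x"]["emotion"])  # prints "indifference"
```

Rows that fail validation are silently dropped here; a production coder would more likely log them for re-prompting, since a lost row means an uncoded comment.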