Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- rdc_ohyjd2r: "What are you talking about? China is a formidable military force, at home. They…"
- ytc_Ugxd2UBQA…: "I absolutely believe self-driving cars should have two emergency buttons or opti…"
- ytc_UgxlrDSB4…: "**Nearly 90% of H1-b's are ENTRY OR JUNIOR LEVEL JOBS. Cheap Indians are replaci…"
- ytc_UgyLO3dGG…: "Its a fkn prediction model that can produce text that is concerted to audio... W…"
- ytc_UgzodTQXq…: "Ironic this video is about dangers of AI, but seems AI was heavily used in the m…"
- ytc_Ugwfs9-Q9…: "It should be required that all videos made with AI should require some sort of t…"
- ytc_UgjNsVEAg…: "In order for AI to feel anything it would require a physical platform built with…"
- ytc_Ugz1T6-lY…: "He is seriously underestimating the advancement capability. Before the end of t…"
Comment
> Do you really think LLMs can get us to AGI, which would be the only kind of outcome where the tech "thinks" for itself. Right now we are a long way away from

Source: youtube · AI Jobs · 2025-06-23T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugzgi-llRCvIik2BKy54AaABAg.AJhaL30Pxv9AJktCL6bBHO","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugzgi-llRCvIik2BKy54AaABAg.AJhaL30Pxv9AJlKKS86ZO3","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxrCbCh1cXD9vbtYy14AaABAg.AJhSFRf4bYjAJivxOGgYho","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugw7kB6ZD7V-eYRo1CF4AaABAg.AJhE6GIAq4DAJjGEHQ-Thq","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytr_Ugw7kB6ZD7V-eYRo1CF4AaABAg.AJhE6GIAq4DAJv-Knm19Ob","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxcN8H2BImy9W7RTY94AaABAg.AJhBief4FHQAJjrl4_C2kY","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_UgwdIYFxt9jJMbnEDVl4AaABAg.AJh7pBOx5o9AJhC83x3HeZ","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgwdIYFxt9jJMbnEDVl4AaABAg.AJh7pBOx5o9AJhVvN5OQqN","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytr_UgwdIYFxt9jJMbnEDVl4AaABAg.AJh7pBOx5o9AJi4bmOcX2M","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytr_UgxBmPS9ecigoRNPBmh4AaABAg.AJh1zVH7s-hAJh8PowmHRk","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
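A raw response like the one above is a JSON array of coded rows, one per comment ID. A minimal sketch of parsing and sanity-checking such a response before storing it, assuming the allowed value sets are exactly those seen on this page (the real codebook may define more; the sample string and its ID are hypothetical):

```python
import json

# Dimension values observed in the samples on this page; this is an
# assumption, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "liability", "ban", "industry_self"},
    "emotion": {"approval", "fear", "resignation", "outrage", "indifference"},
}

def parse_response(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only rows whose dimension
    values all fall inside the allowed sets."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Hypothetical one-row response for illustration.
raw = ('[{"id":"ytr_example123","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(parse_response(raw))  # the single valid row survives
```

Rows with out-of-set values (e.g. a hallucinated emotion label) are silently dropped here; a production coder would more likely log them and re-prompt the model.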