Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples:

- "Robot death squads are the inevitable future of California, the technology to en…" (ytc_Ugy0Aw43Z…)
- "Imagine if they were out of control on the stage and ultimately we end up destro…" (ytc_Ugy2Qyf0r…)
- "Shouldn't artificial intelligence only be served for industrial jobs and robotic…" (ytc_Ugxlw4rDe…)
- "Shallow psychologic and ethics descriptions made by people that have no idea of …" (ytc_UgzgvwysC…)
- "These people creating the AI robots try to make a joke by programing them to tal…" (ytc_Ugxfco0ji…)
- "Adding “please” to an ai prompt (according to chatgpt) costs chat gpt £60 Milli…" (ytc_UgzRL-h2N…)
- "This is why self driving in its current form will never be viable. It's the lega…" (ytc_Ugz2EZU9R…)
- "Plus literally the only reason an AI would know to use that term is because it’s…" (rdc_kgpp4w4)
Comment (youtube, 2025-09-10T17:5…)

> she echoes the same sentiments emily bender and alex hanna do in their book "the ai con" but I like Karen's wording way more. much more straight forward, where as emily and alex will give computer science exmaples that i get kinda lost in. great talk
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugwj4c2RhCiOiarrCM14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxxJ4xi9x2tv_1PG3x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy-ULbsSAnfaVGDGo94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwVF9_yBdUVSFhF1b94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx2IV5iTIAD0e-RYRh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw5TXAd4b7woFFrntV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwGRwKMfCR0QPSuo4p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxAlehNecZlyYroQPt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyT2euUWINCH0JZxkV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyFQm3lOtDc_VrxzzJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
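The model returns the coded dimensions (responsibility, reasoning, policy, emotion) as a flat JSON array, one object per comment ID. Below is a minimal sketch of how such a response might be parsed and sanity-checked before the results are stored; the function name and the required-key set are assumptions for illustration, not this project's actual code.

```python
import json

# Keys every coded record is expected to carry, inferred from the
# sample response above (an assumption, not an exhaustive codebook).
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_llm_response(raw: str) -> list[dict]:
    """Parse the raw model output and verify each record has all coded dimensions."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded records")
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing keys {sorted(missing)}")
    return records


raw = (
    '[{"id":"ytc_example","responsibility":"none",'
    '"reasoning":"unclear","policy":"none","emotion":"approval"}]'
)
records = parse_llm_response(raw)
print(records[0]["emotion"])  # approval
```

A check like this catches the common failure mode where the model drops a dimension or wraps the array in extra prose, before any partial record reaches the results table.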