Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click any entry to inspect)

- "These Washington heads are absolute morons. They have done nothing to regulate t…" (`ytc_Ugw6mugid…`)
- "Please rewatch "2001: A Space Odyssey" - now we can watch it and see clearly wha…" (`ytc_UgxNaXN5o…`)
- "I can't hate that AI learns faster than we do, all art is a blank canvas, just l…" (`ytc_Ugyz33ChA…`)
- "Well, all hollywood movies that told humans vs ai stories always ended happily, …" (`ytc_UgxO6eJ_T…`)
- "*No one:
  Ai : 'Your art and creativity is mine now, and you have no choice…'…" (`ytr_UgyNdngrd…`)
- "Typical of pseudo intellectuals. She's an alarmist and a communist. They are aga…" (`ytc_Ugy4AMki8…`)
- "I've only just been experimenting with Midjourney and was surprised I could chur…" (`ytc_Ugz2xqk4k…`)
- "@chrism.1131 that's not a virus. I mean an actual AI virus that we have to use a…" (`ytr_UgyUBKQwR…`)
Comment

> My only caveat is that Mark says self driving while autopilot is not self driving. He should have used the most up to date tesla tech FSD which would give it a fair test against the latest Lidar.

youtube · AI Harm Incident · 2025-03-25T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UgzxN5G-HEPZtFefR3B4AaABAg.AM8Wl9suTNGAMGvUYC7pIv","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_Ugxx9qpGuLd2ykgLO_F4AaABAg.AM7sedQbyMIAM8MkMaCRmW","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugxx9qpGuLd2ykgLO_F4AaABAg.AM7sedQbyMIAM8U6B1Wrs-","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytr_UgzUCeUp19d7VNj4li54AaABAg.AM7r-yqK7sIAM9GSCPrC-L","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxmJOtMC0JNnttbqRB4AaABAg.AJhUbwUumESANr5nAD7yaE","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugxe2d2JPYqAUiHFe854AaABAg.AG5k4KeGE9sAPo3930YHdr","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxPwbPKWkG7J43GV-R4AaABAg.AFmxL8-SEYZAG5zd7Gx9wB","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytr_Ugz6fw_yPs5y9_v87Qp4AaABAg.ACRye-XycaqADHF9AAflQf","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugy-isrs7pM_0H3TYNN4AaABAg.AB52ZnmZE_tAE7YQyMZnHV","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgzFRMLJYKKj3rDmZOV4AaABAg.AVsbjQmcNf8AVsjey0y1xV","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
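The lookup-by-comment-ID workflow above can be sketched in a few lines of Python. This is a minimal illustration, not the tool's actual implementation: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the raw response shown above, while the `lookup` helper and its prefix-matching fallback (assumed because the page displays truncated IDs like `ytr_UgyUBKQwR…`) are hypothetical.

```python
import json

# Raw model output: a JSON array of per-comment coding records.
# One record from the response above, abridged for the example.
raw_response = """
[
  {"id": "ytr_UgxPwbPKWkG7J43GV-R4AaABAg.AFmxL8-SEYZAG5zd7Gx9wB",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "industry_self", "emotion": "mixed"}
]
"""

records = json.loads(raw_response)

# Index by full comment ID for exact lookups.
by_id = {rec["id"]: rec for rec in records}

def lookup(comment_id: str):
    """Find a coding record by full ID, falling back to a unique
    prefix match (hypothetical helper; the UI shows truncated IDs)."""
    if comment_id in by_id:
        return by_id[comment_id]
    matches = [r for r in records if r["id"].startswith(comment_id)]
    return matches[0] if len(matches) == 1 else None

rec = lookup("ytr_UgxPwbPKW")
print(rec["responsibility"], rec["policy"])  # company industry_self
```

With a record in hand, the four coded dimensions map directly onto the Coding Result table shown for each comment.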