Raw LLM Responses
Inspect the exact model output for any coded comment; entries can be looked up by comment ID.
Random samples (click to inspect)

- ytr_UgwKkTbxC…: Ai is stealing people’s art so artists are getting there art stolen and used by …
- ytc_UgznE8eTz…: image generators generate something that kinda looks like a dog if you squint at…
- ytc_Ugy9neK19…: crazy how we will watch this in classrooms in 200 years from now in 2225 talking…
- rdc_m5la6h9: Submission statement: The conspiratorial "[dead internet theory](https://en.wiki…
- ytc_UgxskXcdC…: Copilot save lots of time if you know how to use it. It's far from perfect but i…
- ytc_UgwK_NKcd…: I'm sure self driving cars are sub-standard BUT someone in dark clothes on a dar…
- ytc_UgznhtSPQ…: as long as AI isn't intelligent and therefore can't create anything itself I rea…
- ytc_Ugw0kf6IO…: Enjoyed this, Shelby! I visited San Fran last month and rode in a Waymo for the …
Comment
8:59 for another giant leap forward we need more computational power and the rate at which we are going is already unsustainable for our existence on this planet.
Current LLMs are already pushing our existing hardware to its limits and we can always cobble together more of them and say enthusiastically "we can do it!!" but using 10 men to do a crane's job is not really sustainable or practical now is it?
youtube · AI Responsibility · 2025-10-09T11:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
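The "Coding Result" card above is just one coded record rendered as a dimension/value table. A minimal sketch of that rendering step, assuming a record shaped like the JSON objects shown below (the helper name `to_table` is hypothetical, not part of the actual tool):

```python
def to_table(rec: dict) -> str:
    # Hypothetical helper mirroring the "Coding Result" card:
    # one markdown row per coding dimension, in display order.
    rows = ["| Dimension | Value |", "|---|---|"]
    labels = [
        ("Responsibility", "responsibility"),
        ("Reasoning", "reasoning"),
        ("Policy", "policy"),
        ("Emotion", "emotion"),
    ]
    for label, key in labels:
        rows.append(f"| {label} | {rec[key]} |")
    return "\n".join(rows)
```

Feeding it the first record from the raw response below reproduces the table shown in this card (minus the `Coded at` timestamp row, which is added at coding time).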
Raw LLM Response
[
{"id":"ytc_Ugz1ulVObLo5Xhbz3SR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzzUcdq3a1terGtASZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxJfTYuhvOkkLnAy-J4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyghzBSdDeI4iCJP0d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxYSgiBhIt6kTdBJqh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwRmMShOC72uyDiA8l4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzd-jEKd3XziUP2spJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxh4LeGFnacbZ87qIt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwoZoTxamBlWTTOMY94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyukMN_cEaSERNxIyl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]
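Before a batch like the one above is stored, each record needs an ID and a valid value for every dimension. A minimal validation sketch, assuming the category sets are exactly those observed in this sample (the real codebook may be larger, and `parse_batch` is a hypothetical name, not the tool's actual API):

```python
import json

# Allowed values as observed in this sample batch; treat these sets
# as an assumption about the codebook, not a definitive schema.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "liability"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "approval"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate every record."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing comment id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
    return records
```

Records that parse but carry an out-of-codebook value raise a `ValueError` naming the offending comment ID, which makes bad batches easy to flag for re-coding.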