Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Yeah, I'm sure Turkey has many active cases with deepfake, the difference is Kor…
ytr_UgzwJnMp-…
Secondary question, how far off do we think the singularity is? 5 years? 10?
On…
rdc_j42tdj9
17:23 Y-You don't think AI made the script for the ads? L-Like they know that i…
ytc_UgxswfEfJ…
When i go to ibis paint x i get ai art ads like what??? You interrupted me makin…
ytc_Ugz7cOxK2…
I’d rather be ass at art(which I am) then use ai and judge people who actually s…
ytc_UgxD6ZSVD…
I don't know about secret AI's, but these large language models like chatgpt and…
ytc_Ugxmc9E9q…
You guys are going to end up just having to adapt to the new paradigm.
Musicia…
ytc_Ugwvtymf0…
@group555_ I don’t think an AI can emotionally feel anything about art at least …
ytr_Ugzdx2RSq…
Comment
I'm a dev myself, for 30 years now, and I'm using ChatGPT (3.5) for creating some code, especially stuff that is annoying. But it has a lot of problems: a lot of the output isn't really working or has major logic problems. Maybe in 5-10 years; we are gonna see. IMO the biggest problem is to create a good frontend with the working backend behind it. ChatGPT has no clue how to create a good UI.
Well, a lot of devs have a problem with that. The key is watching your customers using it. Talk with them.
Or as I say, "if you make something idiot-proof, someone will build a better idiot."
youtube
AI Jobs
2024-01-14T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugz-fo82PtqDlyXUdC54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzngCRoKQzoqxBBULV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy9fPpL9vGODYhQSPN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwXQ4waBsF5PBrxGVF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzFP3kDDiAkYzXtdJl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgznDKPbLo3r6PGoJB54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw_srfSfyiDCXPfjmx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxZxhI-T_xiBZFubZN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgymMMChTbY3VZTi15t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwqZ8iNIokba5HAaTh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
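The batch response above can be loaded and keyed by comment ID for lookup, matching the "Look up by comment ID" feature. A minimal sketch, assuming the raw response is valid JSON as shown (`index_by_id` is a hypothetical helper, and the sample here is truncated to two records from the array above):

```python
import json

# Two records copied from the raw LLM response above.
raw = '''[
{"id":"ytc_Ugz-fo82PtqDlyXUdC54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwqZ8iNIokba5HAaTh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]'''

def index_by_id(response_text: str) -> dict:
    """Parse a batch coding response and key each record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw)
print(codes["ytc_UgwqZ8iNIokba5HAaTh4AaABAg"]["emotion"])  # resignation
```

In practice the model may return malformed JSON, so a production loader would want to catch `json.JSONDecodeError` and flag the batch for re-coding rather than crash.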