Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgwUMNRSe…`: "All are stupid the robots perform from the code insert robots has no emotion if …"
- `ytc_UgzhENLOZ…`: "There is also a large number of people that believe ghosts and angels exist. The…"
- `rdc_oh85ak1`: "Who are these virtual influencers and fictional AI characters? Have I seen them?…"
- `ytc_Ugzn6jBOX…`: "Just get into a mouth fight with one. Thank goodness they don't have bodies yet.…"
- `ytc_UgyJMhmp7…`: "i would literally rather watch a goldfish try to paint the mona lisa using digit…"
- `ytc_Ugwa4nUGn…`: "If ai is not humans. How it wants to take over the world? So, the problem might …"
- `ytc_Ugx2ExhbQ…`: "I am deeply sceptical about Geoffrey’s claims on AI consciousness and his theory…"
- `ytc_Ugy0w_PvX…`: "i work for myself building/renovating homes, a.i can help by smelling my balls w…"
Comment

> I'm not overly worried yet, Just now I was using ChatGPT 4 to get some input on the render System in my ECS. When I asked it how it would implement a hypothetical function (get all entities with a given component type), it basically gave me the naive approach. I then suggested a rather obvious optimization, and it all of sudden agreed that this was far superior than it's suggested solution. I then gave it an alternative and also optimized way to solve the problem, and it liked that as well. Where it then was useful was to ask it to summarize both optimizations and considerations in pros and cons for both and what would align the best with my overall goal.

Source: youtube · AI Jobs · 2024-01-14T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
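Each coded comment carries the same four dimensions shown above. A minimal sketch of checking one coded row against a codebook; the allowed value sets below are inferred only from the values visible on this page, not from the authoritative coding scheme:

```python
# Allowed values per dimension. These sets are an assumption inferred
# from the samples visible on this page; the real codebook may differ.
CODEBOOK = {
    "responsibility": {"ai_itself", "company", "developer", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"liability", "none"},
    "emotion": {"approval", "outrage", "indifference", "mixed",
                "resignation", "fear"},
}

def invalid_dimensions(row: dict) -> list:
    """Return the dimension names whose value is missing or outside the codebook."""
    return [dim for dim, allowed in CODEBOOK.items()
            if row.get(dim) not in allowed]

# The coding result from the table above passes the check.
row = {"responsibility": "ai_itself", "reasoning": "mixed",
       "policy": "none", "emotion": "approval"}
```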
Raw LLM Response
```json
[
  {"id":"ytc_Ugz-fo82PtqDlyXUdC54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzngCRoKQzoqxBBULV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy9fPpL9vGODYhQSPN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwXQ4waBsF5PBrxGVF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzFP3kDDiAkYzXtdJl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgznDKPbLo3r6PGoJB54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugw_srfSfyiDCXPfjmx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxZxhI-T_xiBZFubZN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgymMMChTbY3VZTi15t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwqZ8iNIokba5HAaTh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
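The raw response is a plain JSON array with one object per comment, so batch summaries reduce to a parse-and-count. A minimal sketch of tallying one dimension across such a response, assuming the model returned valid JSON; the two rows in `raw` are hypothetical placeholders, not real comment IDs:

```python
import json
from collections import Counter

# Hypothetical two-row batch response in the same shape as the real output.
raw = '''[
  {"id":"ytc_A","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_B","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]'''

def tally(raw_response: str, dimension: str) -> Counter:
    """Count the values of one coding dimension across a batch LLM response."""
    rows = json.loads(raw_response)
    return Counter(row[dimension] for row in rows)
```

The same helper works for any of the four dimensions, e.g. `tally(raw, "emotion")` or `tally(raw, "responsibility")`.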