Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "do deep fakes to the male CEOs, chaebols police officers and politicians. and wa…" (ytc_UgwN4kfH0…)
- "A hypothesized future sapient robot being offended by our treatment of current l…" (ytc_UgxaNtoZk…)
- "why can’t truck drivers buy an autonomous truck or 2 and negotiate directly with…" (ytc_UgzuZYCe-…)
- "Lawmakers really need to completely ban this. Imagine if they deepfake a persons…" (ytc_UgwtMJ1UM…)
- "Thanks to everyone who participated in the discussion. Some how I now look forwa…" (ytc_UgyZI4HkA…)
- "If a car is advanced enough to sense the cars nearby and vice versa, I think it …" (ytc_Ugh_9XnDJ…)
- "Honestly I think the world would be better if you need a permit to use ai…" (ytc_Ugx9rwGfu…)
- "Every man did that which was right in his own eyes... Unfortunately, our adminis…" (ytc_UgyXdHsAW…)
Comment
> Will coding exist in 10 years? 15? 20? At some point code will be an inscrutable internal property opaque to human understanding within systems functioning without human input. The question is when will we reach that point. Returns are compounding and accelerating. How will we be able to keep up with that change as humans? The value of most software tech is circular in that its used to create itself. Once AI can create what we need without humans, that ecosystem collapses. In fact, the entire economy collapses. I find it hard to predict beyond a year or two.

Source: youtube
Posted: 2025-03-12T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugw5kIsLp0cAeCUDhcF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyxp2v7e5MCow-4ZG94AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzQxiZRuA2kT6Wwtvl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyqpSdteBwQz70wLRl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzTmCu_MHqRNPKzqNh4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzYBXsjAI_sl41bX8p4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy2h3cx0FJdNaRSzwZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugwn4tELYck-Xqgs-wB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwlbSTznMowbzM4XHR4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxdvrBllkDQ8Xq-qD14AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "outrage"}
]
```
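A raw response like the one above can be turned into a per-comment lookup with a few lines of Python. This is a minimal sketch, not the tool's actual implementation: the variable names are illustrative, and the two entries shown are copied from the response above.

```python
import json

# Raw LLM response: a JSON array of per-comment codes across four
# dimensions (responsibility, reasoning, policy, emotion).
# Truncated to two entries from the example above for brevity.
raw = """[
  {"id": "ytc_Ugy2h3cx0FJdNaRSzwZ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxdvrBllkDQ8Xq-qD14AaABAg", "responsibility": "company",
   "reasoning": "mixed", "policy": "none", "emotion": "outrage"}
]"""

# Index the rows by comment ID so a single comment's codes can be
# retrieved directly ("look up by comment ID").
codes = {row["id"]: row for row in json.loads(raw)}

row = codes["ytc_Ugy2h3cx0FJdNaRSzwZ4AaABAg"]
print(row["reasoning"], row["emotion"])  # consequentialist fear
```

The printed entry matches the Coding Result table above: the comment coded `consequentialist` / `fear` is the one whose dimensions are displayed.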