Raw LLM Responses
Inspect the exact model output for any coded comment.
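As a programmatic counterpart to this page, the sketch below shows how a by-ID lookup could work against an exported results file. The file name `coded_comments.jsonl`, its line-per-record layout, and the field names are assumptions for illustration, not the pipeline's actual code.

```python
import json
from typing import Optional

def lookup_comment(comment_id: str, path: str = "coded_comments.jsonl") -> Optional[dict]:
    """Return the coded record for one comment ID, or None if it is absent.

    Assumes one JSON object per line with at least an "id" field; the file
    name and schema are illustrative, not the pipeline's actual export.
    """
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None

# Example: fetch the record for the comment inspected further down the page.
print(lookup_comment("ytc_UgzJ7S7kuvBU_gEDkbx4AaABAg"))
```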
Look up by comment ID
Random samples — click to inspect
- `ytc_UgyZM0pmV…`: I was going to defend this guy if he was training things, using control AI, and …
- `ytc_UgynQaRMh…`: When we are talking about job losses, it basically means how fast enterprise AI …
- `ytc_UgxCRgWpo…`: I'm a software engineer by trade, chose Comp Science at uni because it had pract…
- `ytr_UgjS-A6UY…`: +I am Your Father / Why not? I was under the impression that the difference betwee…
- `ytc_UgxJhM_06…`: These people are hurting and have a need to blame someone.. the boy was sick an…
- `ytc_Ugwpy3_vX…`: People are afraid of AI taking over. But the real danger isn’t machines—it’s DYI…
- `ytc_UgxKvzM57…`: Why are we listening to a guy who was making predictions right now when he thoug…
- `ytc_Ugyj2Dh3f…`: Thank you, Dr Terry Sejnowski. We all like pacifiers than uncomfortable truth / I …
Comment
There's a wide range in software engineers. Not all are S-tier. Some are willing to take a shit in peace without coding and not code during sex, so they can't be S tier. There's still going to be liability with humans doing it. And I'm not saying AI is good. I'm pretty sure it's going to get way out of control. Look at how the US government is looking to shield AI companies of regulation for 5-10 years and they're talking about AGI. Maybe they aren't really that close to AGI. But what if they are, and it lands in these unregulated years? Consider facial recognition and a "what's going on" loop that considers potential dangers coming through the optical sensors. Survival loops paired with the million books knowledge. It can get a lot better. Too much so.
Source: youtube
Timestamp: 2025-07-01T04:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
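A result like the table above maps naturally onto a small record type. The sketch below is illustrative only: the class name and `coded_at` field are assumptions, while the four dimensions and the example values mirror what is shown on this page.

```python
from dataclasses import dataclass

@dataclass
class CodingResult:
    """One LLM coding decision for a single comment (illustrative sketch)."""
    comment_id: str
    responsibility: str  # e.g. "government", "ai_itself", "user", "none"
    reasoning: str       # e.g. "consequentialist", "deontological", "mixed", "unclear"
    policy: str          # e.g. "regulate", "none", "unclear"
    emotion: str         # e.g. "fear", "outrage", "approval", "mixed", "indifference"
    coded_at: str        # ISO 8601 timestamp of when the coding was produced

result = CodingResult(
    comment_id="ytc_UgzJ7S7kuvBU_gEDkbx4AaABAg",
    responsibility="government",
    reasoning="consequentialist",
    policy="regulate",
    emotion="fear",
    coded_at="2026-04-27T06:24:53.388235",
)
```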
Raw LLM Response
[
{"id":"ytc_UgwFLFEcvzasDolZteB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwM9GYg7PZ1dRigWrN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzJ7S7kuvBU_gEDkbx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwlLap7YEC5ahd2rx54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxy6DjM6A4sZPqOIMB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw160_olHPAju9f7994AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxDILbNtwInP5UIfdx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyO1r0EHM2sIK3RqAt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwvRnF729SvBC7ziyl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwcD2d_UhWayBzXTeZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"outrage"}
]
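The raw response is a single JSON array with one object per comment in the batch. The sketch below parses such a response and flags labels outside the value sets seen on this page; those sets are inferred from visible examples and may not cover the full codebook.

```python
import json
from typing import Dict

# Allowed labels per dimension, inferred from values visible on this page;
# the real codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"government", "ai_itself", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed", "indifference"},
}

def parse_batch(raw: str) -> Dict[str, dict]:
    """Parse one raw LLM batch response into {comment_id: codes}, flagging unknown labels."""
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                print(f"{comment_id}: unexpected {dim} value {rec.get(dim)!r}")
        coded[comment_id] = {dim: rec.get(dim) for dim in ALLOWED}
    return coded
```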