Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coded comment by its ID, or pick one of the random samples below to inspect it.

Random samples:

- I didn't know that Trump's Big Beautiful Bill would protect AI from new rules or… (ytc_UgzjGsOII…)
- Robots won't help us, they won't teach humans and artificial Intelligence will n… (ytc_Ugw_-VAHB…)
- See. Here's my thing, man. I am of the belief that we actually DON'T need to b… (ytc_Ugy4oyu7E…)
- Tesla should "stop mis-selling" their cars and kills people on the road and pret… (ytc_UgzI29JHI…)
- This is wrong in so many levels wow im shocked i truly am... Where is their free… (ytc_UgwTsEBjA…)
- I see where you're coming from! The interaction might seem a bit scripted, but i… (ytr_Ugwc1CLLo…)
- I'm sorry, but Shad sounds so pretentious and self important. While he can draw … (ytc_Ugxtiakxk…)
- 2040: / AI: found cure for cancer / Me: Can't afford the cure because I am unemploy… (ytc_Ugyos7YvR…)
Comment
It's not about what is or isn't, but about perception. Right now the perception is that AI will replace software engineers. It's what a lot of higher ups at a lot of companies believe. Now I think this will create a mess but their attitude is that is "Someone else's problem" or "That's next quarter's problem". Secondly, because of the demand for speed in release of software AI is going to be heavily used. There is a kick to only hire senior devs, and that is going to become a serious problem for the future, but again it's "That's someone else's problem" attitude. And with AI people will start to generalize more and it would be difficult as the years go by to find someone who knows what is going on.
Source: youtube · Posted: 2025-03-15T01:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
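
Each of the four dimensions is categorical. Below is a minimal sketch, in Python, of validating one coded record; the allowed-value sets are inferred from the codings visible on this page, not taken from an official codebook, and the function name is illustrative:

```python
# Candidate value sets per dimension, inferred from the coded examples shown
# on this page; the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none"},
    "emotion": {"fear", "indifference", "approval", "mixed", "skepticism"},
}

def invalid_dimensions(record: dict) -> list[str]:
    """Return the names of dimensions whose value falls outside ALLOWED."""
    return [dim for dim, values in ALLOWED.items()
            if record.get(dim) not in values]

# The coding result shown in the table above.
coded = {
    "responsibility": "company",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "fear",
}
assert invalid_dimensions(coded) == []
```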
Raw LLM Response
[
{"id":"ytc_Ugzwy1_mQKQXlYESIcV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwUeDysggdaEIIT60x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyCr3e7PBe4fdZtfh54AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy_vtvhmE8Bv-SwjJV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwLsf1jQ0Sn-MZU3q54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzaAXQSUfzcSSEKAD14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzHabuon0W9seo9JcZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzktF-yYWNVGk6YzRN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzTzP5mkQY8g8WegcR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"skepticism"},
{"id":"ytc_UgzYMvw3i8T8Fv3JJ4B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
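
A sketch of how a batch response like the one above can be parsed and indexed to support the by-ID lookup. The `raw_response` string reproduces only the first record, and the variable names are illustrative:

```python
import json

# Illustrative stand-in for the raw model output above (first record only).
raw_response = """[
  {"id": "ytc_Ugzwy1_mQKQXlYESIcV4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]"""

records = json.loads(raw_response)

# Index the batch by comment ID so any coded comment can be retrieved directly.
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytc_Ugzwy1_mQKQXlYESIcV4AaABAg"]
print(rec["responsibility"], rec["emotion"])  # company fear
```

Indexing by ID also makes it straightforward to detect records the model dropped or duplicated when coding comments in batches.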