Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
Is this AI-generated? BBC usually says what they want and expects others to 100…
ytc_Ugz_pUlSI…
You give us too much credit believing we will come up with a General AI that wil…
ytc_Ugx5aXz5A…
Well, facial recognition is pretty bad since it can't tell my girls apart, but w…
ytc_UgwnCL-qR…
"AI has no values" is the most braindead, ahistorical and anti-intellectual stat…
ytc_Ugwk8Of9T…
Predictive policing? Was this sherriff watching tom cruise movies and got 3 woma…
ytc_UgxtkMWTX…
A mother's cooking is better than any food in the world, and I'm glad ai knows t…
ytc_UgzY18eVH…
They’ll be walking around with us soon and we won’t have a clue. Perhaps they al…
ytc_UgwK90EWY…
Tesla Full Self Driving are very dangerous.
They will kill people...
Politicians…
ytc_UgxsyVuEt…
Comment
I work for a large tech company. Our devs have not been “officially” allowed to write all of their code using AI.. but that’s def happening behind the scenes I’m sure they are using it via whatever AI they subscribe. I work as a PM over the devs.. corporate is not sure how to proceed yet, they let them QA their code but no writing at least officially from our leadership at least. I think it’s coming once they figure out the risks and how to move forward we will see very few devs and only a handful that are super devs that just use ai tools to generate the code moving forward.. if I went to college for computer science I would be rethinking my choices for sure.
youtube
AI Responsibility
2025-10-28T09:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | unclear |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwFO9lyJU9xERuq88F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyRqw0bdFrmHh86Mx54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwaqZ6CcAOu5T3Gqup4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw2iOwMH0Zt3DvjCZt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyIi5YgUtWyMTG_IJR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz-TwkFx8pobRAq13Z4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxSyqy8IBt1RX9RxLB4AaABAg","responsibility":"company","reasoning":"unclear","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxgvfDpaKoXykV9yKd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwI-_ICq5rWMeGlp-d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyUuOQipuQL02EU6ad4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
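A response in this shape (a JSON array of per-comment codes) can be parsed and indexed by comment ID, which is what the "Look up by comment ID" view does. A minimal sketch, assuming only the field names visible in the response above; the `index_by_id` helper and the embedded sample records are illustrative, not part of the tool:

```python
import json

# Raw LLM response: a JSON array of per-comment codes, matching the
# shape shown above (illustrative subset of the records).
raw_response = '''
[
  {"id": "ytc_UgxSyqy8IBt1RX9RxLB4AaABAg", "responsibility": "company",
   "reasoning": "unclear", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgwFO9lyJU9xERuq88F4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
'''

def index_by_id(response_text: str) -> dict[str, dict]:
    """Parse a coded batch and index each record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_id(raw_response)
print(codes["ytc_UgxSyqy8IBt1RX9RxLB4AaABAg"]["emotion"])  # indifference
```

Indexing by `id` makes each lookup O(1) and mirrors the detail view above, where the dimension table (Responsibility, Reasoning, Policy, Emotion) is populated from the record matching the inspected comment's ID.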