Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Not to be that guy we are cooked if we keep hating on AI low key we don’t want a…
ytc_UgykDDMCI…
Stupid, all she did was regurgitate the same stuff hundreds of other researchers…
ytc_UgxyDACw5…
@damnjustassignmeone Not watching, just commenting on the absurd idea, No wonder…
ytr_Ugy53wpwb…
We need to really, really re assess the value of work. Work = $, but if AI is do…
ytc_Ugxg_e54D…
Tractors dehumanise farming.
Emails dehumaise the postal service
The internet de…
ytc_Ugxirp4XY…
AI may be the end of humanity but I don't think it would be in their best intere…
ytc_UgzUKGoRD…
id honestly kill to see an AI bro try and make an AI art tutorial, because it'd …
ytc_UgyF3GQFA…
You completely whiffed. Is the man who's using a hammer to build a house less hu…
ytc_UgyKOyVQx…
Comment
The biggest problem isn't even AI itself, it's that companies are seemingly going all in on it without properly evaluating how it works in their use-case. Being an early adopter of anything carries a lot of risk. There are no established best practices, the understanding of the strengths and limitations aren't yet mature, and the technology is still evolving and changing. There are going to be things it excels at, and there are going to be things it sucks at, and it is your job to determine which category your use-case falls in and how to best apply it. Instead we are seeing companies blindly deploying it as much as possible and as quickly as possible, which says to me this is almost certainly a bubble.
youtube
AI Responsibility
2025-09-30T17:0…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwbx7AhAW9MwAi_Sn94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx6IGNjRepUSN1ybId4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyTz-YrG2tUHkB8XP14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx78ceBJk-GWsUk80V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxfwZjfKtm5W9P6MfB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyAg3I3MAqovDcXhYp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz25TgPwmMpUXNSnup4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxR3uVYJb5O9SzDyXF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxJaavegEW428aVcdV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwZ4FrSsRcr05RBicx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
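The "look up by comment ID" step can be sketched in a few lines: parse the raw model response (a JSON array of per-comment codes, as shown above) and index it by `id`. The two rows below are copied from the response above; field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are assumed to match the full output.

```python
import json

# Raw LLM response: a JSON array of coding rows, one per comment.
# (Two rows copied verbatim from the response shown above.)
raw_response = '''[
  {"id":"ytc_UgxJaavegEW428aVcdV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwZ4FrSsRcr05RBicx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

# Index the coded rows by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up the comment shown in the Coding Result table above.
code = codes_by_id["ytc_UgwZ4FrSsRcr05RBicx4AaABAg"]
print(code["policy"], code["emotion"])  # → regulate fear
```

Indexing by ID is what lets the dashboard map each row of model output back to the original comment card, even when the model returns the batch in arbitrary order.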