Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “Currently, Circadian AI is not available for public download. It’s undergoing re…” (ytr_UgwrdGS9j…)
- “I spotted the flaw in auto driving vehicles right away, there's nobody DRIVING! …” (ytc_UgxqOTDgK…)
- “Any system or technology being designed is only as strong as it’s weakest link. …” (ytc_Ugx6CkLEY…)
- “So you blame the chatbot... instead of the gun your son shouldn't have been ab…” (ytc_Ugzw5IZVr…)
- “1:13 I wouldn't take advice on what AI can do from a guy who's using an Apple mo…” (ytc_UgwX409qP…)
- “‘Breaking news: Police officer goes on a drink-driving rampage in a stolen vehic…” (ytc_UgwdUpUzx…)
- “By the way the last one is correct if you think about it, since you gave the AI …” (ytc_Ugxv_fdUJ…)
- “Shelby - just Compare WAYMO $250,000 cars with STOCK Tesla model Y. let that si…” (ytc_Ugx2enphX…)
Comment
my current employer is nudging employees very strongly to let Google Gemini type their emails and anything else that you can let Google do. And they have recently insinuated pretty strongly that you’re going to be graded on an AI component as part of your performance evaluations annually. Meaning, if you don’t use the AI, obviously because they haven’t been getting people to do it as much as they want, then they’re gonna penalize you. NO NO NO. 🤡🤡🤡
youtube · AI Jobs · 2026-04-02T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy0tZVh18T6qYUewGN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzCpEn5m3eaqiiOP6J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwR-Kc6VIi3Cmreq3h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzSFVcSCo18EsOpI7Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyATA0jDDb77ipIpB54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyCfg6P6EhPolYtjLB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwAvly1i7PD3jmG7AJ4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgykKGNwFiyss0Xxcgp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw8S1kHIM2anskiDsp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_Ugz11w7qkLmFM3f-GRp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
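The lookup-by-ID view above amounts to parsing this JSON array and indexing the coded rows by their `id` field. A minimal sketch of that step, with one validity check per dimension — note the allowed value sets below are inferred only from the rows shown on this page, and the actual code book may define more categories:

```python
import json

# Allowed values per coding dimension, inferred from the rows above.
# Assumption: the real code book may include additional categories.
SCHEMA = {
    "responsibility": {"company", "developer", "ai_itself", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "ban", "industry_self"},
    "emotion": {"resignation", "outrage", "indifference", "fear", "approval"},
}

def index_response(raw: str) -> dict:
    """Parse a raw LLM response and index coded rows by comment ID,
    dropping any row whose value falls outside the known schema."""
    indexed = {}
    for row in json.loads(raw):
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            indexed[row["id"]] = row
    return indexed

# One row copied from the response above.
raw = ('[{"id":"ytc_UgzSFVcSCo18EsOpI7Z4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
codes = index_response(raw)
print(codes["ytc_UgzSFVcSCo18EsOpI7Z4AaABAg"]["emotion"])  # resignation
```

Indexing by ID rather than list position keeps the lookup robust when the model returns rows out of order or silently drops a comment.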