Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Mark this down: by 2030, robots still won’t be able to handle electrical or plumbing work as a full service. 99% of those jobs won’t be gone. Even AI cars aren’t great yet, and replicating the dexterity of human fingers is incredibly hard.
I’m not disputing everything this guy says, but if he’s as smart as people say, he should know this. I’m not saying it won’t eventually happen, but it’s a long way off. Maybe robots will assist in 5 years. We need to hold these predictions accountable: “you said this then, and now we look back and you were wrong, so why is this different?” Right now, he’s sounding more like an extremist than a realist.
youtube · AI Governance · 2025-09-16T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx4C1CJcIAie6JyfkF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzxFPkTNWxN2sni2xt4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxVqNUPTvA-_RcUnGV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzpUmXLMT80PcMpyqB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxf04wjZ81zKubyhOB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyKZNOM_Ew8juTGZIZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx4LYpJnAzH_W9MbV14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxjqT9iHCEy_WHGXP14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxNnBlggs2UvjcqjB54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx2LD3eZO4mS1P1hLh4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```