Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgwOoHBBy…` — "I think ai should replace only certain jobs such as doctor because doctors have …"
- `ytc_Ugxmi6VXH…` — "5:06 One key difference today is that everyone wants to park close to their dest…"
- `ytc_UgyzJVQ3L…` — "Retired developer here. Knowing how to code is NOT enough. Developers also need…"
- `ytc_UgyRyFeC5…` — "There definitely needs to be a kill switch or policy globally, that in case ai c…"
- `ytc_Ugy6Ncxl7…` — "Responsible approach to AI? By firing your ethical team, Google? Really?Hmmm....…"
- `ytc_Ugxe9emC8…` — "I feel that if good developers use AI directly to create products and establish …"
- `ytc_UgznjUOJ2…` — "I wonder if it can handle complaints or abuse from a karen? Could ai stand again…"
- `ytr_UgwNeoEpJ…` — "It's interesting to hear your perspective! Sophia's responses highlight the bala…"
Comment
Stephen Hawking is a physicist. Elon Musk is a business man. Neither are computer scientists, let alone computer scientists that work on AI. Why do their opinions matter in my field?
I love most of your videos but this is some Elon Musk type BS. Look at SmarterEveryDay's video interview with a 4 star general about the future of war. That is a much better assessment of the future of war than the musings of a man with no experience in AI or the military.
Source: youtube · Posted: 2019-08-13T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwt8sMdBrhPQyu_LUB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzRxh4ObPNBNt67lh14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzQoiHSIs4eldEWBpp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzoUFxnh9VDPKRP5TV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyyYfDnFH8lVL1HuFx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgylFoBYaF-oMCnMV1l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzXiT_wWXRxjIgkXOl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwmKYJt_40W0-9SM6N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy32y9cdwEaLZcfa2R4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgwJD2lznMWMukhxokd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
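The raw response is a JSON array in which each object carries a comment `id` plus the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how one might parse such a response and look up a comment's codes by ID — the variable names are illustrative, and only two of the rows above are reproduced here:

```python
import json

# Two rows copied from the raw LLM response above (abridged for the example)
raw_response = """[
  {"id": "ytc_Ugwt8sMdBrhPQyu_LUB4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzQoiHSIs4eldEWBpp4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

# Index the coded rows by comment ID so any comment can be inspected directly
coded = {row["id"]: row for row in json.loads(raw_response)}

entry = coded["ytc_Ugwt8sMdBrhPQyu_LUB4AaABAg"]
print(entry["reasoning"], entry["emotion"])  # mixed indifference
```

The first entry's values match the Coding Result table shown above, which is how the inspector maps a raw response row back to the comment it codes.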