Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

| Comment preview | ID |
|---|---|
| Nobody is suggesting there will be no more software engineers, it's more like we… | rdc_oi3xnwn |
| ive been using it lately as a worldbuilding brainstorming partner and it does he… | ytc_UgwgznNG1… |
| I agree with the Ghibli taking down the Gib business, but I don’t agree with get… | ytc_UgwFOZjj0… |
| Cant wait to hurry the f@ck up and dign up for that AI girlfriend eh?… | ytc_UgyaGMMpa… |
| Not worried about it honestly. In manufacturing we cant even get alignment for … | ytc_UgyNyD1yO… |
| Everything they done to prevent to happen some china guy will do in a basement a… | ytc_UgzCpIBa1… |
| Actually AI art is good at something believe it or not. Its good at making peop… | ytc_UgwAd9gdf… |
| FACIAL RECOGNITION DOES NOT WORK ON POC!!! The software relies on shadows to cre… | ytc_Ugwesolea… |
Comment

> Agree with Demis that AI is going to bring advancements in science, that's obvious. However, what is also obvious is the huge risks this super-powerful technology is bringing as its capability increases. He is totally avoiding talking about the risks, and just deflects to talking about the benefits. This is a very foolish strategy, and one that is hard to understand because, as anyone knows, benefits won't be realised unless risks are properly taken care of. He should talk about the AI risks more if he expects to be taken seriously. Hopefully he has some kind of oversight within Google, and Google and other big tech corporations have proper oversight from government. Otherwise, humanity won't make it.

Source: youtube · Posted: 2025-04-24T03:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzbXL5ect4uuj90RQZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz3Wui8tRaNYOjlyEN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw_ool7qhv8FL9-P1h4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"none"},
  {"id":"ytc_UgyBKcahHKHxe9LWC9x4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugze5O4GRmOXFkynYEN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwv63oGaOdpPS4Qb0B4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwJ62VZbAjJwYOcM9x4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxKl3t0v5rW3pqeJ1h4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwxRygbZsUgG3-EgN14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwgbjV34db7TXeVXvV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
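The raw response above is a JSON array of one record per comment, each carrying an `id` plus the four coding dimensions shown in the Coding Result table. A minimal sketch of how such a batch could be parsed and validated before indexing by comment ID — the `ALLOWED` value sets here are assumptions inferred only from the values visible on this page, not the project's actual codebook:

```python
import json

# Assumed codebook values, reconstructed from the examples on this page;
# the real coding scheme may allow more (or different) labels.
ALLOWED = {
    "responsibility": {"none", "company", "government", "distributed"},
    "reasoning": {"none", "consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"none", "approval", "fear", "outrage", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM batch response and index codes by comment ID,
    skipping any record with a missing ID or an out-of-codebook value."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        dims = {k: rec.get(k) for k in ALLOWED}
        if cid and all(dims[k] in ALLOWED[k] for k in ALLOWED):
            coded[cid] = dims
    return coded

# Example: one well-formed record and one with an invalid emotion label.
sample = json.dumps([
    {"id": "ytc_A", "responsibility": "company", "reasoning": "consequentialist",
     "policy": "liability", "emotion": "fear"},
    {"id": "ytc_B", "responsibility": "none", "reasoning": "virtue",
     "policy": "none", "emotion": "elation"},  # rejected: not in codebook
])
coded = parse_batch(sample)
```

Validating against a fixed value set at ingest time is what makes a "look up by comment ID" view like this one safe to build on: malformed or hallucinated labels are dropped rather than silently stored.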