Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `rdc_kop3di7`: It's funny how the developers have also to coerce ChatGPT by repeating strong wo…
- `ytc_Ugy2h3cx0…`: Will coding exist in 10 years? 15? 20? At some point code will be an inscrutable…
- `ytc_Ugxau7k2i…`: 0:45 the future would be better without AI because everyone would have to use th…
- `ytc_UgyUKIx9K…`: Its not ideal, not at all. We all are learning about how to chat with an ai and …
- `ytc_UgwG7j1tu…`: I had an AI predict how the strike will play out in the end..... it doesnt loo…
- `ytc_UgwGgGIo3…`: ”AI writes a lot of bugs, so we need to use humans” is kind of a mute point when…
- `ytc_UgznzKWNP…`: Then we have to remember they can reprogram their self at that moment to conserv…
- `ytc_Ugw4ChmNG…`: AI is based on data based on facts if anything it exposes what men thought to be…
Comment

> I wonder how Dr. Yampoliskiy views the one big issue with AI and that is it's massive need for energy. Humans are quite cheap to run energy wise and if AI needs energy for computation it would make sense to let humans not androids do the physical and artisan work, electrical and plumbing for example.

youtube · AI Governance · 2025-10-12T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
{"id":"ytc_UgwA6IxJVAwlqyz1OyR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxNbKR6sG4PDXpuoLR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz4YtpnhQeMGKFtNIp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy1NKUBoF5KK4WCC8R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzLbKF-p16A0L6ye_94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy_KTxp_ZtiMfTYhsl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxBAR9_YlSpTiO2X5l4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzxBbchGyWvZYZ4KJJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyaHh07RzqhF8QlQbJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzIjoXYcDTuDJyMR414AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
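A raw response like the one above can be checked programmatically before the codes are stored. The sketch below is a minimal validator, assuming the allowed values per dimension are exactly those that appear in the samples on this page (the real codebook may define more labels), and the function name `validate_coded_batch` is hypothetical:

```python
import json

# Allowed values per dimension -- assumed from the samples shown here;
# the actual codebook may include additional labels.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "mixed"},
}

def validate_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it is a dict with an "id" field and every
    dimension carries one of the allowed values.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = (
    '[{"id":"ytc_x","responsibility":"none","reasoning":"consequentialist",'
    '"policy":"none","emotion":"indifference"},'
    '{"id":"ytc_y","responsibility":"none","reasoning":"bogus",'
    '"policy":"none","emotion":"fear"}]'
)
print(len(validate_coded_batch(raw)))  # the record with the unknown label is dropped
```

Records that fail validation (e.g. an unrecognized reasoning label) can then be queued for re-coding rather than silently written to the results table.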