Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Why AI when is human intelligence, encoded in hardware and software algorithms. …
ytc_UgwO2HPB5…
Sounds like the Google Enterprise itself has become a A.I entity that is not sur…
ytc_UgxY5s4zu…
AI can't get "mad" since it doesn't have feelings. And if it does, and the first…
ytc_UgyoI5JJa…
Sam is basically saying, AI will be replacing most of the jobs, driving the econ…
ytc_UgzUOTzHm…
So many jobs can be replaced by AI... the anchor for once... this expert speakin…
ytc_UgwRHH-S-…
many didn't understand. what AI is replacing is the manager, the presenter, the …
ytc_UgypNrcHt…
You understand nothing. You know that if you train an AI on Ai generated content…
ytr_UgwqzgKd-…
This world needs a reset from the humans.... And advanced A.I. will surely bring…
ytc_Ugy1T-65R…
Comment
The A.I. everyone is referring to is better termed a more advanced expert system. If one feels that a cybernetic device is aware of them under conditions of everyday life or in warfare, it is only the result of humans anthropomorphizing these objects. No matter how sophisticated, they are only expert systems driven by rules. And remember, these are limited to binary coding. No matter how complex the algorithm, no matter how complex the program, they are expert systems with very, very, very limited autonomy. The prohibition against attempting to integrate an array of dangerous instruction code involved in recognition of targets would, in experimental stages, result in "self-aware (I'm using the term very sparingly) the drones or robots attacking the people building them. Their threat is overwhelming. And if another country tries to use drones with the always-poor ability to discern friend from enemy, our drones would be more numerous and own them on any battlefield.
youtube
2018-04-03T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwOHWOKk4mrzdvx6r14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwD5zjsOKm381BRwkp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwKd151bVJvhA7QixJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy6QvnlNOFGFQu3fph4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzhO-M52B0Uwg-KnsF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyMkcekidWccs1Q-Pt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyoglIh54yKKwxPFFh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxWqUR3rjToEfHbbWB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugytlda4Gn_CBnVOaCt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwP0Zt_2THvcx3nmvV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
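A response like the one above can be parsed and indexed by comment ID so that any coded comment's dimensions can be looked up directly. The sketch below is a minimal illustration, not the tool's actual implementation: the `ALLOWED` sets contain only the values visible in this page's output, so the real codebook may include values not listed here, and `index_codings` is a hypothetical helper name.

```python
import json

# Two records copied from the raw model output shown above.
raw_response = '''
[
  {"id": "ytc_UgwOHWOKk4mrzdvx6r14AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzhO-M52B0Uwg-KnsF4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
'''

# Values observed in this page's output; the full codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "indifference", "approval", "outrage", "resignation"},
}

def index_codings(raw: str) -> dict:
    """Parse the model's JSON array and index codings by comment ID,
    skipping any record with an out-of-codebook value."""
    by_id = {}
    for record in json.loads(raw):
        if all(record.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[record["id"]] = record
    return by_id

codings = index_codings(raw_response)
print(codings["ytc_UgzhO-M52B0Uwg-KnsF4AaABAg"]["emotion"])  # outrage
```

Validating each record against the expected value sets before indexing catches the occasional malformed or hallucinated label in raw model output rather than letting it propagate into the coded dataset.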