Raw LLM Responses
Inspect the exact model output for any coded comment by looking it up by its comment ID.
Example comment
What Dr. Roman Yampolskiy does not understand .. if IA takes where to take over and there are no more humans giving input, then what? For example, where would the power come from? Will AI understand how to dig into the earth for oil and resources, for energy, and or how to repair items that are not in the textbooks, or figure out all the small stuff that makes things work? It's like the rainforest in South America, you don't know what's going on or what you're cutting down ....... or human resilience .... yes, they might kill some tech jobs that can be automated, but if you kill jobs or money, then there is nothing ... Then what? Without energy, nothing works. All these tech gurus or just dumb in the way the real world works.....
youtube · AI Governance · 2025-09-06T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
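For reference, a coding record like the one above can be represented as a small typed structure. This is a minimal sketch, not the project's actual schema; the field names mirror the table, and the example label values are only those observed in this sample, so the real label sets may be larger.

```python
from dataclasses import dataclass


@dataclass
class CodingResult:
    """One coded comment, mirroring the dimensions in the table above.

    Minimal sketch for illustration; label values listed in the comments
    are only those observed in this sample.
    """

    comment_id: str      # e.g. "ytc_UgxSgDnOCAIGGSPzPMZ4AaABAg"
    responsibility: str  # observed: "ai_itself", "developer", "company", "distributed", "none"
    reasoning: str       # observed: "consequentialist", "deontological", "virtue", "mixed"
    policy: str          # observed: "regulate", "liability", "none"
    emotion: str         # observed: "fear", "outrage", "approval", "indifference", "mixed"
    coded_at: str        # ISO-8601 timestamp, e.g. "2026-04-26T23:09:12.988011"
```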
Raw LLM Response
[
{"id":"ytc_UgyXDMwY0AdD2y1qNiZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzaGmUDYqwVoIXQGjN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxSgDnOCAIGGSPzPMZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgycrbKCdVly_bgy7R54AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyoEAXOyjUJkb2_8bZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgydWQEiNciDsnRjIFx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyUtr3IsKnOQ23-coR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzJZr7x6Z2NEm-0QCF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxl_zIdbjTTPxyb7O94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw7kYCgL0etqw9KA394AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
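The raw response is a JSON array with one object per comment in the batch. Below is a minimal sketch of how such a response could be parsed and indexed by comment ID; the function and variable names are illustrative, not the tool's actual API, and it assumes the model returned well-formed JSON with no surrounding text.

```python
import json


def index_raw_response(raw: str) -> dict[str, dict]:
    """Parse a raw batch response and index the coding objects by comment ID.

    Assumes `raw` is a well-formed JSON array like the one above; real model
    output may first need stripping of surrounding prose or markdown fences.
    """
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}


# Example usage, with the response shown above stored in `raw_response`:
# codings = index_raw_response(raw_response)
# print(codings["ytc_UgxSgDnOCAIGGSPzPMZ4AaABAg"]["emotion"])  # -> "fear"
```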