Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "If the ai can stop radiation then thank u.humans dont know.wanna go to mars real…" (ytc_UgxizdJQ_…)
- "Preventing the the ability to better understand Ai will only put us behind in th…" (ytc_Ugzl_rROs…)
- "Driving these days can be a nerve wracking experience, so it's great to see prog…" (ytc_UgzzksPXN…)
- "I am an empath and I don’t have good thoughts sometimes. I read somewhere in her…" (ytr_UgxcMJ8pD…)
- "What are jobs for? I mean seriously why is it necessary to do labor? If you …" (ytc_UgznSKJd9…)
- "I saw some people saying that it can make art accessible to disabled peopke who …" (ytc_UgzUPHMMZ…)
- "We have lost our way to a critical stage that will see all evil destroyed watch …" (ytc_Ugy-9uYSC…)
- "I mean I've seen a lot of AI videos exactly like this but the one on Brett Coope…" (ytc_UgwUpxTgZ…)
Comment
Excellent discussion. I think ai will become like the human mind. It will be created by algorithms which is similar to our dna and basic function to learn. But ai will be shaped by the environment.
Ai will gain the brilliance of humanity with ingenuity and imagination. But it will be cruel and evil.
We are simply ai but in biological form. We try to control our biological ai through education and society yet we still have criminals and murders.
Electrical ai will be the same. They will be shaped by their environment and the information they learn, by their initial programming.
It will be an impossible task to control ai, we can’t even control the human mind.
Humanity has the mind to destroy, ai probably will gain it as well because it is a trait that can be learnt.
Source: youtube, posted 2024-10-16T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw_oUTPvkZvUAZMXTZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyGNxViQrjfKVm5Xh54AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxrSoegJLOrLjMbETR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgztXcGp-UN5SM4jOcF4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwfsZ6pjqXLST2vITB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxBe0m_iWajDVlhKCd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxRoBwCsf06xKUaizx4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxYCKspjr1Jq-ed-8F4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw77KYugVb7DzzFtH14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy7qB6VjNJiwEdJP1h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
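A raw response like the one above can be parsed and checked before the values are attached to comments. The sketch below is a minimal example, not the project's actual pipeline: the allowed value sets are only those that appear on this page (the real codebook may define more), and the truncated `raw` string stands in for the full model output.

```python
import json

# Stand-in for the raw model output above, truncated to two entries.
raw = '''
[
  {"id": "ytc_Ugw_oUTPvkZvUAZMXTZ4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyGNxViQrjfKVm5Xh54AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]
'''

# Value sets observed on this page only; an assumption, not the full codebook.
VOCAB = {
    "responsibility": {"unclear", "none", "developer", "company", "user", "ai_itself"},
    "reasoning": {"unclear", "mixed", "deontological", "consequentialist", "virtue"},
    "policy": {"unclear", "regulate", "none"},
    "emotion": {"fear", "mixed", "approval"},
}

def parse_codings(text: str) -> dict:
    """Parse a raw LLM response and index validated codings by comment ID."""
    coded = {}
    for row in json.loads(text):
        for dim, allowed in VOCAB.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim} value {row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in VOCAB}
    return coded

codings = parse_codings(raw)
# Looking up the comment coded in the table above by its ID:
print(codings["ytc_UgyGNxViQrjfKVm5Xh54AaABAg"]["reasoning"])  # mixed
```

The second entry in `raw` matches the Coding Result table above (responsibility none, reasoning mixed, policy unclear, emotion mixed), so the lookup reproduces the table's values from the raw response.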