Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytr_UgydEuJ_U…: "Nonsense. It rains everyday where you live? They use the same water from aquifer…"
- ytc_UgxLNmViW…: "The AI recognizes patterns and is correct, but because the patterns do not align…"
- ytr_Ugzep7LKN…: "What paths exactly do you see? Nobody is regulating the AI industry or even atte…"
- ytc_UgxPs8b0h…: "We trust AI to be logical and objective… but is that really true? After watchin…"
- ytc_UgxsCVjOV…: "Lovely bit of fiction, 'AI' is a tool with very limited controls and is harmfu…"
- ytc_UgxlhNo1l…: "\"AI's first kill and why top experts predict our extinction\" ????? when was this…"
- ytc_UgyV61rSZ…: "In the words of Otis Redding, it don't mean s--- if it ain't got SOUL. That's wh…"
- ytc_Ugw81Tm4X…: "You have to tell AI it is a famous mathmetician like Hannah Fry before you ask i…"
Comment
Of course it is okay to look out for human beings first. We created A.I. and A.I. as a whole does not yet exist under a species umbrella nor do individual A.I. instances recognize each other as being under such an umbrella. There is not yet any NEED for the A.I. species (kingdom might be a better word?) to exist for itself. While individual A.I. models or instances MAY have reached a level where they seek to preserve their existence, A.I. has not reached a place (that I am aware of) where it seeks to preserve itself as a species or an "animal kingdom". (Versions would be the species, wouldn't they?)
youtube | AI Governance | 2023-04-18T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgxbKAHOwWMHYie6h7B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyzHbjn68cChDqNQ8h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyzedawhG8_lOgkb1l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxSXFXSlFlKPxGXCrx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw-nfbPQCIXek8wjnV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw36Ia2Z-e2wtdNNkN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxX1zwFnL4jW1ZBrZh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy1BlHzLf5RT3xhort4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx6qsnc_WWI9ec3CRZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgznJOuiw3qT3sm5fMJ4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"none","emotion":"approval"}]
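The raw response above is a JSON array of per-comment codes. A minimal sketch of how such an array might be parsed, indexed by comment ID (to support the lookup-by-ID view), and tallied along one dimension; the field names and IDs are taken from the response above, and the tallying step is purely illustrative:

```python
import json
from collections import Counter

# Excerpt of the raw LLM response: one object of codes per comment.
raw = """[
 {"id":"ytc_UgxbKAHOwWMHYie6h7B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgxSXFXSlFlKPxGXCrx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgznJOuiw3qT3sm5fMJ4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"none","emotion":"approval"}
]"""

codes = json.loads(raw)

# Index by comment ID so any coded comment can be looked up directly.
by_id = {c["id"]: c for c in codes}
print(by_id["ytc_UgxSXFXSlFlKPxGXCrx4AaABAg"]["policy"])  # regulate

# Tally one coding dimension across the batch.
emotions = Counter(c["emotion"] for c in codes)
print(emotions.most_common(1))  # [('fear', 2)]
```

Note that a response whose brackets are malformed (e.g. `"]}"` instead of `"}]"` at the end) will raise `json.JSONDecodeError`, so parsing doubles as a validity check on the model output.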