Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- ytc_UgwsA_pRC…: "I get the concerns about AI, but Olovka lets me organize notes and draft essays …"
- ytc_UgycViPtQ…: "While I still support for the A.I, I understand your concern as an artist there.…"
- ytc_UgzRjWfl5…: "The problem of AI alignment is the problem of human alignment. The p(doom) ratio…"
- ytc_UgwET8jzY…: "It's very easy to say things from the top. But the reality is that entry level …"
- ytc_Ugyuxh5uQ…: "China win in many sectors EV telecom 5g being global manufacturer infrastructure…"
- ytr_Ugw2ugB7Z…: "Like it or not, AI will eventually implode. Data sets will be poisoned by AI gen…"
- ytc_Ugzmak3zR…: "The creator of chatgpt funded a study where for 3 years, people were given $1000…"
- ytr_UgyaEf1pQ…: "@Ytking0001 "No the AI got it right black people should generally have this done…"
Comment
Now I used to think Siri is a joke (mainly because I think most things are a joke to deal with the shit that is mortality), but after Apple added intonations to her voice, I was both slightly alarmed, and intrigued.
Would I consider sentient ai to have a conscience, yes I would. But that line blurs did they look like a straight up human. I have no problem with humanoid looking robots (think cybertronain, I would welcome them in a heartbeat), but robots that look near identical to human is disturbing to say in the lease, it would also test my values greatly.
I need at least one level of surreal-ness to deal with it.
Source: youtube · Video: AI Moral Status · Posted: 2020-06-25T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx6ldpxbx3SzabORuZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxibPKJJBy2r_y7puJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz_qrfReL5oYv6DDTx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxPS73G_XxZJnSoloR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwaEYHPUWgtIaUOt0d4AaABAg","responsibility":"government","reasoning":"unclear","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwmlSmJNq5nm1GZmTV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw1dLMaXimwSvjs_Lx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxnfqDwq7AGy3amLxR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxmAnrNsLvMis0SpHt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz0ckEtdKAgZiBnNtd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
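Each raw response is a JSON array with one object per coded comment, carrying the four coding dimensions shown in the table above. A minimal sketch of parsing and validating such a batch in Python (the allowed category values are inferred from this one sample; the actual codebook may define additional categories):

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# The real codebook may include more categories; extend these sets as needed.
DIMENSIONS = {
    "responsibility": {"none", "unclear", "developer", "government", "ai_itself", "user"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"fear", "mixed", "indifference", "approval", "resignation"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and validate each coded comment."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs start with ytc_ (top-level) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

# Example: validate a one-record batch taken from the response above.
raw = ('[{"id":"ytc_Ugx6ldpxbx3SzabORuZ4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = parse_batch(raw)
```

Validating against fixed category sets catches the common failure mode where the model invents an off-codebook label, so bad batches can be flagged for re-coding rather than silently stored.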