Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or pick one of the random samples below to inspect (a minimal lookup sketch follows the sample list).
- "I have seen tons of examples where Waymo saves people from injury but are there …" (ytc_UgxPfmUA-…)
- "I have to say, I'm cautiously optimistic about the incredible fight-back against…" (ytc_UgwTYYhnu…)
- "I totally stand with you! >:3 But I also have a question. is this on all social…" (ytc_UgxtR7Bot…)
- "If people want to use AI in their work, that's fine. It's just needs to be comun…" (ytc_Ugz0qR4-4…)
- "Musk has been candid about the dangers of AI. Hinton sounds like a shitlib who h…" (ytc_Ugzxu7fDM…)
- "AI taking 99% of jobs very quickly is the best thing that could happen. We will …" (ytc_UgxhEBnQc…)
- "First it’s that one crash out robot in Japan, now it’s this. Guys I think it’s h…" (ytc_UgzI3e1a1…)
- "I felt that in my own experience, and I've been doing it anyway to teach it to b…" (ytc_UgyjTK6ds…)
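As a rough illustration of the lookup-by-ID feature, the sketch below assumes the coded records are exported as JSON Lines, with one object per comment carrying its `id` and the coded dimensions. The file name and example ID are hypothetical placeholders; the actual storage backend may differ.

```python
import json

def load_coded_comments(path: str) -> dict:
    """Index coded records by comment ID for direct lookup."""
    index = {}
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            if not line:
                continue
            record = json.loads(line)
            index[record["id"]] = record
    return index

# Usage (file name and comment ID are placeholders, not real values):
# coded = load_coded_comments("coded_comments.jsonl")
# print(coded.get("ytc_EXAMPLE_ID"))
```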
Comment
@innosanto yes what you say is true. But Google is by and large an American company. Data and AI are power. It’ll be good in the long run for every nation to have its own AI implementation. But then again, as far as I know what happens inside a neural network is more of a black box. I don’t think AI is sentient Atleast for the time being unless Google is hiding some really cool stuff from the public. I go to school for AI/ML robotics. It looks like we are in the beginning stages for AI. It might get there in our lifetime or Atleast within the next 100 years. For now I view AI as tool to serve humans by putting them into cobots/drones etc. and yes we need to design a kill switch in every machine so that if a hypothetical terminator like situation does arise we can shutdown the machines as well as the AI computer that should be part of every machine built in the future.
youtube · AI Moral Status · 2022-06-28T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytr_UgwHNvxlcIOLl3whqyN4AaABAg.9cm6j3b1til9cmEN9FaCEu","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwHNvxlcIOLl3whqyN4AaABAg.9cm6j3b1til9cmPG8553vk","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytr_UgwHNvxlcIOLl3whqyN4AaABAg.9cm6j3b1til9cmq5XgxPyn","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytr_UgzhTzXraPEsfUtMovR4AaABAg.9cm1qhupJGn9cnYEC0rKYh","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgzhTzXraPEsfUtMovR4AaABAg.9cm1qhupJGn9cnZxCWYNEE","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzhTzXraPEsfUtMovR4AaABAg.9cm1qhupJGn9cnj6sZEcq_","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_Ugxerun33wpH7qxjr4F4AaABAg.9clvXau1zZV9cnEtVOojaw","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytr_Ugxerun33wpH7qxjr4F4AaABAg.9clvXau1zZV9cnwTEj6sJv","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugxerun33wpH7qxjr4F4AaABAg.9clvXau1zZV9cnz_3dw2Dm","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgySZYYHyaAX7Z0CzSl4AaABAg.9cldrnPffC29cpJ9LgeXaj","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
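To make the structure of the raw response concrete, here is a minimal sketch that parses a batch response like the one above and flags any record whose values fall outside the coding dimensions shown in the Coding Result table. The allowed value sets are assumptions inferred from the records visible on this page, not the full codebook.

```python
import json

# Allowed values inferred from the records shown above (an assumption,
# not an exhaustive codebook).
ALLOWED = {
    "responsibility": {"none", "developer", "company", "government"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "ban", "industry_self"},
    "emotion": {"approval", "fear", "outrage", "indifference", "mixed"},
}

def check_llm_response(raw: str) -> list:
    """Parse a raw batch response and flag values outside the expected sets."""
    records = json.loads(raw)
    for record in records:
        for dimension, allowed in ALLOWED.items():
            value = record.get(dimension)
            if value not in allowed:
                print(f"{record.get('id')}: unexpected {dimension}={value!r}")
    return records

# Usage: pass the JSON array above as a string.
# records = check_llm_response(raw_response_text)
```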