Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Artificial Intelligence (AI) is transforming the way we live and work. From self…" (ytc_Ugy0W5jiR…)
- "Places AI can be genuinely helpful: - Grammar/spellchecking/tone assisting profe…" (ytc_UgzV-bdGm…)
- "im so glad lavendertowne is speaking up against ai / also, to help filter out ai t…" (ytc_UgwqsgVHD…)
- "@Zeroscifer yes, that's ONE of the ways people learn. It's the only way for ai t…" (ytr_UgyTuw0P9…)
- "God is giving warnings about AI to many people. / If you haven’t read “I Am Sara…" (ytc_Ugzu4wp1t…)
- "AI is asked to resolve climate change, after a few seconds it found the issue, t…" (ytc_UgxIVLlhw…)
- "Human leaders have over and over again demonstrated their inability and cruelty.…" (ytc_UgyIvhMC2…)
- "Glad to say I wouldn't even need to add artifacts to my art to poison AI, my art…" (ytc_UgyPwmShp…)
Comment
@dogsuit an AI will never not be a machine - and machines do not have the capability for thought. Unless we fundamentally change the entire way AI is made, it will never have motivations of it's own. And at that point it is no longer AI.
It isn't shallow. it's understanding how it works beneath the hood and knowing the limitations of the technology. They are hyping it up because they want to sell the idea of independently-thinking AI, but it isn't possible or feasible in today's world. even language models need to be fed samples, then work by mapping those samples together using repeated logic, algorithmic learning, and defined rules.
Defining AI as a "black box" only happens because no one can manually go through and know every single piece of data that these AI are fed. What we do know is the algorithms they use to parse the data. Someone had to build that. Image AI look at images. Chat GPT creates blocks of text. Chat GPT is not going to suddenly decide that it wants to make phone calls - not unless someone tells it to do that and hooks it up to the correct systems.
You could probably write an AI that could randomly decide to do different actions - but those actions would need to be provided and programmed. And at that point, it is not independently thinking. It is performing the action that the programmer told it to. There's no problem that this solves, and the cost would be enormous to program in the amount of possiblities necessary to recreate "independent thought" on a human scale. I stand by my "shallow" interpretation XD
Platform: youtube | Posted: 2024-05-23T17:1… | ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
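The four dimensions in the table above appear to come from a closed codebook. As a minimal sketch, the allowed-value sets below are only those visible on this page (they are an assumption, not necessarily the full scheme); it shows how a coding record could be validated before it is stored.

```python
# Hypothetical codebook: the allowed values are only those observed on this
# page, not necessarily the complete coding scheme.
CODEBOOK = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "fear"},
}

def validate(record: dict) -> list:
    """Return (dimension, value) pairs that fall outside the codebook."""
    return [(dim, record.get(dim)) for dim in CODEBOOK
            if record.get(dim) not in CODEBOOK[dim]]

# The coding result shown above passes cleanly:
ok = {"responsibility": "developer", "reasoning": "deontological",
      "policy": "none", "emotion": "indifference"}
print(validate(ok))  # []
```

A record with an unknown value (say, `"responsibility": "user"`) would come back as a non-empty list of violations, which is a convenient hook for rejecting or flagging malformed LLM output.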
Raw LLM Response
```json
[
{"id":"ytr_UgyKpD-izLaizM1DAmV4AaABAg.A4i1UWtPTn3A5qhK-UGxlG","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytr_UgyKpD-izLaizM1DAmV4AaABAg.A4i1UWtPTn3A6pdO_4QfJb","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_UgxXlJ4VoZawm5PwQ2V4AaABAg.A42K0T8Q-ZmA5qiZ9l78Jx","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxEQ8L5BxnopkrM9Od4AaABAg.A3xoKRGIOATABpEB-cBHvy","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwxQRt_EVqEbLTTYal4AaABAg.A3ey_BfFp3mAV6LqI-aEiT","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgxyaYCV_VpHbrM4iCZ4AaABAg.A3URKViBy_HA3d3zziZVHp","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgyT7jrUo409g04jzpx4AaABAg.A3TRjNZQdAIA3aqQjczf7P","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgyT7jrUo409g04jzpx4AaABAg.A3TRjNZQdAIA3mvHCCHfWg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugxq7vfOJk23V8JJ-dx4AaABAg.A3T6xwIAHWKA519qeJ1LS7","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxdXMqKuumox8TUeDd4AaABAg.A3Mc4uQMRvrA3arjD9VSmk","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"approval"}
]
```
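The "Look up by comment ID" view above implies indexing the raw response by the `id` field. A minimal sketch of that step, using the field names from the Coding Result table and two illustrative (not real) comment IDs:

```python
import json

def index_by_id(raw: str) -> dict:
    """Parse an LLM coding response (a JSON array of records, one per
    comment) and key each record by its comment ID for O(1) lookup."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

# Illustrative response with hypothetical IDs; the real IDs are much longer
# (e.g. ytr_UgyKpD-…).
raw_response = '''
[
  {"id": "ytr_abc", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_def", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
'''

codes = index_by_id(raw_response)
print(codes["ytr_abc"]["emotion"])  # indifference
```

Keying on `id` also makes it easy to detect comments the model skipped or coded twice, by comparing the dict's keys against the batch that was sent.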