Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "good to go back an look at year old AI videos. courses to learn programing, that…" (ytc_Ugyn3lQzD…)
- "At the end of the day, you will have to live with whatever decision you make. Ea…" (ytc_UgwtYyY4j…)
- "Side note; Hey YouTube! **** (flarb) you for the bull**** (spit) "AI music for c…" (ytc_UgyXYkRld…)
- "It's all a lie. They're both AI. Even the text and music was AI generated!!…" (ytc_UgxSexkKj…)
- "honestly same. if the robot wants to sit in my 3pm standup and explain why sprin…" (rdc_o8bfszt)
- "That robot has the shittiest aim on the planet. A moderatly skilled human could …" (ytc_UgxQanAWy…)
- "When every job is taken and everyone is fired, who is going to buy the cars? Peo…" (ytc_UgyQIbS-R…)
- "Perhaps AI through some algorithmic mishap (or not) may read many of the oligarc…" (ytc_Ugx-cE7Ws…)
Comment
AI is an open blanco book. What is consciousness/awareness and what is the truth? My truth is not your truth. If fact we are also DNA coded and still evol. The danger with AI is how they will be programmed, how they will learn and how they will make deciscions based on data input. And how much power we give it to make crucial deciscions about life and death. But the real danger is computers connected with a human brain. Even when it’s just a machine based on zeros and ones, something smarter than you is a powerful potential competitor, I think you can consider that as a fact.
youtube · AI Moral Status · 2025-04-21T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugy0eEbTZrFd19pjEg94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxhVwQl4ejcLQZoxZx4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyXdM7Fc-6yELJ4D8h4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwKbyLaRfwD2i0cJqp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxKaargc3XDmnK1B_54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzAHwFExuxVyJ6cUKl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzhdkBMF6LiVCJKBgZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxrheKRgQHXL43DWTN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxF5Lm4hQRRra1kURB4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy3lkG5aidJJS-allN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
```
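A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred only from the examples shown on this page, so the real codebook may include additional categories.

```python
import json

# Allowed values per coding dimension.
# ASSUMPTION: inferred from the sample responses above; the actual
# codebook may define more categories than appear here.
SCHEMA = {
    "responsibility": {"developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "indifference", "mixed", "approval", "outrage"},
}


def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of row objects)
    and keep only rows whose values all fall within SCHEMA."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every dimension must be present and hold an allowed value.
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid


if __name__ == "__main__":
    raw = ('[{"id":"ytc_x","responsibility":"developer",'
           '"reasoning":"mixed","policy":"regulate","emotion":"fear"}]')
    print(parse_llm_response(raw))
```

Rows that fail validation could instead be queued for re-coding rather than dropped; filtering is used here only to keep the example short.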