Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect

- "F the machines that are conscious they're expendable. Let's hope 20 - 30 years f…" (ytc_UgyflaKGS…)
- "In 1982 I was a computer programmer and it was predicted by many computer futuri…" (ytc_UgxA6oef4…)
- "I'm going to have to look into this... in particular, how did Rozado determine …" (ytc_Ugz5ilQaF…)
- "be honest. If Oppenheimer didn't know the test bomb before Japan would collapse …" (ytc_UgyxkKZ0m…)
- "For me that sounds like an AI simulating an annoyed human, of which the internet…" (ytr_UgzN4QQiI…)
- "As a machine learning enginner i will give this answer 3/10 as \"machine learning…" (ytc_Ugyy2jS8o…)
- "If AI is allowed unregulated in the west because of China 🇨🇳, then perhaps small…" (ytc_Ugz9Ekrlg…)
- "That would be great. We will be going back to the good old no AI days…" (ytc_UgyMOSkoh…)
Comment
I can't understand that if we replace human jobs which is based on human salary in broad aspect then what is meaning of that job....for example if I'm from real estate company and there is no one buying new flat then what is meaning of related to all sectors who is depending on real estate!! I think there's no need of any industry which is exist right now how is depending on human investment like I give example before? Then whatever prosess of developing superintelligent related to human jobs and human investment related jobs will not needed anymore so if we learn AI in our perticular field then that is waste of time because that no more needed if there is no sell of your product.....if we want to keep going what is going on then we need to keep jobs as proportionate as that industry required.
youtube · AI Governance · 2025-09-07T15:0… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzzRYv1B8_JgVH8sdV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwWFtIQPUFaUT7zwcZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwPstfbWrF1cI0K8ft4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyc-gRDILHh3YUrtNN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxEhtQzPXCg3an1-wd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwRzNtXkn1XkG1_4CV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwww6bzfGTkbqQ_3MF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyMYtAbQv8aovrodAh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"approval"},
{"id":"ytc_Ugx46cqe9tvopJrEIdN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz6MYEwADI_5JNZ55R4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"indifference"}
]
```
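A raw response like the one above is only usable downstream if every row carries a known value for each coding dimension. The sketch below shows one way this might be parsed and checked; the allowed value sets are assumptions inferred from the values visible in this dump, not the tool's actual schema, and `validate_codings` is a hypothetical helper name.

```python
import json

# Allowed values per dimension, inferred from this page's output
# (an assumption, not the coding tool's definitive schema).
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "government",
                       "developer", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "unclear"},
    "emotion": {"fear", "resignation", "indifference", "outrage",
                "mixed", "approval"},
}

def validate_codings(raw):
    """Parse a raw LLM response and reject rows with unknown values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}"
                )
    return rows

# Usage with a single row (hypothetical comment ID):
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"unclear",'
       '"emotion":"resignation"}]')
rows = validate_codings(raw)
print(rows[0]["emotion"])  # → resignation
```

Validating at ingest time means a model that drifts off-schema (a misspelled label, a dimension silently dropped) fails loudly at the coding step rather than corrupting later aggregate counts.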