Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by its comment ID.
Random samples (click an ID to inspect):

- `rdc_jpn2wzw` — "We need an uncensored AI. Simple as that. If a company make that, would make b…"
- `ytr_UgxzBekhG…` — "he claimed that AI violated intellectual property law. I.e. if you ask a LLM to…"
- `ytc_UgyhAV3Ln…` — "Nah I got in a fight with the Elon Musk Chatbot now it think it’s a god…"
- `ytc_UgzxgflDK…` — "I still think the concern is science fiction. AI cannot maintain power plants or…"
- `ytc_UgzWgp7uy…` — "I feel like one of the only ethical ways to use ai is for reference images if yo…"
- `ytc_UgiDwj_VH…` — "I find it interesting that some people are so confident in their views that robo…"
- `ytc_UggmVa1kZ…` — "Well, why then are there no autonomous ocean liners or jet aircraft. With the e…"
- `ytc_UgyEAWof4…` — "I heard someone in my class jokingly said that they are gonna use ai to pass a c…"
Comment

> There are tons of jobs that will be around in 2030 because not only will Ai not be better by then, the infastructure and manufacturing of robots and equipment will take quite a while to actually roll out. Also jobs where we would be wiped out if there was say a solar flair and tech wasn't working will remain and that's actually a lot of jobs: all of healthcare, many trades work, airplane travel to name a few.

Source: youtube · Topic: AI Governance · Posted: 2025-10-02T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
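Each of the four coded dimensions takes values from a closed vocabulary. As a minimal sketch, the check below validates a coded row against the value sets observed in this dump (an assumption: the actual codebook may define more categories than happen to appear here):

```python
# Allowed values per dimension, collected from the codes visible in this
# page only -- an assumption, not the tool's official codebook.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "fear", "approval", "outrage", "mixed"},
}

def validate_codes(row: dict) -> list:
    """Return (dimension, value) pairs that fall outside the observed vocabulary."""
    return [(dim, row.get(dim)) for dim in ALLOWED if row.get(dim) not in ALLOWED[dim]]

# The coding result shown in the table above:
row = {"responsibility": "none", "reasoning": "consequentialist",
       "policy": "none", "emotion": "indifference"}
print(validate_codes(row))  # [] -- every value is in the observed vocabulary
```

An empty list means the row is clean; any returned pair flags a dimension whose value was never seen in this batch.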
Raw LLM Response
```json
[
{"id":"ytc_Ugy9yDckNigTAhQ8ZyF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzMJRV4BHdwHHpovgh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxdCqS6lJwVvlJJk5R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzUK-34_pa798aYlYJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwBn1peIePKgkUuPgV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx4zaLw2Sa-kSPGfrR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyK_Xx76SWP-7Nn2CZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwui3xxrQcTbjBavP54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwppQue7KjNszMqfzx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxh_bnOO0JB1MCxAWJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
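The lookup-by-comment-ID described at the top of this page can be sketched as parsing a raw response like the one above into an ID-keyed index. The function and variable names here are illustrative, not part of the tool; the two sample rows are taken verbatim from the response above:

```python
import json

# A raw LLM response is a JSON array of per-comment codes; these two rows
# are copied from the response shown above.
raw_response = '''
[
 {"id":"ytc_Ugy9yDckNigTAhQ8ZyF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugxh_bnOO0JB1MCxAWJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
'''

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw response and index each comment's codes by its ID."""
    rows = json.loads(response_text)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codes = index_by_comment_id(raw_response)
print(codes["ytc_Ugy9yDckNigTAhQ8ZyF4AaABAg"]["emotion"])  # fear
```

With such an index, inspecting the exact model output for any coded comment is a single dictionary lookup on its ID.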