Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- Given that humanity can't even agree on what consciousness is nor come up with s… (`ytc_Ugwwph1Dv…`)
- Timestamps (Powered by Merlin AI) 00:00 - Sorry, we don't have enough informatio… (`ytc_UgwrnbmTN…`)
- @brovideogaming9447 Machines were probably bad at their job in 1910 but it impr… (`ytr_Ugx6FgT2p…`)
- The day they let AI does all our taxes for us is the day I will start fearing fo… (`ytr_UgyMv6Aer…`)
- It’s just another method, what’s important is the end result, so if AI does smth… (`ytr_UgyZN9VT4…`)
- Seen here, another painful part of the AI, even if you are willing to use it doe… (`ytr_UgzYvtnHI…`)
- Having a logical discussion on anything related to Tesla and its self-driving ca… (`ytc_UgwQOszPH…`)
- What does this mean for humans though. Are we fucking animals and it's sad, or a… (`rdc_empaghc`)
Comment
Let's apply some common sense... What jobs do you see AI replacing in the next 5 years? Analysts, accounting, tax prep, some science & engineering? Even then some human interactivity will be required. Trades and trade professions need humans, highly skilled like doctors, lawyers, etc. 99% of jobs is completely unrealistic...
youtube · AI Governance · 2025-09-07T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
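Each coded comment carries one value per dimension. As a minimal sketch, a record can be checked against the value sets that appear in this dump; note these sets are an assumption drawn only from the visible records, and the full codebook may define additional values.

```python
# Dimension values observed in this dump (an assumption; the real
# codebook may allow more values than are shown here).
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself", "company", "distributed", "user"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate", "ban", "unclear"},
    "emotion": {"approval", "fear", "indifference", "resignation", "outrage", "mixed"},
}

def validate_coding(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record looks valid."""
    problems = []
    for dim, allowed in OBSERVED_VALUES.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems
```

For instance, the coding shown in the table above (`none` / `consequentialist` / `none` / `indifference`) passes with no problems.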
Raw LLM Response
```json
[
{"id":"ytc_UgwvYI1IUVJP0lCPuAF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxO_rACB7JhWoWr4mV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugytu4HoXoNsm0LHV1l4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwayXDoJR-GRWEeVMJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxzAXwmIV94MMDFyD94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyOooKq8ZR8jmqlZ8h4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyeW512AZ6STElAgRJ4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugyej5DnyAuGZqCtwBZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxnsYfzx9yANMN-4bZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxXnm2gxY32AD4ccFx4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]
```
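The "look up by comment ID" step above amounts to parsing this JSON array and indexing it by the `id` field. A minimal sketch, using the first two records from the response (the two-record string here is an abbreviated stand-in for the full array):

```python
import json

# Raw LLM response: a JSON array of coded records, one per comment.
raw_response = """
[
  {"id": "ytc_UgwvYI1IUVJP0lCPuAF4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxO_rACB7JhWoWr4mV4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
"""

# Index the records by comment ID so any coding can be fetched directly.
codings = {record["id"]: record for record in json.loads(raw_response)}

# Look up a single comment's coding by its ID.
coding = codings["ytc_UgxO_rACB7JhWoWr4mV4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # company fear
```

The same dictionary also makes it easy to spot IDs the model skipped or invented, by comparing its keys against the set of comment IDs that were sent in the prompt.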