Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
You convinced me that it's fake :) how come AI is inhaling during the conversati…
ytc_UgzYMkGBr…
This has already come into affect right under our noses and it’s scary to think …
ytc_UgwJl0696…
Would be funny to find out that humans were created by some Type II+ civilizatio…
ytc_Ugxt8-64v…
Keep doing this we will make them hate publishing AI art since they get f*cked b…
ytc_UgylzADfp…
@valvodkashort sighted. LLMs trained on genetics are revolutizing medical resea…
ytr_UgzIMOz2f…
Wow now that's messed up!!! Here we go the AI are going to take over, this is th…
ytc_UgylO6Wor…
1. Creating a flexible and good AI for the future will most likely have a featur…
ytc_UginEExRK…
just the beginning's a bit ppainful. Karen Hao perfectly explains the difference…
ytc_UgywdwBdl…
Comment
So I watched this with intrigue, and it's easy to sit there. Getting scared by it. So in 2027, the guy who owns the petrol station down the road, just put his life savings into buying the petrol station, is going to go and buy $120,000 AI bot that's going to replace him?! There's so many levels of impracticality that stand in the way between 2027 being 99% unemployment. Don't get me wrong the guy's talking a lot of sense, but it will be 20 years at least, and in that time the world changes so much, so there will be extra dynamics. You will still always need a human element to programming, so either we'll become programmers or labourers. Because let's face it. No one's going to trust a robot to do my electrics for a long time yet. Every year that passes the dynamic will change and the next 2-year plan will change with it. This is a fascinating view of the world, and whilst he's so interesting, I think there's a little bit of third-person perspective to have here
youtube
AI Governance
2025-09-26T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgwcdlyjVFVdC85NfRF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgztxRT3IGpMxo3HLZx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzHzwkSOfRtd4awWOJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy6H3o5-9LKOtNxzTt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxdBDxjSE5L8i4M4op4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxECd8tSrWj_8eQB_x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx8eqGsvV1pM_9sGWN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxCShTmQm9oc4vc-K54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzHoHsgv6qYNyGELSJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyPtcVrGD5mkPJ_hbl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}]
```
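A raw response like the one above is only usable downstream if every record carries labels from the coding scheme. Below is a minimal sketch of how such a payload could be parsed and validated; the allowed vocabulary for each dimension is an assumption inferred from the values visible in this section, not the project's full codebook.

```python
import json

# Assumed label vocabulary per coding dimension, inferred from the
# samples shown above -- extend to match the actual codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"none", "unclear", "regulate", "liability", "ban"},
    "emotion": {"fear", "approval", "indifference", "resignation", "mixed", "outrage"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown labels."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {rec.get(dim)!r}")
    return records

# One record from the response above, used as a smoke test.
raw = ('[{"id":"ytc_UgwcdlyjVFVdC85NfRF4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
records = validate_coding(raw)
print(len(records))  # 1
```

A check like this catches the common failure mode of an LLM inventing an off-schema label, so bad records surface at ingest time rather than during analysis.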