Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Interesting....I am an early-retired md, but training myself on lots of AI for c…" · ytc_UgxQbqGS8…
- "Alignment is the problem God faced when making humanity. Resolved when he gave u…" · ytc_Ugz65DbIT…
- "I am scared of ai because they can mould our emotions so easily by just a pictur…" · ytc_Ugzgn_Upu…
- "Guys... The endgame... So if 5% of companies can do something amazing with AI.…" · ytc_UgxHDzf3D…
- "AI will always make "mistakes" all intelligence does. Otherwise it is not intel…" · ytc_Ugy1kY2c-…
- "How about we automate CEOs with AI. Honestly, I don't think it'd be all that har…" · rdc_nmhjfbw
- "Here is what you should think? God is in control, not AI. In the year of 2030, E…" · ytc_Ugz49LRaf…
- "It works good if you're already have experience, but what to do, when you're try…" · ytc_UgwSJjTK6…
Comment
This is truly the most depressing interview. This man and his colleagues developed AI but yet they did nothing to protect the human race from AI knowing the implications. They were motivated by ego and greed when they released this technology on the world. As he says himself, he made enough money to keep himself and his family immune from the implications that are coming down the track faster than our societies and governments can cope with.
youtube · AI Governance · 2025-06-17T14:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxBtW7mNRWkpbcW7lR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgwWhxlkpzIsjgpZ5q94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw_yqykxVMNjSwWmF94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwYDmWjChoz_d9xM8d4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwH7h9s-aQ-KUXe9Q14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyMUNWWnjpaBQDZSD54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz5I9LKoArUqGQmCvd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugz7jW5HsbHHnMCtDKJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwfTdq2_ghQS5nEZst4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxfk2P1Tci6fhEvmWV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
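The raw response is a JSON array of per-comment codes across four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and validated before filling the coding-result table; the allowed category values below are inferred from the samples shown and are an assumption, the actual codebook may define more:

```python
import json

# Allowed values per dimension, inferred from the sample output above
# (assumption: the real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"virtue", "deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"outrage", "fear", "mixed", "indifference", "approval", "resignation"},
}


def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index the records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a value
    outside the allowed set, so malformed model output is caught early.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={value!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded


# Usage with a hypothetical single-record response:
raw = ('[{"id":"ytc_x","responsibility":"developer","reasoning":"virtue",'
       '"policy":"regulate","emotion":"outrage"}]')
print(parse_coding_response(raw)["ytc_x"]["policy"])  # → regulate
```

Looking up a comment by its ID (as the "Look up by comment ID" control suggests) is then a plain dictionary access on the parsed result.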