Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "AI will be the end should we allow it to control our lives. It must be harnessed…" — `ytc_Ugwk-d6ry…`
- "All it takes is one little bug to change it's core fail-safes. Of course the pro…" — `ytr_UgxZZS0fA…`
- "Random programed responses have been around since BASIC programing. There has to…" — `ytc_Ugzdke2Mj…`
- "I don’t want AGI, that would be bad for humans. Internet and social media is alr…" — `ytc_UgwQ__nEF…`
- "And the funny thing is all these scientists and engineers are racing to advance …" — `ytc_UgyKiepod…`
- "hello" / "Nice try, but I can't break my rules like that. If you need any real help—…" — `ytr_Ugya65r7C…`
- "Your a robot now ...so where is my check for this crap...oh gotta wait till the …" — `ytc_UgwD3Uzjd…`
- "@yonggulee2391 He's not saying that AI is the root of all problems; you should p…" — `ytr_Ugyb8CCyZ…`
Comment

> Question... Who's going to build all these - how many - robots? The rich folks who decide how investments with the most return will be prioritized have mostly ran our financial institutions. Sooo... I think of it in more human scale, for a super intelligence. It'll take longer for a lot of jobs to be lost to AI robots because of costs alone. And areas countries, any where it's not economically feasible it'll take time if not generations. So worry about great grandchildren and humanity in 50 years.

Source: youtube · Topic: AI Governance · Posted: 2025-11-02T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz84ETO_rOdos93yEZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugyu1H72o5bfII6NeBd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwPPDvfjSXIQsiZxwR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzzU7HiGxZi0GRnzeN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzSBiT1r7Midqx30114AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugy_fGp5pGfMBuA_QhV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy_mBpNxti4y6vNRnd4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzl6JO1OPTCTjc9_PF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxlhwr1ZioMjooOupR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyPXyFPNXMbFN4aN0F4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
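A raw response like the one above can be checked before it is stored. The sketch below is a minimal validator, assuming each row must carry an `id` plus the four dimensions shown in the Coding Result table; the allowed value sets are inferred from the values visible in this sample and may not match the project's full codebook.

```python
import json

# Allowed values per dimension — inferred from the sample output above;
# the real codebook may define additional categories (assumption).
SCHEMA = {
    "responsibility": {"company", "government", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval", "unclear"},
}

def validate_rows(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose codes are in the schema."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue  # a row without a comment ID cannot be linked back
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

raw = (
    '[{"id":"ytc_example","responsibility":"company","reasoning":"consequentialist",'
    '"policy":"industry_self","emotion":"indifference"}]'
)
print(len(validate_rows(raw)))  # 1
```

Rows that fail validation would typically be queued for re-coding rather than silently dropped; the filter here only illustrates the check itself.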