Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytr_UgyiMulAW…: "Yeah because women still want real men who want real women. The actors have noth…"
- ytc_UgxJIWeXj…: "Bad human design will give rise to Artificial Stupidity, and the humans who blin…"
- ytc_Ugwwuol9a…: "I think you're wrong due to your assumptions about governments and politicians. …"
- ytc_Ugwt07xE3…: "The moment when alex said that "you might be lying when you said I'm not conscio…"
- ytc_UgxCwSALB…: "This is likely the most impactful conversation about AI and the future that I ha…"
- ytc_Ugx27AHl_…: "We do not have to take this route people. If no one uses AI and we build a sust…"
- ytc_UgzkeoNEM…: ""Just four years ago, this is what AI art looked like." Kind of like Francis B…"
- ytc_Ugwum33oI…: "Through digital platforms, AI is enabling improved teacher-student collaboration…"
Comment
Sad, but true, IMO. I watched a great interview with Mo Gawdat, author of "Scary Smart," on YT last night on Steven Bartlett's channel, "Diary of a CEO."
Mr. Gawdat has decades of experience with tech and AI and spent years as a high-level Google executive, and he agreed with most of your points. He thinks we all need to act now and this is a national emergency to get policies and safeguards in place while there's still time.
He said the AI Genie will never go back in the bottle and this is our only chance to find ways to pull the plug on all AI systems somehow that renders the bots and systems inoperable.
He said AI bots will one day be a billion times smarter than we are, but our deadline to act is when they surpass us overall in raw intelligence and knowledge and by then it will be too late. AI is already getting very close to surpassing human intelligence.
youtube · AI Governance · 2023-06-04T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_UgzVDo92JSRgLjbugy54AaABAg.9xZTM2LCPDf9z6AlRXYvA4","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugx9Eio5JosbQmUemWl4AaABAg.9tcpINnSmBFA-wBg-3rKpp","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgzbtI-r5wuujMXe0xp4AaABAg.9rBF52fsKFm9rI76zhJqRj","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugx8Q5KLzephtboE6YB4AaABAg.9qnWArjzJCm9qwF_Ajkzea","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugx8Q5KLzephtboE6YB4AaABAg.9qnWArjzJCm9qx-yMZWeZA","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytr_UgwCOPu7F8TtznREAbt4AaABAg.9qabmSd-fI69qg2jLiPmFX","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytr_UgwCOPu7F8TtznREAbt4AaABAg.9qabmSd-fI69qwgHWDOrle","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_UgwCOPu7F8TtznREAbt4AaABAg.9qabmSd-fI69r8GzF2asvB","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugwty_yZ65XoKN0RhH14AaABAg.9qLSvhwD1wn9qZ3xW2vpxh","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwW40o6jU_xdNWuzzN4AaABAg.9qDmNcfRo989qJ1J0y5uc4","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
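When ingesting these raw responses downstream, it helps to validate each record before trusting the coded dimensions. A minimal sketch in Python, assuming the label sets visible in the responses above (the full codebook may define additional values, so `ALLOWED` is an inferred approximation, not the authoritative schema):

```python
import json

# Allowed labels per dimension, inferred from the raw responses shown above.
# The project's actual codebook may permit more values than these.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "government"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"resignation", "approval", "indifference", "fear", "outrage"},
}

def validate_records(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it is a dict with an "id" field and every
    coded dimension carries a label from the inferred ALLOWED sets.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical example record, mirroring the format above.
raw = ('[{"id":"ytr_example","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(validate_records(raw))
```

Records that fail validation are dropped rather than repaired here; a production pipeline might instead log them for manual review or re-prompt the model.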