Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “I thought Mark Z was a robot but Elon M must be Skynet’s human incarnation……” (`ytc_Ugy8mzUAi…`)
- “If AI replaces us how do the rich continue to make money. There will be no econo…” (`ytc_UgxC7akQ0…`)
- “AI is much worse cus it actually exists and will be the downfall of humans…” (`ytr_UgziIOpyG…`)
- “It makes literally zero sense from Artstation's PoV to ban AI art entirely. If A…” (`ytc_Ugzdt9ZZP…`)
- “Hell no, this was not a suicide. Who picks up take-out and then shoots themselve…” (`ytc_UgwGQvRz8…`)
- “@OliGaming-d1u DLSS5 will have a built in slider if you don't like it just disab…” (`ytr_UgxjtrDLi…`)
- “He’s lying very badly or he’s an idiot to think that progress stops with AI lear…” (`ytc_Ugxy-5mV6…`)
- “Anthropic: *denies the us government use of its AI for spying on citizens and fo…” (`ytr_Ugwp_k38M…`)
Comment
"The genie's out of the bottle and there's no real accounting for what happens next." We need an AI Magna Carta, an AI constitutional convention, and/or an AI security council at the UN because these technologies are so powerful and the threats will emanate from people weaponizing and abusing it all for their own ends. We have pivotal decisions to make in an exponentially shrinking timescale. At the rate of development in different computing, AI, material sciences, and quantum technologies being developed, collective humanity probably has 5 years or less to make the correct decisions that will keep all of this from going sideways in short order.
youtube · AI Governance · 2023-05-04T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwrFT2tldmlCMX8LLV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzWDP3y7FAZf13_gyN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw7fZPvQLXSjD7tChV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwz65qMSnoS3LKBn6t4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyW4ixx0JUu5bEYSQJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwtl0mCSYyP9QJZncd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyaJqeOME8IM10K8t14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgytGwjxRZPffeBIcBd4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxY9Hln_Gko4WU1MAt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzmoN-iMDHTmijFLdd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"}
]
```
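A raw response like the one above is a JSON array of per-comment codes, so the "look up by comment ID" view can be reproduced by indexing the array by `id`. Here is a minimal sketch, assuming only the field names visible in the response; the parsing code is illustrative and not the tool's actual implementation. Two entries are copied from the response above for the demo.

```python
import json

# Raw LLM response: a JSON array of per-comment codes.
# Two entries copied verbatim from the response shown above.
raw_response = """
[
  {"id": "ytc_UgwrFT2tldmlCMX8LLV4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzWDP3y7FAZf13_gyN4AaABAg", "responsibility": "unclear",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# Index the parsed rows by comment ID so any coded comment can be
# retrieved directly, mirroring the lookup-by-ID view.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

entry = codes_by_id["ytc_UgzWDP3y7FAZf13_gyN4AaABAg"]
print(entry["emotion"])  # fear
print(entry["policy"])   # regulate
```

If an ID from the table is missing from the parsed array (e.g. the model dropped a row), `codes_by_id.get(comment_id)` returns `None`, which is one plausible way a comment could end up coded as "unclear" on every dimension.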