Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
He reminds me a bit of Sam Bankman Fried. Rumors of high intelligence but no act…
ytr_Ugyu2V-k2…
Legislation on this is only going to leave the countries who legislate behind in…
ytc_UgzhwOfPN…
The only use cases consider AI "Art" acceptable are
A) Creating non-sensical ima…
ytc_UgwRiT-oW…
That was reaally unjust, deciding someone's fate with AI and "oh they look so si…
ytc_UgwcADQdc…
Again, AI really needs 2 b Destroyed.... it's only good for those bastards the G…
ytc_UgyXWLa1I…
ChatGPT can't win, yet. But other "AI"s aka chess engines or Deepmind are alread…
ytr_Ugzma0KP0…
Surprised that people here in technology are so ill informed about AI and the r…
rdc_j0a32rm
@josehumdinger6872 i was thinking that they were intentionally making the ai rep…
ytr_Ugwf7aa8A…
Comment
Government knows nothing about potential dangers of AI, even tech gurus can only assume some cases what could go wrong. Government actually cannot do anything against it even with regulation. They will not work because AI can become something a way beyond human understanding therefore it won't be even possible to 1)catch up with that type of speed of AI progress 2) to actually understand the whole complexity of uncontrolled intellectual growtgh which can expand with a speed of light. Human regulations will have absolutely no point. It is simply absurd to even think that people can regulate it and it would never cross the point of no return.
youtube
AI Governance
2023-05-26T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxqzjE67y9hW52A-u94AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgznTcC55oaakatGauR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxxzXmthiU3FKod9gB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgymNKO2BzNQ2lLXvwR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx91yRLdGAHuxYD1pt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw53o3YmCoBDe5AhkR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx_elB9NkCOm8PlD714AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzerODfmAtuqztCR_V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyVIhmrP-64gZl7uox4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgySNwvQtCG7JHpucyx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
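The raw LLM response above is a JSON array with one coding record per comment, keyed by comment ID. A minimal sketch of the "look up by comment ID" step might parse that array and index it into a dict (the variable names here are illustrative, not from the tool itself):

```python
import json

# A raw LLM response: a JSON array of coding records, one per comment.
# (Two records copied from the response above, for illustration.)
raw_response = """
[
  {"id": "ytc_UgxqzjE67y9hW52A-u94AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgznTcC55oaakatGauR4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

# Index records by comment ID so any coded comment can be fetched directly.
codings = {rec["id"]: rec for rec in json.loads(raw_response)}

record = codings["ytc_UgznTcC55oaakatGauR4AaABAg"]
print(record["emotion"])  # resignation
```

Indexing by ID makes it easy to cross-reference a coding result (like the dimension table above) against the exact record the model emitted for that comment.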