Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Comment

> I’m always surprised that in the “AI vs Humanity” discussions that the option of using a massive EMP is never brought up. Were AI to become the level of threat discussed here, we have one other option than just sticks and stones. No electricity, no circuits…no AI. The end result of humans having to live off the grid would be the same, but there would be no mass extinction. Of course, that also bears the discussion of faraday cages and systems protected from an EMP that AI could still control. But it’s worth a thought as a possible counter measure. Just my 2 cents here!

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Timestamp | 2023-07-08T04:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzpzfzYQw2K-icjDKx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz4ZCPmnYg5qgMTM1h4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw1nz_ap1mhwwaCW8J4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxbnOvXu3louZQ28xR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxlNmJNXHD8rcORZZJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzvMQKQnHqcU2jkC0R4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz5ncCFg97A3lHClEZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"hope"},
  {"id":"ytc_Ugwdyzf6ftPA9BBTbAx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwV6ABBZe00eh8NxiV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyL5gOk0_jo7Soxlmp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
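A raw response like the one above can be parsed into a per-comment lookup keyed by ID. Here is a minimal sketch in Python, assuming the model returns a JSON array of objects with `id` plus the four coding dimensions; the `parse_codings` helper and the embedded sample row are illustrative, not part of the tool itself:

```python
import json

# Raw model output: a JSON array of per-comment codings,
# shaped like the response shown above (sample row for illustration).
raw_response = """
[
  {"id": "ytc_UgyL5gOk0_jo7Soxlmp4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# The four coding dimensions used throughout this page.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: {dimension: value}}."""
    codings = {}
    for row in json.loads(raw):
        # Fall back to "unclear" for any dimension the model omitted.
        codings[row["id"]] = {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
    return codings

codings = parse_codings(raw_response)

# Look up by comment ID, as the page does.
print(codings["ytc_UgyL5gOk0_jo7Soxlmp4AaABAg"]["policy"])  # regulate
```

Keying the parsed rows by comment ID makes the "look up by comment ID" step a plain dictionary access, and the `"unclear"` fallback keeps a lookup from failing when the model drops a dimension.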