Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytr_Ugy9jsP5Z…: "@blank2556 no we don't. If I have a scrap piece of metal and a rock then I'm go…"
- ytr_UgjxS0Kmu…: "for a robot to have subconscious it needs to be programmend to have one. dont pr…"
- ytc_UgwyUWNXZ…: "AI isn’t going to take all the jobs. Impossible. You aren’t going to have an AI …"
- ytc_Ugz1lsznd…: "All these worries and pontificating, why not ask the AI to generate all the safe…"
- ytc_UgxBNd4fS…: "I am not sure what is more creepy, AI, or AI speaking through a smiling Avatar.…"
- rdc_h8evwlc: "That's not how algorithms work. The developers didn't model it after themselve…"
- ytc_Ugz4DSsbO…: "Solution for this issue: Just ask GPT-4 how to stop AI to take over the world. B…"
- rdc_jww1eu7: "Your examples aren't really helping your case. Spelling has very little to do wi…"
Comment
To educate lawmakers 'someone' should call for the selection of dystopian worst-case AI movies for them to watch. Number one on the list should be Transcendence (2014).
In addition, there should be experts brought in to explain the details surrounding the issues presented, to explain how close we really are to things moving beyond our control.
And also some films with utopian stories where we got it right, although there aren't many in that category.
youtube · AI Governance · 2023-05-05T19:4… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzFB9meqjeGABYy4bd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzOd4M95zSbzdFnVw14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzZYISp6oOOwl987Fh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxUvJOd2CDPbPjcd4Z4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyrN8QIzYFlt6jKv4B4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzv4ryNXsMRY8-9KxB4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyhZ_34WDu_FBtNxjN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwSkS9JirZGT_dVyz94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy08fn_D7zcsJKclkp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwXci4JUPmEmwq-yn94AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
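The raw responses above follow a fixed shape: a JSON array of records, each with an `id` plus one value per coded dimension. A minimal sketch of how such a batch could be parsed and validated in Python; the allowed value sets below are inferred from the visible examples and the coding table, and the full codebook may include more categories:

```python
import json

# Allowed values per dimension, inferred from the examples above
# (hypothetical -- the real codebook may define additional categories).
SCHEMA = {
    "responsibility": {"government", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "resignation", "approval", "indifference", "mixed", "unclear"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response (a JSON array of coded comments),
    keeping only records that have an id and in-schema values."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not rec.get("id"):
            continue  # a record without a comment id cannot be joined back
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Hypothetical single-record batch for illustration.
raw = ('[{"id":"ytc_abc","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(parse_batch(raw)[0]["policy"])  # regulate
```

Dropping out-of-schema records rather than repairing them keeps the pipeline conservative: a hallucinated category never silently enters the coded dataset.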