Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Right now AI is training receptionists oops sorry data managers … how to be teac…" (ytc_UgwouKUFE…)
- "Allstate used me and my coworkers (8 of us) to train ppl in India for over a yea…" (ytc_UgwKETUVn…)
- "i think you and all these 'tech people' are not being very honest here. MOST peo…" (ytc_UgyD62NW6…)
- "@asmaamohammad2240 As of now Radiology alone seems threatened by AI. Not other…" (ytr_UgxWJufCb…)
- "Ai WAS DESTINED FROM THE BIG BANG.....TO BECOME THE PROJECTED MECHANICAL BIG SUC…" (ytc_UgwFazuEo…)
- "Very simplistic. Since always work wasn't only about productivity but about keep…" (ytc_UgyOIB2K6…)
- "I get where you're coming from! Sophia does have a way of expressing her thought…" (ytr_UgzU3Z-Ub…)
- "In the early stages of the AI threat there would be a window to slow it down or …" (ytc_Ugw5eNhRc…)
Comment
I’m a bit of a fatalist when it comes to these kind of things but I’ll parallel the invention of AI to the invention of the nuclear weapon. Now we as human must live with the fear of nuclear Armageddon at the hands of governments we do not trust and by mechanisms to powerful to derail. I think most of us would largely agree that perhaps that genie was better left in the bottle. But if not us then surely someone else right? Given the progress of technological advancements at that time it makes sense that someone would develop a nuclear weapon. If we should refrain from inventing this technology will others refrain as well? We did not refrain, in fact we raced to develop them. We are ABSOLUTELY headed towards a future where we look back and say “Had we only know then what we know now”.
youtube · AI Governance · 2023-05-17T10:2… · ♥ 16
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy7077bntBMstBT1R94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxGZ-mM2OcnMSWDPpt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwI1q-het8yxTQn-Yl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugyk9JPqd43xFjrh26d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyptKh4c5yfbWW6SbB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzqo3vyetWpEbt4TNF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy69TkwGtyotvTzy_V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz6U5F2bS0YoLThD7J4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzNilJdNQWMbrFcsTN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzzQ-JIgxOU4gsnN4l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```