# Raw LLM Responses

Inspect the exact model output for any coded comment: look up a comment by its ID, or pick one of the random samples below to inspect it.

## Random samples
- `rdc_nxrq65m`: "Right, with every technological advance it isn’t like laborers get to benefit. I…"
- `ytc_Ugx7CJ6Xm…`: "just doomers...again and always. \"Your world is in Jeapordy!...Trust the governm…"
- `ytc_Ugxw3cn4Z…`: "Art is using sculpting tools to make a clay bust of Pee Wee Herman. AI art us…"
- `ytc_Ugy-AFqz_…`: "I think AI could be great with airplanes to reduce accidents and software proble…"
- `ytc_UgxcNCnmB…`: "Its funny how a lot of people who have defended AI in this retrospect for saying…"
- `ytc_Ugzqz-vsl…`: "Hahaha, what an epic and at the same time so... hopeful scenario you've painted!…"
- `ytc_UgxmGWMl1…`: "Plot twist: the fact that machines overheat or needs expensive repairs undercuts…"
- `ytc_Ugzv7yXPL…`: "I am profoundly sorry for their loss. Unfortunately ppl do not understand that A…"
## Comment

> Its funny how the people whofund the reaserch of AI are all about "this will help humanity" but omit all the potential bad use of it, the consequences of a bad actor using AI for "bad things"....cuz the investors will be the first ones to missuse it -.-"""
>
> We had soooooooooooooooooooooooooooooooooooooooooooo many books and films that shows us how can (and will) be any technology missused...but naaaah thats fine, humans will never missuse atom bombs/ tanks/ weapons / AI/ social medias to do bad stuff right?? RIGHT!!

Source: youtube · Topic: AI Governance · Posted: 2024-01-11T11:1…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response

```json
[
  {"id":"ytc_Ugx6ULAn7YeVS4aMauV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzSy3kaiN5Sf_USn2B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxDgib_pFDPn5uqyn94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwAJlyOwQOcbvb8-4J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwQyFoQ4Xo8JU8qJit4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwa597GRUcLlQmPb-F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw6lgPMbCJ_Jw97FTB4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxLqScsNPV2Rc7KFC54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzg95PPY8lgGMDL6Pt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxCxfXtfx42y_iZx7d4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"}
]
```
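The raw response is a JSON array with one object per comment, keyed by comment ID and carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of parsing such a batch response and looking up one comment's coding; the function name `lookup_coding` and the inlined single-entry response string are illustrative, not part of the tool:

```python
import json

# Illustrative raw batch response, shaped like the model output above.
raw_response = """
[
  {"id": "ytc_UgwQyFoQ4Xo8JU8qJit4AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw batch response and return the coding dict for one comment ID,
    or None if that ID is absent from the batch."""
    codings = json.loads(raw)
    return next((c for c in codings if c["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgwQyFoQ4Xo8JU8qJit4AaABAg")
print(coding["policy"])   # → regulate
print(coding["emotion"])  # → outrage
```

An unknown ID returns `None` rather than raising, which keeps the lookup safe for truncated or mistyped IDs.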