Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgzNnD5GI…`: What I call "algorithmic culture" seemed like a big problem a long time before A…
- `ytr_Ugzh8JsEn…`: @janderson117 you aren’t creating anything though, you are commissioning somethi…
- `ytc_UgybOCVxQ…`: Republicans mad at immigrants taking jobs while investing in AI that will take a…
- `ytc_Ugy0nqmuE…`: MUCH better than the Facebook or tik tok hearing. This one actually had structur…
- `ytr_Ugz1YXX6_…`: Indeed, it's merely a phase—the new NFTs. If you look at that situation, the peo…
- `rdc_h17u4wz`: Just reading that Bosch just today opened their chip plants in Germany. "i woul…
- `ytc_Ugyn8z-ud…`: our idea of an optimal outcome to this seems to be utilizing ai for our benefit.…
- `ytc_UgxOuaG2W…`: I think the supposed psychosis from using LLMs, is just people trying to find a …
Comment

> We just have to make sure we never make the classic mistake made in every Sci-Fi movie about AI. If we are making something that can become smarter than us, we must make sure it is not also stronger than us. How many movies put an emerging consciousness into a robot that can outfight any human? I think the first regulation should be about limiting the strength and mobility of any device that will have an embedded AI engine. If it's going to get smarter than us, it should need help lifting a box of tissues...

youtube · AI Governance · 2023-07-08T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyx0kRX62KRVbmU2XJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzyjutaFh4dASmiAxp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyW497hyah9pSbLPQ14AaABAg","responsibility":"government","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyQaUatkICOSS-s1NB4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxrqyFEqaxF8NGfPPl4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwxaGSS3mv0ppPUgMB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzGUazI5KMYeG4Vv354AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxpHbRTlNe8UwL8dK54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwOR884VLE4vV9763p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzsrQOPy8p5ZogPMPN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
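A raw response like the one above can be parsed and checked before its codes are stored. The sketch below is a minimal illustration, not the tool's actual implementation: it assumes the response is a JSON array of objects keyed by `id`, and it infers the allowed values for each dimension from the values visible in this sample, so the real codebook may contain additional categories.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the real codebook may include categories not seen here.
CODEBOOK = {
    "responsibility": {"developer", "government", "distributed", "unclear"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval", "mixed", "unclear"},
}


def validate_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index valid entries by comment ID.

    Raises ValueError if the payload is not a JSON array or an entry
    uses a value outside the (assumed) codebook.
    """
    entries = json.loads(raw)
    if not isinstance(entries, list):
        raise ValueError("expected a JSON array of coding objects")
    coded = {}
    for entry in entries:
        for dim, allowed in CODEBOOK.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{entry.get('id')}: bad {dim!r} value {value!r}")
        # Keep only the coded dimensions, keyed by comment ID.
        coded[entry["id"]] = {dim: entry[dim] for dim in CODEBOOK}
    return coded
```

Validating before storage means a malformed or hallucinated category surfaces as an explicit error tied to a comment ID, rather than silently entering the coded dataset.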