Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- You compared ART to a working job. You intrinsically view all ART as a tool and/… (ytr_UgySzBaob…)
- So the error was that the car allowed the driver to override the speed by contin… (ytc_Ugw5GL5gH…)
- If you use AI the correct way, it can actually help you learn. However, a lot of… (ytc_UgwQIlQ37…)
- I sit there for hours to make my artworks, and then sum mf uses an ai software t… (ytc_UgwNhPBVI…)
- I don't call myself an "artist", i just prompt the AI for myself, on my own hard… (ytc_Ugxx8AHSK…)
- Side note but I’m curious what you guys think of Google’s genie 3. Do we think t… (rdc_n803pwo)
- @leaderteammimikyu3024 if only they weren't snake oil that only works with old A… (ytr_Ugw00HgZK…)
- It just feels wrong to not be polite to AI. Especially if it gives a reslly thou… (ytc_Ugx1k7K-m…)
Comment
Am I concerned, yes. Do I think that they will put on the brakes, no. I am 65. Go back about 40 years ago and people would have laughed at the thought that man could create AI. Then about 20 years later they thought that it might be possible. Now we know that it isn’t only possible but quite likely that an intelligence greater than our own will be or has already been created. I think it is too late to put the genie back into the bottle. When Musk wanted self driving I knew that the only way there was an artificial intelligence. So it didn’t surprise me in the least when he started to develop robots. Nobody is going to slow down, let alone stop. It is a race now and the government wants it more than anyone else. Of course they want to own it for the military. Government regulation would be like a fox guarding the hen house. I am surprised that Musk would trust them.
youtube
AI Governance
2023-03-30T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
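A coding result like the one above can be sanity-checked before display. Below is a minimal sketch in Python; the allowed value sets are inferred only from the codings visible on this page, not from the tool's authoritative schema, and `validate` is an illustrative helper, not part of the tool.

```python
# Allowed value sets inferred from the codings visible on this page;
# the tool's real schema may define more categories than appear here.
ALLOWED = {
    "responsibility": {"none", "company", "government", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "indifference", "outrage", "resignation", "approval"},
}

def validate(coding):
    """Return the dimensions whose value falls outside the inferred set."""
    return [dim for dim, allowed in ALLOWED.items()
            if coding.get(dim) not in allowed]

# The coding shown in the table above passes cleanly.
sample = {"responsibility": "none", "reasoning": "consequentialist",
          "policy": "none", "emotion": "fear"}
print(validate(sample))  # -> []
```

A row that used an unseen label (say, `"emotion": "joy"`) would be flagged rather than silently rendered.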
Raw LLM Response
[
{"id":"ytc_UgwmaTKbsho7xfXRlm94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxzFpS_ytoaTy-rhp14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugxs6l_cwP1aYITLMDp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzlfmBj3Qt_EK4gE7h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyNIZ-d280ARnkhsUp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxqN3eWIHvH0ljO4894AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx0YQSqTn7YiQPyAkB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyys0bLM5BI-uk7RYl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxTGLv5ZCtrv8kldNN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwLY6Tm__sG6qMa16t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"}
]
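The "look up by comment ID" view above amounts to parsing this JSON array and indexing it by `id`. A minimal sketch, assuming the stored response is exactly a JSON array of flat objects as shown (shortened here to two entries from the batch above); `index_by_id` is an illustrative helper, not the tool's actual code.

```python
import json

# Raw model output for a coding batch: a JSON array with one object
# per comment, each carrying an "id" plus the four coded dimensions.
raw_response = '''
[
  {"id": "ytc_UgwmaTKbsho7xfXRlm94AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxzFpS_ytoaTy-rhp14AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "indifference"}
]
'''

def index_by_id(response_text):
    """Parse a raw batch response and index the codings by comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_by_id(raw_response)

# Looking up the comment quoted above recovers the table shown earlier.
coding = codings["ytc_UgwmaTKbsho7xfXRlm94AaABAg"]
print(coding["emotion"])  # -> fear
```

Indexing once up front makes each subsequent ID lookup a constant-time dictionary access rather than a scan over the array.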