Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "People are dumb as fffffff Like bro how can that AI steal without artists? Plus …" (ytc_UgyG6ynSz…)
- "Waymo run in areas where that have been fully trained on, and they have lidar an…" (ytc_UgzvuPIG0…)
- "I disagree with ghibli ai. It steals the work of Miyazaki and his efforts. It's …" (ytc_Ugy--vHal…)
- "Everyone forgave him, including all the victims who were on the site he was usin…" (ytr_UgxLpS4Wf…)
- "It was flooding in San Diego because of rain. Think maybe you can tone down the …" (rdc_fn5nv5d)
- "Not one mention of energy limits. Not one mention of how AI is supposed to start…" (ytc_UgyHnr7EG…)
- "You miss the point. AGI will not be achieved by the current models. Intuitive …" (ytc_UgxIkV0Sw…)
- "I hope the AI bubble bursts and these companies laying off employees realize it …" (ytc_UgzrE-rn6…)
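The "look up by comment ID" feature above could be sketched as a prefix search over stored coding records, since the IDs displayed in the sample list are truncated. This is a minimal illustration only; the function and record names here are hypothetical, not taken from the actual tool.

```python
def find_by_id(records, query):
    """Return every record whose ID starts with `query`.

    Prefix matching is used because the IDs shown in the sample list
    are truncated with an ellipsis.
    """
    return [r for r in records if r["id"].startswith(query)]


# Hypothetical records; only "rdc_fn5nv5d" is an ID shown in full above.
records = [
    {"id": "rdc_fn5nv5d", "text": "It was flooding in San Diego..."},
    {"id": "ytc_example123", "text": "a hypothetical YouTube comment"},
]

print(find_by_id(records, "rdc_"))  # matches the single Reddit comment
```

A real implementation would index the records once (e.g. in a dict keyed by ID, or a database index) rather than scanning the list per lookup.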
Comment
@herlandercarvalho I personally think it is absurd to think of giving up personal liberties just because something really bad *might* happen. I'm just never going to be okay with the kinds of things a govt would have to do to enforce an AI regulation and the reason is it requires some serious encroachment to my basic sovereignty as a free thinking human being and all over some extremely rare and hard to predict event that will most likely not happen. . Pick any of these extreme events and they probably will not happen. One or more will happen but any one event is extremely unlikely to happen. That means any example scenario used to create legislation will by definition be extremely unlikely to actually come to fruition. These blackswan events are extremely rare and just because we increase the surface area for them that doesn't make them any more likely. It does increase the overall likelihood of experiencing one of many possible blackswan events but any particular event is just as unlikely to happen as it always has been. So giving up personal liberty for something bad that is unlikely to actually happen just seems absurd to me
youtube
2023-05-10T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_Ugy04J7PUpcnCna9BC14AaABAg.9pT2dIhJ8mO9pTC0VcWbTW","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_Ugy04J7PUpcnCna9BC14AaABAg.9pT2dIhJ8mO9qafkphc0RK","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgzkqcOguPlmghigzOd4AaABAg.9pT-3iBzvVZ9pTrPxI3lfR","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugx8hkgo8y4kuhCHIZh4AaABAg.9pSvbLukvSp9pX5whpqbBm","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugwa1ko2ZwgPbJKqEzt4AaABAg.9pSt6T0y-Oa9pV7yqO7u5h","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgxcZnsxBBL52SFBylh4AaABAg.9pSsFXIsyJ19p_jVSIASkp","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwavBXu0OnJxynvG754AaABAg.9pSn5nNUNgf9pV7ZqcQsXC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwqOfQIe1TGmNGlY4l4AaABAg.9pSdmFLjsh89pSujdQSbMH","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugwk5-sDtlCX90DvkrJ4AaABAg.9pSdRSLYNX69pT-uoiTr_i","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugwk5-sDtlCX90DvkrJ4AaABAg.9pSdRSLYNX69pT4e05U-d0","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
```
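Turning a raw LLM response like the one above into usable coding records means parsing the JSON and checking each record against the codebook. Below is a minimal sketch; the allowed values per dimension are inferred from the records shown on this page (an assumption: the real codebook may include additional values), and `validate_response` is an illustrative name, not part of the actual pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the records on this
# page (assumption: the full codebook is not shown here).
SCHEMA = {
    "responsibility": {"government", "company", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval", "mixed", "unclear"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose four coding
    dimensions all hold schema-approved values."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

raw = ('[{"id": "ytr_example", "responsibility": "government", '
       '"reasoning": "deontological", "policy": "none", "emotion": "outrage"}]')
print(len(validate_response(raw)))  # 1: the single record passes validation
```

Records that fail validation (or a response that is not valid JSON at all) would typically be logged and re-queued for the model rather than silently dropped, so the "Coded at" timestamp always reflects a schema-clean result.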