Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse a random sample below.

Random samples:

- "This boils down to questioning the morality of a human being behind the camera a…" (ytc_Ugw1d-BR5…)
- "In the end, is it heading towards humans making a choice between a super artific…" (ytc_UgwZ1c-Eq…)
- "This is so f****** dumb all this technology and they can't get out of this littl…" (ytc_UgxTNr5SQ…)
- "Are we going to have robot babies where you need to 3D print them for a percenta…" (ytc_UgxyH66IA…)
- "If you talk sexually to Ai it will do it back but kinda mask it like it isn't tr…" (ytc_UgwXhiZUv…)
- "Trump says “Don’t worry” / Next day opens an AI studio and proceeds to rip off al…" (rdc_mic7la6)
- "This should have happened decades ago. Instead the Chinese took over virtually.…" (rdc_ibehzoh)
- "Does no one remember Skynet from Terminator or Cylons from Battlestar Galactica…" (ytc_UgzKTlVzh…)
Comment

> increasing the context limit is expensive. Why not make the person using the llm anticipate what info it needs to carry out the task and making sure the task is small enough that it's context limit is not exhausted?

Source: youtube · Video: AI Jobs · Posted: 2025-12-17T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
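The coded record above can be sanity-checked against the dimension vocabularies that appear in this dump. A minimal sketch, assuming the allowed-value sets below, which are inferred from the values observed in the raw responses rather than taken from a documented schema:

```python
# Allowed values per dimension: an ASSUMPTION inferred from the values
# seen in this dump, not a documented schema.
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "industry_self", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "none"},
    "policy": {"none"},
    "emotion": {"indifference", "approval", "fear", "resignation", "outrage"},
}

# The "Coding Result" row shown in the table above.
coding = {
    "responsibility": "user",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "indifference",
}

# Collect any field whose value falls outside its allowed set.
invalid = {k: v for k, v in coding.items() if v not in ALLOWED.get(k, set())}
print("valid" if not invalid else f"invalid fields: {invalid}")  # prints "valid"
```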
Raw LLM Response
```json
[
{"id":"ytr_UgzQO3hIAzw5NgfiA1x4AaABAg.AQrXkfT9sgKAS59bJaQV60","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyjHpw_SEXVR5Tfl494AaABAg.AQqjzkeexxcAQtfwdkRoGM","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgyjHpw_SEXVR5Tfl494AaABAg.AQqjzkeexxcAQuwiz-rpeK","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyjHpw_SEXVR5Tfl494AaABAg.AQqjzkeexxcAR0gJ57Fh_a","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugx7Rxbb8w8jBDmshqZ4AaABAg.AQqN8waJeTjARCU5wGmdwf","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugxf80Y69Vj3qCjbP3p4AaABAg.AQnxUAhXTFWATX8HFfaTRu","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
{"id":"ytr_Ugz2Rzcw3jCN-maVGHV4AaABAg.AQn9qTGBpfBARFTh2k_W9R","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugz2Rzcw3jCN-maVGHV4AaABAg.AQn9qTGBpfBARH_aMqlPSd","responsibility":"industry_self","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugz2Rzcw3jCN-maVGHV4AaABAg.AQn9qTGBpfBARON03qiapF","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytr_Ugx5aj_2OVBzxGvbEXp4AaABAg.AQn4n0tzWZfAQq_hWCeChW","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
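The lookup-by-ID flow described at the top can be sketched against a raw response like the one above: parse the JSON array, index it by comment ID, and fetch a single coding. A minimal sketch using two records copied from that output (field names and values taken verbatim; the indexing code itself is illustrative, not the tool's implementation):

```python
import json

# Two records copied from the raw LLM response shown above.
raw_response = """[
  {"id": "ytr_Ugx7Rxbb8w8jBDmshqZ4AaABAg.AQqN8waJeTjARCU5wGmdwf",
   "responsibility": "user", "reasoning": "virtue",
   "policy": "none", "emotion": "resignation"},
  {"id": "ytr_Ugxf80Y69Vj3qCjbP3p4AaABAg.AQnxUAhXTFWATX8HFfaTRu",
   "responsibility": "none", "reasoning": "none",
   "policy": "none", "emotion": "approval"}
]"""

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytr_Ugx7Rxbb8w8jBDmshqZ4AaABAg.AQqN8waJeTjARCU5wGmdwf"]
print(coding["responsibility"], coding["emotion"])  # prints "user resignation"
```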