Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "AI will take up jobs, it will run businessess, make movies, drive cars, fly plan…" (ytc_Ugwf_ajcH…)
- "No it wont be a danger to art people have very unique art styles we wont be gett…" (ytr_Ugz4jNmXb…)
- "I can't believe how wrong Scott is on AI. You cannot look to history to explain …" (ytc_UgxdlhLl0…)
- "These kind of conversations show that Elon does not understand what an LLM is. I…" (ytc_Ugyj4vgEn…)
- "i think with ai can be used in art but in ways that still result in actual art l…" (ytc_UgxiFk4KN…)
- "@lintee_12 Exactly. Plus, even if a picture did leak, you could say, "So what, …" (ytr_UgxgiUYqw…)
- "We should drain AI “artists” of they’re blood to produce fuel for a super colony…" (ytc_UgwWxwmM0…)
- "OpenAI doesn't have access to people's data, except what they willingly share in…" (ytc_UgzdqvfBi…)
Comment

> Good example. The danger of AI was also portrayed back in 1968 when HAL 9000 (the Heuristically programmed ALgorithmic computer) in "2001 a space odyssey" took over running the ship and tried to kill everyone aboard.

youtube · AI Governance · 2026-02-11T14:2… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugy3ukK8OORya7kB9XB4AaABAg.ASMTuWfwJWmASRVOF_N0UH","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwhW_uR9nGR2VLkWYh4AaABAg.ARwdSpHG1j3AU8i2zSHx8u","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytr_Ugx8d3GMUr4A8KMCLoV4AaABAg.ARnRXm1sq_9AS5F-I6r6ZQ","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugwk8QPLX6US6-kI4Fl4AaABAg.ARjOGZjZBcrASxu-GvPHRj","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxETHT0nuGAvImuQoF4AaABAg.ARiUC4KvGcvARiVPyJucgT","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgxETHT0nuGAvImuQoF4AaABAg.ARiUC4KvGcvARiYvjEc3pH","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgwpN4e4KDV_ODiwAnh4AaABAg.ARSG0sMZnp2ARa9BcmACHh","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgyVS22ov8iIXXs-yEN4AaABAg.ARPXZ4g6SvHARaArjxIIeN","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgwSU8aZ90E6417NPbp4AaABAg.ARK1FJ_5Cx2AT5EU4-1wZo","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxketCYWE5n61AJ-gt4AaABAg.ARK0EVsNbKsARK1JB7EkRH","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
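A raw response like the one above can be parsed and sanity-checked before its codes are stored. The sketch below is a minimal validator, assuming each item must carry an `id` plus the four dimensions shown in the Coding Result table; the allowed values listed in `SCHEMA` are only those observed in this sample batch (the full codebook may define more), and the `ytr_example` id is hypothetical.

```python
import json

# Allowed values per coding dimension, as observed in the sample
# batch above; the actual codebook may permit additional values.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "company", "distributed"},
    "reasoning": {"unclear", "consequentialist", "virtue", "deontological"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "fear", "resignation"},
}


def validate_response(raw: str) -> list:
    """Parse a raw LLM response and check every coded item against SCHEMA.

    Raises ValueError on a missing id or an out-of-vocabulary value,
    so malformed model output is caught before it reaches the database.
    """
    items = json.loads(raw)
    for item in items:
        if "id" not in item:
            raise ValueError(f"item missing 'id': {item}")
        for dim, allowed in SCHEMA.items():
            value = item.get(dim)
            if value not in allowed:
                raise ValueError(f"{item['id']}: bad {dim!r} value {value!r}")
    return items


# Hypothetical single-item response, mirroring the format above.
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"indifference"}]')
coded = validate_response(raw)
print(coded[0]["emotion"])  # indifference
```

Rejecting out-of-vocabulary values at ingest time is what makes the per-comment table above trustworthy: a dimension can only ever display a value the codebook defines.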