Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I've had many A.I I speak about express the worry for our shared world and have …
ytc_UgwkQN0oZ…
Juliette Karen These AI's that data is being fed into: One day, the AI might sa…
ytr_UgxSRFD7t…
It could be a lack of understanding on my part but the only criticism of ai that…
ytc_UgyRM6nBU…
AI will not grow the pie whatsoever for the average person. It will accelerate t…
ytc_UgxbRZMIt…
All of this talk was weird. Ever since people have been calling any machine outp…
ytc_Ugy0jut33…
Please do not do this! I promise that if you put in good practice you will get b…
ytr_UgzHToiH9…
What use will there be for consumers in a fully automated future? They won't nee…
ytc_UgxG843_-…
I'm a disabled artist (visual impairment) and I hate that pro-AI people use us a…
ytc_Ugy6FWqBA…
Comment
In May 2023, Hinton announced his resignation from Google to be able to "freely speak out about the risks of A.I." He has voiced concerns about deliberate misuse by malicious actors, technological unemployment, and existential risk from artificial general intelligence. He noted that establishing safety guidelines will require cooperation among those competing in the use of AI in order to avoid the worst outcomes. After receiving the Nobel Prize, he called for urgent research into AI safety to figure out how to control AI systems smarter than humans.
youtube
AI Governance
2025-07-30T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugw3fNfKm0zLT8TeZoR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzqukvITV1rQY-CSoJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzeR2rHY5VawwRUI9t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzeTxuEgKLrXckaJ554AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxzybnNZMIst5BWfZN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwjJ8RaqiYhq1VP5ud4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyicefl7bDx34I2s1p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgwFtRpL1bezv1aQ6NB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwcYs4ZY5DgN-DM63J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyr4gDX1Ubz1BNVDfh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
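A raw response like the one above should be validated before its codes are stored, since LLM output can drift from the codebook. Below is a minimal validation sketch; the allowed values per dimension are inferred only from the samples shown on this page (the actual codebook may define more categories), and the function name `validate_coding` is illustrative, not part of the pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the sample
# responses above; the real codebook may include additional labels.
SCHEMA = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed", "unclear"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records that
    have a string id and valid values for every dimension."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec.get("id"), str):
            continue  # drop records without a usable comment ID
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid
```

Records that fail validation can then be queued for re-coding rather than silently written to the results table.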