Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "i mean, the ai content does not build itself if i use the english language to s…" (ytc_UgxpAAerD…)
- "The difference is that it takes time and effort to develop skills. Just because …" (ytr_UgybVbGo4…)
- "AI can't really replace a lot of the things it's currently purported to do . . .…" (ytc_UgzKjC68T…)
- "It's always a power struggle. Human nature hasn't changed. History repeats itsel…" (ytc_UgwMKP3Wr…)
- "Elon Musk asking for a pause 😂 it's just to get ahead of the…" (translated from French) (ytc_UgwJgLTgG…)
- "Concerns are real. AGI by 2027? no. LLM is not a stepping stone to AGI.…" (ytc_UgxUpO3fV…)
- "Why are there stupid people create stupid things such as a robot? They're invest…" (ytc_UgyFPoC6Z…)
- "On another note, Westworld is a different animal altogether. Those idiots progra…" (ytr_Ugh8FxkHz…)
Comment
Sal Khan's perspective on the transformative power of artificial intelligence in education is indeed thought-provoking. The idea of personalized AI tutors for students and AI teaching assistants for teachers opens up exciting possibilities for enhanced learning experiences. What role should teachers play in this AI-powered education landscape? How do we strike a balance between human interaction and AI support?
Source: youtube · Posted: 2023-06-20T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugz5WXaXziw05GUjU9x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwvG5HCRRkDlOodkpd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwKxS4ZfsEy3hn4tO54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz62lgq0rNZwmoRmCx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzbtukhe2ba8bb7PfN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzRwpCXIFXkHr9hqxF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxGbXpeympB340d-RR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyTytVoOxTf29rg9cV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwEcy1JBN5T9vN8kWx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxHyyZUvjvI3wGisz94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
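The raw response above is a JSON array of per-comment records, each carrying an `id` plus the four coding dimensions from the table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch might be parsed and sanity-checked before use — the function name and error handling here are illustrative, not part of the actual tooling:

```python
import json

# The four coding dimensions, as they appear in the raw response above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def parse_coding_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw batch response into {comment_id: {dimension: value}}.

    Raises ValueError if a record is missing its id or any dimension,
    which catches truncated or malformed model output early.
    """
    records = json.loads(raw)
    coded: dict[str, dict[str, str]] = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{cid}: missing dimensions {missing}")
        coded[cid] = {d: rec[d] for d in DIMENSIONS}
    return coded


# Hypothetical single-record batch in the same shape as the response above.
raw = (
    '[{"id":"ytc_x","responsibility":"none","reasoning":"virtue",'
    '"policy":"none","emotion":"approval"}]'
)
result = parse_coding_response(raw)
```

Keying the result by comment ID makes the "look up by comment ID" view above a single dictionary access; validating every dimension per record means a partially generated batch fails loudly instead of silently dropping codes.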