Was any American history covered in your history classes? If so, what was covered?

Hello, American here.

What the title says: I'm curious whether American history, like the Civil War, the Civil Rights Movement, etc., is covered at all in your classes.

If so, what's covered, and in what context are you learning about it? (For example, if you talk about the American Revolution, is it in the context of learning about the Enlightenment or British colonization?)

I'm curious to hear your responses. I imagine most of you will do this anyway, but I'd also appreciate it if you could say what country you're from.

Thanks in advance!