The vector object itself is on the stack, but its data is allocated on the heap. Stack objects must have their size known at compile time. If the object can change its size at runtime, it has to allocate. That’s why std::array’s size is a template argument.
But in this specific case the size (= 1000) is known at compile time. Assuming the code never pushes more than 1000 items, this should be safe, yes?
(But OK std::array is a better choice…)
Nope, std::vector always allocates on the heap even if the size is known at compile time, unless there is some specific non-standard compiler optimisation.
Some types like std::string have a small string optimisation, but I doubt vector has one, and in any case 1000 is clearly not a small size.
If you want the data on the stack, then std::array is indeed the way to go.
The initial size is a compile-time constant, but a vector can be resized. Anyway, the point was mainly that every type needs a known size, because sizeof(T) is a compile-time constant. For example, sizeof(std::vector<int>) is 32 bytes in MSVC on Windows x64. There's no size information in the type itself; it's 32 bytes whether it holds 10 or 1000 ints. Small buffer optimization (for std::string, std::function) is done by oversizing the type a little so that it can hold small data within itself. I don't think any implementation does that for std::vector; there's not much use for it.