Nope, you can't blame the forecasters, or even the mathematical weather models they rely on (tho' their output is tempered by the forecasters' personal knowledge and experience).
Given the volume of the Troposphere, and the comparative paucity of weather observing points, it's all about "initial conditions" from ground stations, radiosondes, weather satellites and other observing systems plugged into each iteration of the mathematical weather models.
These models use systems of differential equations based on the laws of physics, fluid motion, and chemistry, and a coordinate system which divides the planet into a 3D grid. Winds, heat transfer, solar radiation, relative humidity, and surface hydrology are calculated within each grid cell, and the interactions with neighboring cells are used to calculate atmospheric properties in the future.
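The grid idea above can be sketched in miniature. This is a hypothetical toy, not a real numerical weather prediction model: a one-dimensional ring of "cells" carrying temperature, advanced one time step at a time by exchanging heat with neighboring cells (simple diffusion). The cell count, step count, and diffusion coefficient are all arbitrary illustrative choices.

```python
import numpy as np

def step(temps, k=0.1):
    # Each cell's new value depends on itself and its two neighbors,
    # just as a real model couples every grid cell to adjacent cells.
    left = np.roll(temps, 1)    # neighbor on one side (ring wraps around)
    right = np.roll(temps, -1)  # neighbor on the other side
    return temps + k * (left - 2 * temps + right)

temps = np.zeros(16)
temps[8] = 10.0             # a localized warm spot in one cell
for _ in range(50):
    temps = step(temps)     # the warmth spreads cell-to-cell over time
```

After 50 steps the heat has diffused outward: the peak has dropped, every cell has warmed a little, and (because nothing leaves the ring) the total heat is conserved. Real models do the same neighbor-coupled bookkeeping in 3D, for dozens of variables, over millions of cells, which is why the supercomputers are needed.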
Any wonder they need a brace of supercomputers to shake that big bag of marbles, now running at an operational computing speed of 8.4 petaflops (8,400,000,000,000,000 floating-point operations per second)?
Even with (click →)
NOAA's very latest supercomputers coming online this year (the first major upgrade to the system since 2015) will "boost its compute power by one third and storage by 60 percent, extending its visibility into future weather from 10 to 16 days." Yet it all still comes down to the (click →)
butterfly effect …
where a small change in one state of a deterministic nonlinear system (a small error in a single observation) can result in large differences in a later state (i.e., the forecast).
"One butterfly flaps its wings in Michoacán, Mexico, and that pico-ripple in the atmosphere yields significant differences in the worldwide weather forecast two weeks out."
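That sensitivity is easy to demonstrate with the Lorenz '63 equations, the textbook toy model of atmospheric convection that gave the butterfly effect its name. This is a minimal sketch, not a forecast model: two simulations start one part per billion apart, and the step size and run length are arbitrary illustrative choices.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One forward-Euler step of the Lorenz system with the classic
    # chaotic parameter values (sigma=10, rho=28, beta=8/3).
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-9, 0.0, 0.0])  # the "butterfly": a one-part-per-billion nudge

for _ in range(3000):               # march both runs forward in time
    a, b = lorenz_step(a), lorenz_step(b)

separation = np.linalg.norm(a - b)  # the tiny error has grown many orders of magnitude
```

Run it and the two trajectories, indistinguishable at the start, end up in completely different states, which is exactly why a pico-ripple in the initial conditions can wreck a two-week forecast.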