If you live in a state that has harsh winters and lots of snow, winter tires are a worthy investment. However, the question asks for only the disadvantages. Thus, first, let’s check out the disadvantages and then see if it is worth it.
1. They require a replacement set for other seasons: You will have to swap out winter tires when the roads clear up, which nearly doubles your tire costs. So the expense can pile up every season.
2. They are more flexible but also more fragile: The rubber used in winter tires is softer, and it wears out faster than the compound used in all-season tires if driven in warm weather. So they must come off once the cold season ends.
3. High cost: As both points above indicate, you will need two sets of tires for your car. This may not seem like a worthwhile investment if you live in a state like Florida, but if you live in the north, safety comes first. Either way, winter tires add to the cost of owning a car.
Why you should still consider winter tires:
With the disadvantages out of the way, it is worth looking at why winter tires may still make sense in your situation. If your area gets heavy snow that leaves roads slippery, you do not want your car to skid. The key word here is ‘safety’: most people would agree that spending a few hundred dollars is a small price to pay to protect yourself and your car. Winter tires have a tighter grip, which reduces skidding and slipping. Moreover, if you maintain them properly, one set can last four to five winters. So, given your geographic location and climate, it is a judgment call to make.
If you want to learn more about the internal differences, SimpleTire has a great article on this. At any rate, safety comes first.