Every so often I hear someone talking about modelling traffic jams as waves travelling in a queue of cars. After some thought, I came to some tentative conclusions, without having actually tried any modelling or anything.
Imagine a long long stream of cars along a somewhat congested motorway without much overtaking.
The first observation is, whatever you do, you can't really affect the car in front as long as you're driving legally/safely. And you won't end up significantly behind them either: if there's any sort of traffic, the average speed is well under the fastest you could drive on an empty road, so you can always catch up. So whatever you do, *you* will reach your final turnoff shortly after the car in front.
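To put toy numbers on "shortly after" (all of these are invented):

```python
# Toy numbers for the "you can always catch up" argument (all invented).
trip_miles = 10
lead_avg_mph = 30   # the stop-start traffic averages 30mph overall
your_top_mph = 60   # the most you could safely do in the gaps

lead_arrival_min = trip_miles / lead_avg_mph * 60   # 20 minutes

# Even if hanging back leaves you a whole mile adrift at some point,
# you close that mile at the relative speed of 60 - 30 = 30mph:
worst_extra_min = 1 / (your_top_mph - lead_avg_mph) * 60  # 2 minutes

print(f"car in front arrives after {lead_arrival_min:.0f} min;"
      f" you arrive at most ~{worst_extra_min:.0f} min later")
```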
However, over a long run of cars, it seems plausible (I haven't double-checked the maths) that cars driving steadily at 30mph have a greater throughput than ones alternating between 60mph and 0mph, mostly because the safe gap you need at 60 is more than twice the one you need at 30. That means that if traffic is dense, there's a natural tendency for small disruptions to get magnified: each car reacts a little slowly to the car in front, and hence makes a slightly larger correction. Whereas if you go a bit slower and give yourself a bit of extra space when the traffic in front of you starts off but you suspect it's more stop-start, hopefully the traffic behind you will experience *less* disruption.
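Here's a quick sanity check of that gap arithmetic in Python. The gap model is the Highway Code stopping-distance rule of thumb, and the 50/50 moving/stopped split and the car length are assumptions of mine:

```python
# Back-of-the-envelope: steady 30mph vs alternating 60mph / 0mph.
# Assumes everyone leaves one Highway Code stopping distance, and that
# stop-start traffic spends half its time at 60mph and half stationary.

FEET_TO_M = 0.3048
CAR_LENGTH_M = 4.5  # assumed average car length

def stopping_distance_m(speed_mph):
    # Highway Code rule of thumb: thinking distance ~ 1 foot per mph,
    # braking distance ~ speed^2 / 20 feet.
    return (speed_mph + speed_mph ** 2 / 20) * FEET_TO_M

def flow_cars_per_hour(speed_mph):
    # Cars passing a fixed point per hour when each car occupies one
    # stopping distance plus one car length of road.
    metres_per_hour = speed_mph * 1609.34
    headway_m = stopping_distance_m(speed_mph) + CAR_LENGTH_M
    return metres_per_hour / headway_m

steady_30 = flow_cars_per_hour(30)
# Flow is flow(60) while the queue is moving and zero while it's stopped,
# so a 50/50 time split averages to half of flow(60).
stop_start = flow_cars_per_hour(60) / 2

print(f"steady 30mph:      {steady_30:.0f} cars/hour")   # ~1760
print(f"alternating 60/0:  {stop_start:.0f} cars/hour")  # ~620
```

On those assumptions, steady 30mph moves nearly three times as many cars past a point as stop-start traffic with the same 30mph average, so the claim at least isn't obviously wrong.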
I'm not sure, does that sound right?
no subject
Date: 2017-05-11 11:24 pm (UTC)
They have a bunch of cars all driving round in a circle, and soon a traffic jam wave forms without any apparent reason.
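For anyone who wants to play with this, a minimal sketch of a car-following model on a ring (the optimal-velocity model of Bando et al.; the parameter values here are illustrative guesses in the right ballpark, not fitted ones) reproduces the effect:

```python
import math

# Minimal ring-road car-following sketch: the optimal-velocity model,
# with 22 cars on a 230m circle as in the circular-road experiment.
N, L = 22, 230.0        # cars, ring circumference in metres
A, DT = 0.8, 0.1        # driver sensitivity (1/s), timestep (s)

def v_opt(gap_m):
    # Preferred speed (m/s) as a function of the gap to the car ahead.
    return 6.75 + 7.91 * math.tanh(0.13 * (gap_m - 5.0) - 1.57)

x = [i * L / N for i in range(N)]     # evenly spaced around the ring
x[0] += 2.0                           # one tiny perturbation
v = [v_opt(L / N)] * N                # everyone starts at the uniform speed

for _ in range(20000):                # ~33 minutes of simulated time
    gaps = [(x[(i + 1) % N] - x[i]) % L for i in range(N)]
    # Relax each car's speed toward its preferred speed for its gap.
    v = [max(0.0, v[i] + A * (v_opt(gaps[i]) - v[i]) * DT) for i in range(N)]
    x = [(x[i] + v[i] * DT) % L for i in range(N)]

# At this density the uniform flow is linearly unstable, so the small
# perturbation grows into a crawling clump alongside a free-flowing
# stretch, even though the road itself is completely uniform.
print("final speeds (m/s):", " ".join(f"{s:.1f}" for s in v))
```

The point of the circular road is that it removes every external excuse, no junctions or lane changes, and the jam still appears once density is high enough.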
no subject
Date: 2017-05-12 01:33 pm (UTC)
However, when setting a variable speed limit the efficacy comparison is not with vehicles trying to do 70mph, but with vehicles self-regulating their speed.
Back when variable speed limits were first introduced on the M25, I saw a study that said it did improve the average speed of traffic, but only by about 5km/h. This suggests that to get the full benefit UK law should be changed to allow speed limits to be varied in 5mph increments. A significant obstacle to this, however, is that cars aren't required to have 5mph gradations on their speedometers!
Absent that, average speed checks need to be justified in terms of safety, fuel economy and/or driver stress. Although not implausible, I've never seen an evidence-based justification in those terms.