Gaussian and sieve filters
The output depends on the computation kernel; the recursive median is one. Others include connected set openings ('o'), closings ('c') and composites of these: 'M' ('o' followed by 'c' at each stage) and 'N' ('c' followed by 'o'). These last two yield outputs that are almost indistinguishable from the recursive median. (I have just realised how confusing it is referring to the recursive median as an 'm' sieve, so I shall start using the abbreviation 'v' as much as possible instead.)
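To make the 'o', 'c' and 'M' kernels concrete, here is an illustrative sketch in Python. It uses flat sliding-window (structuring-element) morphology as a simple stand-in for the connected set operators used in the sieves proper, which differ in detail; the function names are my own.

```python
# Sketch of 1-D grayscale opening, closing and the 'M' composite.
# Illustrative stand-in: flat sliding-window morphology rather than
# the connected set operators used in the sieves themselves.

def erode(x, w):
    """Minimum over a sliding window of width w (edge-clipped)."""
    n, r = len(x), w // 2
    return [min(x[max(0, i - r):min(n, i + r + 1)]) for i in range(n)]

def dilate(x, w):
    """Maximum over a sliding window of width w (edge-clipped)."""
    n, r = len(x), w // 2
    return [max(x[max(0, i - r):min(n, i + r + 1)]) for i in range(n)]

def opening(x, w):   # 'o': removes maxima narrower than the window
    return dilate(erode(x, w), w)

def closing(x, w):   # 'c': fills in minima narrower than the window
    return erode(dilate(x, w), w)

def m_stage(x, w):   # 'M': 'o' followed by 'c' at one scale
    return closing(opening(x, w), w)
```

For example, `opening([0, 0, 9, 0, 0], 3)` flattens the one-sample spike to zeros, while `closing([9, 9, 0, 9, 9], 3)` fills the one-sample pit; features wider than the window pass through unchanged.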
This is dull stuff compared to something that has been thought about much less, let alone exploited: a PhD project waiting to happen. Sieves (the recursive median filter is one of what I call sieves) are idempotent. In other words, having made one pass through the data at any particular scale, making another pass through the result changes nothing.
This is not like a linear (diffusion) filter, where repeated passes at the same scale simply smooth the signal away. It means that one could build entirely new filtering schemes; here are some ideas. They begin to look a little biological.
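The idempotency claim is easy to check numerically. Here is a minimal sketch (my own illustrative code, with edge samples replicated) of one pass of a recursive median filter, in which each output draws on the previous outputs as well as the incoming inputs; a second pass over the result leaves it untouched.

```python
from statistics import median

def recursive_median(x, s=1):
    """One pass of a recursive median filter of half-width s
    (window 2s+1): each output uses the s previous *outputs*
    together with the current and next s *inputs*."""
    y = [x[0]] * s + list(x) + [x[-1]] * s   # edge-replicated copy
    for i in range(s, s + len(x)):
        # positions < i already hold outputs; positions >= i still hold inputs
        y[i] = median(y[i - s:i] + y[i:i + s + 1])
    return y[s:s + len(x)]

once  = recursive_median([1, 3, 2, 5, 4, 6])   # -> [1, 2, 2, 4, 4, 6]
twice = recursive_median(once)
print(once == twice)   # True: a second pass changes nothing
```

One pass reaches a "root" signal at that scale, which is exactly the idempotency described above; a plain (non-recursive) median filter generally needs many passes to get there.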
Two twists in the algorithm make it practical
The original implementation is very literal and slow: each stage is run and re-run to idempotency. I called the filter a data-sieve. The twists in this story were, firstly, switching to multiple-pass median filters (Bangham, 1993)<ref>Bangham, J. Andrew, "Properties of a Series of Nested Median Filters, Namely the Data Sieve," IEEE Trans. Signal Processing, vol. 41, no. 1, Jan 1993.</ref>, then being excited enough to look for something faster and better: recursive median filters (Bangham, Chardaire et al., 1996)<ref>Bangham, J. Andrew, "Multiscale nonlinear decomposition: the sieve decomposition theorem," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 18, no. 5, 1996.</ref> and sieves. Each twist was patented; the first was (I think) in 1988. Patents are difficult to produce, and to enforce. I was supported in this by Cambridge Consultants Limited (CCL), who had taken over the company. Fantastic engineers, but at that time the vision community had not appreciated (or we had not noticed) the coming possibilities of SIFT. The scale-invariant feature transform (SIFT) is a computer vision algorithm that detects and describes local features in images; it was published by David Lowe in 1999.
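The nested, multiscale structure behind the data-sieve can be sketched as a cascade: filter at scale 1, feed the result into scale 2, and so on, keeping the difference ("granule") produced at each scale. This is my own illustrative Python, not any of the patented implementations; because the differences telescope, the granules plus the final residue rebuild the input exactly.

```python
from statistics import median

def recursive_median(x, s):
    """One pass of a recursive median filter of half-width s."""
    y = [x[0]] * s + list(x) + [x[-1]] * s   # edge-replicated copy
    for i in range(s, s + len(x)):
        y[i] = median(y[i - s:i] + y[i:i + s + 1])
    return y[s:s + len(x)]

def sieve(x, max_scale):
    """Cascade scales 1..max_scale; return the granule (difference)
    produced at each scale plus the final low-pass residue."""
    stage, granules = list(x), []
    for s in range(1, max_scale + 1):
        nxt = recursive_median(stage, s)
        granules.append([a - b for a, b in zip(stage, nxt)])
        stage = nxt
    return granules, stage

x = [3, 3, 9, 3, 1, 1, 5, 5, 5, 1]
granules, residue = sieve(x, 3)
# Lossless by construction: the differences telescope back to x.
rebuilt = [sum(g[i] for g in granules) + residue[i] for i in range(len(x))]
print(rebuilt == x)   # True
```

Each granule holds the extrema removed at that scale, so the cascade gives a scale-by-scale decomposition of the signal rather than just a smoothed copy.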