H+ magazine published an estimate in 2009 that seems broadly comparable to other figures I've seen; they put the human brain's processing power at approximately 37 petaflops. A supercomputer exceeding that 37-petaflop estimate exists today.
But emulation is hard. See this SO question about hardware emulation, or this article on emulating the SNES, in which accurate emulation requires roughly 140 times the processing power of the original SNES chip to get it right. This 2013 article claims that a second of human brain activity took 40 minutes to emulate on a 10-petaflop computer (a 2400-times slowdown, rather than the roughly 4-times slowdown one might naively expect from the ratio of the two flops figures).
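To make that arithmetic explicit, here is a quick sketch. The figures are just the ones quoted above (the 37-petaflop brain estimate, the 10-petaflop machine, the 40-minute runtime), and the "naive" slowdown simply divides the brain estimate by the machine's flops:

```python
# Sanity check of the slowdown arithmetic quoted above.
# All figures are the ones cited in the text, not independent measurements.

brain_flops = 37e15      # H+ magazine's ~37 petaflop estimate of the brain
machine_flops = 10e15    # the ~10 petaflop computer from the 2013 article

# Naive expectation: slowdown is just the ratio of required to available flops.
naive_slowdown = brain_flops / machine_flops
print(f"naive slowdown: {naive_slowdown:.1f}x")        # ~3.7x, i.e. roughly 4x

# Observed: 40 minutes of wall-clock time per second of brain activity.
observed_slowdown = 40 * 60 / 1
print(f"observed slowdown: {observed_slowdown:.0f}x")  # 2400x

# The gap between the two is the emulation overhead this section is about.
print(f"overhead factor: {observed_slowdown / naive_slowdown:.0f}x")  # roughly 650x
```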
And all of this assumes that neurons are relatively simple objects! It could be that the amount of math required to model a single neuron is actually much larger than the flops estimate above accounts for. Or it could be that dramatic simplifications are possible: if we knew what the brain was actually trying to accomplish, we could do it much more cleanly and simply. (One advantage ANNs have, for example, is that they do their computations with much more precision than we expect biological neurons to have. But this precision mismatch makes emulation harder, not easier, while making replacement easier.)
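To see how sensitive the headline number is to that assumption, here is a toy back-of-the-envelope sketch. The neuron count is the commonly cited ~86 billion; the update rate and the per-update costs of the three model tiers are illustrative placeholders I've chosen, not measured values:

```python
# Toy calculation: how the whole-brain flops requirement scales with
# how expensive we assume a single neuron is to model.
# Neuron count is the commonly cited ~86 billion; everything else here
# is an illustrative placeholder, not a measured value.

NEURONS = 8.6e10
UPDATES_PER_SECOND = 1_000   # assume each neuron model is updated every millisecond

# Hypothetical per-update costs for models of increasing fidelity (flops per update).
models = {
    "point neuron (leaky integrate-and-fire)": 10,
    "single-compartment Hodgkin-Huxley":       1_000,
    "detailed multi-compartment model":        100_000,
}

for name, flops_per_update in models.items():
    total = NEURONS * UPDATES_PER_SECOND * flops_per_update
    print(f"{name}: ~{total:.1e} flops/s ({total / 1e15:.0f} petaflops)")
```

Depending on which per-neuron cost you believe, the whole-brain figure swings by four orders of magnitude, which is the point: the 37-petaflop estimate is only as solid as the assumption that a neuron is cheap to model.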