Recent behavioral studies have given rise to two contrasting models of limited working memory capacity: a "discrete-slot" model, in which memory items are stored in a limited number of slots, and a "shared-resource" model, in which the neural representation of items is distributed across a limited pool of resources. To elucidate the underlying neural processes, we investigated a continuous network model for working memory of an analog feature. Our model network fundamentally operates with a shared-resource mechanism, as stimuli in a cue array are encoded by a distributed neural population. At the same time, the network dynamics and performance are also consistent with the discrete-slot model, because multiple items are maintained by distinct localized patterns of persistent population activity (bump attractors). We identify two phenomena of recurrent circuit dynamics that give rise to limited working memory capacity. As the memory load increases, a localized persistent activity bump may either fade out (so the memory of the corresponding item is lost) or merge with a nearby bump (so the resolution of the mnemonic representation of the merged items is blurred). These two phenomena depend specifically on the strength and tuning of recurrent synaptic excitation. The network also exhibits normalization: the overall population activity is invariant to set size and delay duration, so a constant neural resource is shared by, and dynamically allocated to, the memorized items. We demonstrate that the model reproduces salient observations cited in support of both the discrete-slot and shared-resource models, and we propose testable predictions of the merging phenomenon.
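
To make the mechanism concrete, the sketch below simulates a simplified rate-based ring (bump-attractor) network with tuned recurrent excitation and untuned global inhibition. This is not the model described in the text (which is a full spiking network); all names and parameter values (N, J_E, J_I, kappa, tau, the sigmoid f, and the simulate helper) are illustrative assumptions, and the parameters may need tuning to obtain persistent bumps. It is intended only to show qualitatively how multiple cue-evoked bumps can persist, fade when global inhibition caps total activity, or merge when items are close on the feature ring.

```python
import numpy as np

# Minimal, hypothetical rate-model sketch of a continuous bump-attractor
# network for an analog feature (e.g. color or orientation on a ring).
N     = 256                                    # neurons tiling the feature ring
theta = np.linspace(0.0, 2 * np.pi, N, endpoint=False)  # preferred features
tau   = 0.01                                   # time constant (s)
dt    = 0.001                                  # Euler integration step (s)
J_E, kappa = 10.0, 8.0                         # strength / tuning of recurrent excitation
J_I   = 2.0                                    # untuned (global) inhibition

# Connectivity: local excitation minus uniform inhibition, scaled by 1/N so
# the recurrent drive is a population average.
dtheta = theta[:, None] - theta[None, :]
W = (J_E * np.exp(kappa * (np.cos(dtheta) - 1.0)) - J_I) / N

def f(u, beta=5.0, u0=0.5):
    """Sigmoidal rate function; saturation keeps bump amplitude bounded."""
    return 1.0 / (1.0 + np.exp(-beta * (u - u0)))

def simulate(cue_locations, cue_amp=2.0, cue_dur=0.25, delay=2.0, seed=0):
    """Present brief tuned cues at the given feature values, then run an
    unstimulated delay period; return the synaptic-drive trajectory u(t)."""
    rng = np.random.default_rng(seed)
    u = np.zeros(N)
    steps = int(round((cue_dur + delay) / dt))
    trace = np.empty((steps, N))
    for k in range(steps):
        I_ext = np.zeros(N)
        if k * dt < cue_dur:                   # cue period: one tuned input per item
            for loc in cue_locations:
                I_ext += cue_amp * np.exp(kappa * (np.cos(theta - loc) - 1.0))
        noise = 0.02 * rng.standard_normal(N)  # small background fluctuations
        u += dt / tau * (-u + W @ f(u) + I_ext + noise)
        trace[k] = u
    return trace

# Two nearby items plus one distant item: with larger set sizes or closer
# items, global inhibition can extinguish a bump (forgetting) or two bumps
# can coalesce (merging), while total activity stays roughly constant.
u_t = simulate(cue_locations=[1.0, 1.8, 4.5])
print("active fraction at end of delay:", (f(u_t[-1]) > 0.5).mean())
```

In this kind of sketch, the untuned inhibitory term acts as the shared resource: because it is driven by total population activity, adding items reduces the drive available to each bump, which is one simple way to obtain set-size-invariant overall activity together with item loss and merging at high load.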