In Lp-spaces with p ∈ [1, ∞) there exists a best approximation mapping to the set of functions computable by Heaviside perceptron networks with n hidden units; however, for p ∈ (1, ∞) such best approximation is not unique and cannot be continuous.

Keywords. One-hidden-layer networks, Heaviside perceptrons, best approximation, metric projection, continuous selection, approximatively compact.
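For reference, a standard formalization of the approximating set is sketched below; the notation (the Heaviside function ϑ, the set H_n, and the domain [0,1]^d) is an assumption for illustration and is not fixed by the abstract itself.

% Sketch (requires amsmath). Assumed notation: \vartheta is the Heaviside
% function and \mathcal{H}_n is the set of functions on [0,1]^d computable
% by Heaviside perceptron networks with n hidden units.
\[
  \vartheta(t) =
  \begin{cases}
    1, & t \ge 0,\\
    0, & t < 0,
  \end{cases}
  \qquad
  \mathcal{H}_n = \Bigl\{\, \sum_{i=1}^{n} w_i \,\vartheta(v_i \cdot x + b_i)
      \;:\; w_i, b_i \in \mathbb{R},\ v_i \in \mathbb{R}^d \,\Bigr\}.
\]
% Existence claim, in this notation: for every f in L_p([0,1]^d) with
% p in [1, infinity), there is some g in \mathcal{H}_n attaining
% \| f - g \|_p = \inf_{h \in \mathcal{H}_n} \| f - h \|_p.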
Paul C. Kainen, Věra Kůrková, Andrew Vogt