A wireless network architecture is studied in which edge nodes (ENs), such as base stations, are connected to a cloud processor by dedicated fronthaul links, while also being endowed with caches in which popular content, such as multimedia files, can be proactively stored. Cloud processing enables the centralized implementation of cooperative transmission by the ENs, albeit at the cost of increased latency due to fronthaul transfer. In contrast, edge caching allows for the low-latency delivery of cached files, but with generally limited cooperation among the ENs. The interplay between cloud processing and edge caching is investigated from an information-theoretic viewpoint through the fundamental limits of a metric, termed the normalized delivery time (NDT), which captures the worst-case latency for delivering any requested content to the users. Lower and upper bounds on the NDT are derived, yielding insights into the trade-off among cache storage capacity, fronthaul capacity, and delivery latency.
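One common way to formalize such a latency metric in the high-SNR regime, sketched here under assumed notation (fractional cache size $\mu$, fronthaul capacity $C_F$, file size $L$ bits, signal-to-noise ratio $P$; these symbols are illustrative and not taken from the text above), is to normalize the worst-case delivery time $T$ by that of an ideal interference-free baseline:

```latex
% Hedged sketch of an NDT-style definition; all symbols are assumed notation.
\[
  \delta(\mu, C_F) \;=\; \lim_{P \to \infty} \; \limsup_{L \to \infty} \;
  \frac{T}{L / \log P}
\]
% An ideal baseline delivers an L-bit file at roughly \log P bits per
% channel use, so \delta = 1 corresponds to interference-free delivery,
% while larger \delta quantifies the multiplicative latency overhead.
```

Under such a normalization, the lower and upper bounds mentioned above become bounds on $\delta$ as a function of the cache and fronthaul resources.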