Smart grids are the next generation of power distribution networks, using information and communications technologies to increase the overall energy efficiency and service quality of the power grid. A significant challenge in smart grid development is the rapidly rising number of smart devices and the associated load they place on the backbone communication infrastructure. This paper presents an Internet-of-Things smart grid testbed simulator that provides crucial insight into communication network optimization. Simulations of large numbers of smart devices under various heterogeneous network topologies are used to determine the maximum number of clients supportable under a given demand-response latency requirement. This latency includes all protocol overheads, retransmissions, and traffic congestion; simulator processing time is eliminated from the final delay calculation via data post-processing. For a specific three-tier topology and a given round-trip latency requirement, the effects of the number of smart devices per local hub and the overall number of local hubs on network performance are analyzed, and crucial design insights are drawn relevant to cost-efficient network deployment.
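The post-processing step that strips simulator processing time from the measured round-trip delay could be sketched as follows. This is a minimal illustration only: the field names (`send_ts`, `recv_ts`, `sim_proc_time`) and the assumption that the simulator logs its own per-event processing time are hypothetical, not taken from the paper.

```python
# Hypothetical sketch: remove simulator processing overhead from measured
# round-trip latencies in post-processing, so only network-level components
# (protocol overhead, retransmissions, congestion) remain.

def effective_latencies(samples):
    """For each logged event, subtract the simulator's own processing
    time from the wall-clock round-trip delay."""
    latencies = []
    for s in samples:
        measured = s["recv_ts"] - s["send_ts"]   # wall-clock round trip (s)
        latencies.append(measured - s["sim_proc_time"])
    return latencies

# Example: three illustrative logged events (times in seconds)
samples = [
    {"send_ts": 0.000, "recv_ts": 0.180, "sim_proc_time": 0.030},
    {"send_ts": 0.000, "recv_ts": 0.240, "sim_proc_time": 0.040},
    {"send_ts": 0.000, "recv_ts": 0.150, "sim_proc_time": 0.020},
]
latencies = effective_latencies(samples)
worst_case = max(latencies)  # compare against the demand-response bound
```

The worst-case value of the corrected latencies is what would be checked against the demand-response requirement when searching for the maximum supportable number of clients.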