Most previous studies on neural news headline generation trained encoder-decoder models on the first sentence of a document paired with its headline. However, the first sentence alone may not provide sufficient information. We propose using the topic sentence, rather than the first sentence, as the input for the neural news headline generation task. The topic sentence is regarded as the most newsworthy sentence in a document and has been studied in prior work. Experimental results show that a model trained on topic sentences generalizes better than one trained only on first sentences. In certain cases, training on both the first and topic sentences improves performance even further over training on the topic sentence alone. We conclude that using the topic sentence, while keeping the input as short as possible, is a preferable strategy for providing the network with more informative input than the first sentence alone.