Summers are warming in the west when measured by extreme temperatures (the highest temperature recorded each month). Average temperatures are not what we want, since nighttime temperatures are not the issue. We can't use average temperatures anyway because of the time-of-observation bias, which makes it look like temperatures are dropping when they are actually rising. In a nutshell: observers used to go out and reset the min/max thermometer manually every day in the early evening. That meant the max recorded for the next day could actually be the max from the current day, if the next day was cooler. After stations switched to morning resets, the min could be double-counted instead, if the next day's min was warmer. Finally, with the switch to electronic min/max sensors there is no such bias, but unfortunately the new data are incompatible with the old data. So instead I simply find the max temperature for each month. Whether an extreme gets credited to one day or the adjacent day doesn't change the maximum for the month (month boundaries aside), so the time-of-observation issue is moot.
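
To make that concrete, here is a minimal sketch of the monthly-max reduction in Python. It is my illustration, not the code from the linked site, and the file name and column names (DATE, TMAX) are assumptions modeled on NOAA's daily-summary CSV format:

    import pandas as pd

    # Hypothetical input: a NOAA-style daily summary file with DATE and TMAX
    # columns (an assumption; the real export layout may differ).
    daily = pd.read_csv("station_daily.csv", parse_dates=["DATE"])

    # Keep only the single highest reading per calendar month. Crediting a
    # max to the wrong adjacent day doesn't change the monthly maximum, so
    # the time-of-observation bias drops out (month boundaries aside).
    monthly_max = daily.set_index("DATE")["TMAX"].resample("MS").max()

    print(monthly_max.head())
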
Hotter days out west are a problem, driving for example higher rates of evapotranspiration and faster loss of soil moisture. That's the main reason the average drought (which starts and ends naturally) is more severe than the average drought of years ago. The trend is up to a 4F rise per century (with a lot of natural variation) in Utah:

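For reference, here is a minimal sketch of how a per-century trend like the one above could be estimated. Fitting a least-squares line to one month's extremes is my assumption about the approach, not something taken from the linked code:

    import numpy as np
    import pandas as pd

    # Load the monthly extremes produced earlier (hypothetical file name).
    monthly_max = pd.read_csv("monthly_max.csv", parse_dates=["DATE"],
                              index_col="DATE")["TMAX"]

    # One extreme per year for a single month, e.g. July.
    july = monthly_max[monthly_max.index.month == 7]
    years = july.index.year.to_numpy(dtype=float)

    # Ordinary least-squares slope in degrees F per year, scaled to a century.
    slope_per_year = np.polyfit(years, july.to_numpy(), 1)[0]
    print(f"Trend: {slope_per_year * 100:+.1f} F per century")
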
In the east it's a different story: summer extremes are dropping. You would not know that from reading the news, but news sources are usually ASOS sensors at airports. Those run hot, especially in hot weather, and as we know they sit right next to the runway. Here's a rural Maine station with a 5F drop per century in July:

Here are all the stations I analyzed, with a link to the data and code:
https://virtualcoinclub.com/wx/temp/. The data came from here:
https://www.ncdc.noaa.gov/cdo-web/search. I did not cherry-pick any data. I simply sorted the station list by oldest starting date and looked for stations that continue to the present day with 98% or more of the data present (although I prefer 99% or 100%). The first station meeting that criterion, with the list sorted by oldest starting date, was the one I downloaded and plotted.
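
As an illustration of that selection rule, here is a minimal sketch. The inventory file and its column names are hypothetical stand-ins for what the CDO search tool shows in its interface, not the actual workflow:

    import pandas as pd

    # Hypothetical station inventory with start date, end date, and percent
    # of data present (the real CDO tool exposes these fields in its UI).
    stations = pd.read_csv("stations.csv", parse_dates=["START_DATE", "END_DATE"])

    this_year = pd.Timestamp.now().year
    candidates = (
        stations[(stations["COVERAGE_PCT"] >= 98)                    # nearly complete record
                 & (stations["END_DATE"].dt.year >= this_year - 1)]  # continues to the present
        .sort_values("START_DATE")                                   # oldest record first
    )

    # Take the first station that qualifies; no cherry-picking beyond the rule.
    print(candidates.iloc[0])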