It seems they've fallen head over heels for the new cool guy in the White House, but they're not yet ready to commit to a long-term relationship with the rest of the American family.
I've always thought that views of the US have improved modestly but remain predominantly negative.
I do believe the perception of the US in the world has changed dramatically since Bush's departure. This is not surprising, of course, since before the Iraq debacle we were a very popular country; we are simply returning to normalcy. Some think the world hates us for our wealth and influence, but that isn't true. Only when we use that power to act like douchebags do they get mad at us.