The paper [1] is linked in the OP. Section 3, the section invalidated by the errata, looks at data on voter preferences. So the answer to your question is "a lot", assuming that you think that "using data on the beliefs of voters" is in some way important to a paper on electoral strategy.
This was a prominent mark of the Bush era too (and I'm sure of many other Democratic and Republican administrations; it's just my example). I was rewatching Fahrenheit 9/11 because of our government's current foreign engagements, and it's striking how even then people were duped. Back then it was by a lack of information; now it's by overinformation.
FWIW, I don't think this is exclusive to the Trump era. People have always believed whatever they wanted to believe, facts and accuracy be damned. IMHO, understanding this as an inescapable part of the human condition makes it much easier to understand the world.
What actually matters is that people can be persuaded to _want_ to believe different things, so the only real leverage is in shaping those wants, not in being right.
Also, if you can get people to connect intensely with just one or two of your statements, you can then make other false statements and they won't care, because acknowledging those lies might invalidate the one they really want to believe. The threshold is so low that the shotgun approach of just telling lies continually works quite well. Your statements don't even have to be consistent, so you can A/B test the lies.
The normal backpressure to this is that you lose face with your peers because you become known as a liar. But if your peers don’t influence your success, or you just have no peers, it works.