These psyop tactics apply to far more than politics. They can be used in any confrontational situation that involves "side-taking", especially blind-oath types of dealings.
The COVID pandemic, like politics, is an easily divisive issue, especially in places like the United States, where public discussion happens largely in the open on social media (unlike countries without free public speech on political issues).
Lines are already drawn and opinions already chosen (as in politics): the individual "takes sides" and plays the issue like a sport. You have your team (your beliefs or stance), and you fight for whatever narratives that team pushes.
Narratives are created and nudged by the wielder of the psyop operation(s). When narratives are already apparent, these situations are easier and quicker to manipulate. Social media is easy-peasy land for deploying, implementing, and manipulating narratives.
Below is my response to someone who commented on the above article. Disruptive actors can "play" both sides easily, especially when their efforts are focused. Generally, the easier marks will take the bait and follow narratives that demand nothing more than their loyalty. I have no idea whether this poster is an actor, but his response closely resembles the roles such actors play.
Of course, "normal" discourse and conversation often contain the very same elements. The difference is the dedicated effort to sow, spread, and reap this discourse. Far fewer ordinary individuals will have these types of interests than a motivated (i.e., paid or conscripted) bad state actor.
Nationalism displayed by these bad actors will often feed into these narratives. Their dedication to their own nations, at least among the better-experienced actors and programs, will rarely show except when nudging opinions in their own state's favor, yet even that can be a giveaway to researchers tracking who is doing what. Both humans and machines can blow their roles if their actions are not carefully reviewed and openly challenged. In the future, machines operating via AI will become far better than humans at these types of deception.
Now take the above and bring it down-home political (the above is internationally based). Here is a widely circulated social media meme that plays well into these sow-discord situations, and it does so for BOTH "sides":
This type of post is exactly the type these foreign actors devise: combative and seemingly toeing a party line by using insults. They are a bit more difficult to spot on some social media, but generally, if you follow their posting history, ALL troll accounts do is post divisive insults, mostly along lopsided party lines. Twitter and Zuck's BBS are fairly easy; the trolls often display extremely patriotic memes and avatars alongside the aforesaid combative, negative posts.
The above would appeal to both the "left" (often mislabeled as liberals) and the "right" (often mislabeled as conservatives). It builds dark comedy on/for the left, and anger (plus stoic labeling, via the flag and cult-of-personality elements) on/for the right. This is a good example of effective bipartisan trolling.
If you've found yourself drawn to either of the narratives above, treating them as points of interest in your mindset and taking sides out of "support" for your "side," congratulations: you're probably an easy mark for online psyops.
These two examples are plain ones. The narratives are simple (the first: China is bad; the second: Orange Man/Southern Man bad), and they contain enough subjective "truth" to appeal, on some level, even to highly intelligent people.
Rise above those levels.
This was part of the findings of a seven-year underground investigation into online trolling. The full article will appear in Issue 71 of The Underground Sound, out before year's end at http://undergroundrecords.org/sound