You could do this at several levels. One is the information level: searching for, and exploiting, any weaknesses in the other AI's interfaces (think buffer overflows), while each AI simultaneously strives to patch those same weaknesses in itself. This would include crafting tailored computer viruses.
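To make the "buffer overflow" idea concrete, here is a minimal sketch in C of the class of weakness meant. The handler names and the 64-byte buffer are illustrative assumptions, not taken from any real AI interface:

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical message handler: a toy illustration of the kind of
       interface weakness described above. The fixed 64-byte buffer is a
       made-up detail; any fixed-size buffer filled with unchecked input
       exhibits the same flaw. */
    void handle_message(const char *input) {
        char buf[64];
        strcpy(buf, input);   /* no length check: input longer than the
                                 buffer overruns buf and can corrupt
                                 adjacent stack memory, including the
                                 return address */
        printf("received: %s\n", buf);
    }

    /* The patched counterpart a defending AI would deploy: bound the
       copy and guarantee null termination. */
    void handle_message_safe(const char *input) {
        char buf[64];
        strncpy(buf, input, sizeof(buf) - 1);
        buf[sizeof(buf) - 1] = '\0';
        printf("received: %s\n", buf);
    }

An attacker hunts for handlers of the first kind; a defender races to rewrite them into the second kind before the rival finds them.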
Then one AI could map out and attempt to take out the other AI's support infrastructure: backups, power stations, and yes, possibly critical spots in the rival's physical architecture; for example, trying to remotely hijack a plane and crash it into the other AI's main switching station. Another possibility is to attack the digital layer of the rival's infrastructure, a sort of Stuxnet approach.
Another possibility would be to recruit allies. One of the AIs could try to manufacture evidence that the other AI is trying to take over the world, import Skynet, create an Armageddon gravitational singularity, resurrect the Antichrist, engage in unstoppable nanotechnological biowarfare, and so on, then "sell" this evidence to the appropriate groups to elicit violent action aimed at eliminating the rival AI.
A crossover between options 2 and 3 could be to manufacture evidence that demands a tactical nuclear strike against the other AI's central installation (if one exists), e.g. portraying the rival AI as the master control computer behind the development of a credible and devastating bioweapon.
Depending on the AI's location and situation, other scenarios exist. For example, if the rival AI were being developed by some private institution, hiring mercenaries to attack that institution could be a possibility. Another would be manipulating the stock market until the AI's own worth (held through several cover companies) exceeded the market capitalization of the rival institution, then buying it out and installing a CEO with a mandate to stop all AI research and delete any prototypes.
Another way to satisfy the "THERE CAN BE ONLY ONE" directive would be for one of the AIs to convince the other to fuse with it. An AI's view of individuality might not be the same as ours.