NAT, properly configured, will cause no problems at all (ensure the router accepts pings from the WAN as well). For port forwarding to work properly and reliably, you must have a static LAN configuration (technically you should for the DMZ as well). Never use the DMZ and port forwarding on the same machine. Port triggering, by design, will not work reliably with DirectPlay 7.
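If you want to sanity-check a port forward, the simplest thing is a TCP connect probe against your WAN address from a machine outside your LAN. A minimal sketch follows; the address and port are placeholders, not values from this post, and the actual DirectPlay ports depend on your setup:

import socket

def port_open(host, port, timeout=3.0):
    # True if a TCP connection to host:port succeeds within the timeout.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Run this from OUTSIDE your LAN, aimed at your WAN address and the forwarded port.
print(port_open("203.0.113.10", 2300))  # placeholder address and port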
While properly configured routers are not a problem, unfortunately most people do not know how to configure a router and seem unwilling to read the manual to find out (configuring a static address outside the DHCP pool range, disabling UPnP, etc.). It is a shame that routers have become so commonplace, though they are not a problem for those who understand the details of network configuration.
Software firewalls have been shown, through experience with multiplayer SFC play, to be unstable. Most users will not configure them to allow pings, which are necessary. Software firewalls introduce another unnecessary and potentially (usually) buggy layer in the TCP/IP stack of your machine. Most work by creating a virtual NIC (whether you can see it or not) and offer very little control over the actual NAT that is taking place.
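If you want to check the ping requirement yourself, shelling out to the system ping is enough. A rough sketch (the target address is a placeholder; exit code 0 means at least one echo reply came back):

import platform
import subprocess

def responds_to_ping(host, count=3):
    # Windows ping uses -n for the count; most other systems use -c.
    flag = "-n" if platform.system() == "Windows" else "-c"
    result = subprocess.run(["ping", flag, str(count), host],
                            stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    return result.returncode == 0

print(responds_to_ping("203.0.113.25"))  # placeholder address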
A number of users have reported significantly increased mission stability without the use of firewalls (Karnak, DieHard, myself...).
Before software firewalls became so prevalent, we never had the mission stability issues we have now. Coincidence? I think not.
I stopped using ZoneAlarm many years ago after I traced it to being the cause of mission instability in OP. The instability varied from ZoneAlarm version to version, indicating the problem was on their end. Karnak and DieHard both had similar experiences with the Windows XP firewall.
If a report of software firewall use is received while the suspect user is still online, it is not too hard to find out if they actually are running a software firewall.
I'll add more to this post with edits but thought I'd get this in now, considering your comments...
edit: ok, smoke break over... back to explaining why software firewalls are bad for networks and network applications, and why they actually provide a false sense of security:
In two different professional settings I have experienced significant problems with software firewalls...
1) I spent weeks trying to troubleshoot a backup application on an NT4 network. After much pain I finally tracked it down to a BlackIce install on the domain controller. This cost a lot of time and money. Not only did it cause problems for running backups over the network, it did not protect against an earlier worm infection on the network, caused by a user who opened an unknown attachment, that the fully updated virus software detected but could not stop.
False security. After discussions with BlackIce support, they admitted that the current version of their firewall could not cope with allowing the proper operation of this backup application over the network...
OK, that was case #1. (actually I just thought of a third...)
2) While running a webserver on a complicated intranet at another job site, the server was inexplicably invisible from certain network segments. Again, after much pain and agony, I tracked the trouble down to a ZoneAlarm install on the webserver. I uninstalled ZoneAlarm and, presto, everyone could see the webserver again. ZoneAlarm, as far as it was concerned, was properly configured, yet it was definitely the cause of the problem. Again, this cost a lot of money in valuable man hours.
3) A data acquisition system, responsible for collecting data worth tens of thousands of dollars a day to the company, was crippled by an unknown source. Again, after much troubleshooting and diagnostics (right down to powerline monitoring for the entire facility), a McAfee install with a firewall component was determined to be the cause. It was interfering with serial communications when it should not have been. McAfee was at a loss for an explanation, but it was definitely the problem. This was probably the most expensive case of software firewall trouble I have come across: it cost the company about $37,000 when all was said and done, in lost productivity and in the additional cost to the instrument manufacturer, who sent out a technician to help troubleshoot the system under a service contract. Also, in this case, that same firewall failed to protect against a worm infection. More false security. Very expensive false security.
I have more coming...
OK, now think about what is happening: latency is extremely critical in multiplayer gaming applications. The multiplayer game itself is often very processor intensive... now introduce a software firewall that relies on a virtual network adapter and NAT running on that same processor. The processor is busy with game data and sends packets to the network expecting them to go out unimpeded... but wait, that software firewall needs the processor to process the NAT at the virtual network adapter (which is not under your control at all, by the way)...
Now you say, "but I get good latency numbers when I test with the firewall up". Aha, say I, but are you running a processor intensive game while said tests are underway? No.
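To make the test concrete, here is a rough sketch of what I mean: time a simple TCP round trip while the machine is idle, then again while busy loops are chewing the processor the way a game would. The target address and port are placeholders, and this is only an illustration of the idea, not a proper benchmark:

import socket
import threading
import time

def tcp_rtt_ms(host, port, samples=10):
    # Average time in milliseconds to complete a TCP connect to host:port.
    total = 0.0
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=3.0):
            pass
        total += time.perf_counter() - start
    return total * 1000 / samples

def busy_loop(stop):
    # Stand-in for a processor-intensive game: spin until told to stop.
    while not stop.is_set():
        sum(i * i for i in range(10000))

host, port = "203.0.113.10", 80  # placeholders
print("idle  :", round(tcp_rtt_ms(host, port), 1), "ms")
stop = threading.Event()
workers = [threading.Thread(target=busy_loop, args=(stop,)) for _ in range(4)]
for w in workers:
    w.start()
print("loaded:", round(tcp_rtt_ms(host, port), 1), "ms")
stop.set()
for w in workers:
    w.join()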
OK, so the software firewall manufacturer is clearly making assumptions about what you are doing with your connection. They cannot possibly account for all applications; not one of them charges nearly the amount of money that would take.
Hardware firewalls are a different matter (more to come)... "Hardware" firewalls and routers have actual hardware network interfaces to perform their NAT and packet filtering with, and they are not hampered by other tasks (like running a processor-intensive game); they are purpose-built for that job and do nothing else. Most run an embedded OS like VxWorks with firmware written by or for the hardware manufacturer. They are not invulnerable to failure, but because they have the hardware and purpose-designed software for the job, they are far more reliable, though bugs and issues do occur that can require regular reboots of the device or frequent firmware updates.
False security: some firewalls can actually introduce vulnerabilities into your system that did not previously exist. If you have ever monitored firewall activity or hacking attempts, you will find they increase exponentially with porn, warez, and P2P app usage. Playing a game does not draw this negative attention and does not really require the use of a firewall.
(more to come)
Keeping your OS up to date and using common sense in configuring the network is the best protection there is. Generally, all that is required to connect to the internet is a minimal TCP/IP configuration on the client machine. The more minimal the better (no Windows file sharing, no NetBIOS, etc.).
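A quick self-check you can script: see whether the usual file sharing ports are even listening on your own machine. A minimal sketch that only covers the TCP ports (the NetBIOS name and datagram services on UDP 137/138 need a different check) and only looks at the local machine, not at what your router exposes:

import socket

TCP_PORTS = {139: "NetBIOS session", 445: "SMB over TCP"}

def listening(port):
    # True if something on this machine accepts TCP connections on the port.
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(1.0)
    try:
        return sock.connect_ex(("127.0.0.1", port)) == 0
    finally:
        sock.close()

for port, name in TCP_PORTS.items():
    print(port, name, "LISTENING" if listening(port) else "closed")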
DSL (PPPoE) can be a bit of a problem and complicate such matters, as it introduces another layer of NAT that is not completely under your control. People often use an ISP-provided PPPoE client even though the one native to the OS is usually superior in functionality and security.
OK, now to address the question/rebuke: "But Battlefield 1942, Quake2004, and Eve work flawlessly with my software firewall and router, so SFC netcode just sucks!" ... (smoke break... this rant is tiresome)
To put it simply, those other games only require that the client be able to communicate with the server, which is usually known before the main game code starts. In Quake-like games, the client sends and receives all data through the game server. You are never in direct communication with other players.
In SFC, you must be able to connect to the game server, which, as in other games, is determined from a server list and known in advance of the main game code running. SFC runs a (now inadequate) firewall detection at the server list, not because it needs to for communication with the server, but rather to ensure that you will be able to communicate reliably with all the other players on the server (which cannot be known in advance) independently of the server.

Once a multiplayer Dynaverse mission launches, you are not talking to the server anymore but only to the other players in the mission. In order for the mission to start, you must be able to connect to these players and maintain uninterrupted communication with them, as the server is no longer managing that communication. When in a multiplayer mission, the host is actually the server that you are connected to, and the other players are clients. I cannot think of any other games that work like this; SFC is unique in this respect. It is efficient and greatly reduces bandwidth and processor load on the Dynaverse server, but it was conceived before software firewalls were commonplace.
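The practical consequence: every player must be able to reach the mission host directly, not just the Dynaverse server. A toy sketch of that kind of peer check, one side listening and the other connecting (plain TCP on a placeholder port, not actual DirectPlay traffic):

import socket

PEER_PORT = 2300  # placeholder port, not taken from the game

def host_side():
    # Mission host: accept one direct connection from a joining player.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("0.0.0.0", PEER_PORT))
        srv.listen(1)
        conn, _addr = srv.accept()
        with conn:
            conn.sendall(b"hello from the mission host\n")

def client_side(host_addr):
    # Joining player: if this direct connection fails, the mission cannot start.
    try:
        with socket.create_connection((host_addr, PEER_PORT), timeout=5.0) as sock:
            return bool(sock.recv(64))
    except OSError:
        return False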
If we ever get the client source, I'd like to see a more complete software firewall detection implemented to greatly increase the stability of multiplayer missions, thus improving the reputation of a game that is much maligned due to poor understanding.
Phew... I think I might be done... maybe....