The organisation I work for has done something similar.
In our case, we roughly followed these principles:
1. The client app has its own database, which starts out as a copy of the main server database
2. The server app also has its own database
3. The client app can export its data into a json file, which can be imported on the server app
4. The server app can export its data into a json file which can be imported on the client app
5. The end users are the ones who initiate the sync process, so that control rests with them, but it's easy to automate this sync.
6. The json import logic on the client and server is then able to deal with duplicate data
Be on the lookout for autogenerated primary keys; those can be a nasty little demon to deal with when syncing data, because the client and server will independently hand out the same ids. In fact, I would go out on a limb and suggest you don't use autogenerated primary keys but instead use something collision-free, such as UUIDs.
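For example, with Python's standard `uuid` module each machine can mint its own keys at insert time with no coordination (a sketch; your app generates these wherever rows are created):

```python
import uuid

def new_key():
    # A random (version 4) UUID: client and server can both create rows
    # offline without ever colliding, unlike an auto-increment counter
    # that restarts at the same value on each machine.
    return str(uuid.uuid4())
```

Store the key as a plain string (36 characters) or as 16 raw bytes if you care about index size.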
Also, MySQL supports Master-Master replication, where data flows both ways. But of course it assumes that the two computers can communicate directly, i.e. public IP addresses on both (unless networking gurus have another way).
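If you do go the Master-Master route and keep auto-increment keys, MySQL can interleave them so the two servers never mint the same id. A rough sketch of the relevant `my.cnf` settings, assuming exactly two servers (paths and ids illustrative):

```ini
# my.cnf on server A
[mysqld]
server-id                = 1
log_bin                  = mysql-bin
auto_increment_increment = 2   # both servers step ids by 2
auto_increment_offset    = 1   # server A issues 1, 3, 5, ...

# on server B: server-id = 2, auto_increment_offset = 2 (issues 2, 4, 6, ...)
```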