Socket Programming (Problem)
Hello,
I have built a class that uses sockets to both connect and receive data. I wanted my classes to only accept connections from a known application, so I added another class ("ValidationClass") whose only job is to open a connection to the remote host and send a string to be validated. The other side then checks the received data and accepts or rejects the connection.
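Roughly, the handshake I mean looks like this (a simplified sketch, not my exact code; the class name, token, and "ACCEPTED"/"REJECTED" replies are just illustrative):

```java
import java.io.*;
import java.net.*;
import java.nio.charset.StandardCharsets;

// Illustrative sketch of the validation handshake described above.
public class ValidationClass {
    private static final String SECRET = "my-known-app"; // placeholder validation string

    // Client side: connect, send the validation string, wait for the verdict.
    public static boolean validateWithServer(String host, int port) throws IOException {
        try (Socket socket = new Socket(host, port);
             BufferedWriter out = new BufferedWriter(
                 new OutputStreamWriter(socket.getOutputStream(), StandardCharsets.UTF_8));
             BufferedReader in = new BufferedReader(
                 new InputStreamReader(socket.getInputStream(), StandardCharsets.UTF_8))) {
            out.write(SECRET);
            out.newLine();
            out.flush();
            // Block until the server replies with its verdict.
            return "ACCEPTED".equals(in.readLine());
        }
    }

    // Server side: read the string, then accept or reject the connection.
    public static void serveOnce(int port) throws IOException {
        try (ServerSocket server = new ServerSocket(port);
             Socket client = server.accept();
             BufferedReader in = new BufferedReader(
                 new InputStreamReader(client.getInputStream(), StandardCharsets.UTF_8));
             BufferedWriter out = new BufferedWriter(
                 new OutputStreamWriter(client.getOutputStream(), StandardCharsets.UTF_8))) {
            String received = in.readLine(); // blocks until a full line arrives
            out.write(SECRET.equals(received) ? "ACCEPTED" : "REJECTED");
            out.newLine();
            out.flush();
        }
    }
}
```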
However, I have a problem. When I run everything normally, my server application never tells me whether the connection has been accepted or not. But when I DEBUG my client-side application, the server side reports the correct result (whether it was accepted or not).
The only reason I can think of for it working under DEBUG is that stepping through with F10 gives the connection enough time to be established before the validation string is sent!
However, I don't know how to make sure the server side is fully initialized before the client side starts the handshake (or vice versa) :(
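What I think I need on the client is to block (or retry) until the connection is really established before sending anything, instead of kicking off the connect and sending straight away. A rough sketch of the retry idea (again illustrative, not my code; names and timeouts are made up):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

// Illustrative: retry the connect until the server is listening,
// so the client never sends before the server is ready.
public class ConnectWithRetry {
    public static Socket connect(String host, int port, int attempts, long delayMillis)
            throws IOException, InterruptedException {
        IOException last = null;
        for (int i = 0; i < attempts; i++) {
            Socket socket = new Socket();
            try {
                // connect() with a timeout blocks until the TCP handshake
                // completes or fails, so we know the server really accepted.
                socket.connect(new InetSocketAddress(host, port), 1000);
                return socket;
            } catch (IOException e) {
                socket.close();
                last = e;
                Thread.sleep(delayMillis); // give the server time to come up
            }
        }
        throw last;
    }
}
```

Is something like that the right approach, or is there a proper way to synchronize the two sides?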
Does anyone have any ideas?
Thanks in advance for any help you may provide