C++ - timeout and select() not working as expected


I'm trying to communicate with a server using the RCON protocol to take control of a game server. So far I've been using an existing C# library, but that library is quite buggy, so I'm writing my own application in C++ that I'm able to use both on Windows and on a Linux server.

So far it has worked out pretty well, but I'm running into problems when I try to set a timeout using select() to find out if the server is still up and responding to commands. During the first run of the application select() returns the right value, but after closing it and running it again, the results are weird.

My code pretty much looks like this:

#include <stdio.h>
#include <iostream>
#include <iomanip>
#include <sstream>
#include <string.h>
#include <strings.h>   // bzero
#include <unistd.h>
// socket includes
#include <sys/types.h>
#include <sys/socket.h>
#include <netinet/in.h>
#include <arpa/inet.h>

// other includes
#include "main.h"
#include "crc32.h"

int main() {
    struct sockaddr_in server;
    int mysocket, slen = sizeof(server);
    char buffer[2048];

    if ((mysocket = socket(AF_INET, SOCK_DGRAM, IPPROTO_UDP)) < 0)
        printf("error creating socket");

    memset((char *) &server, 0, sizeof(server));
    server.sin_family = AF_INET;
    server.sin_addr.s_addr = inet_addr("xxx.xxx.xxx.xxx");
    server.sin_port = htons(1234);

    if (connect(mysocket, (sockaddr*) &server, sizeof(server)) < 0)
        printf("error connecting socket");

    bool isconnected = false;
    /*
    creating the packet to send, cut to save space
    */
    sendto(mysocket, loginpacket.c_str(), loginpacket.length(), 0, (sockaddr *) &server, slen);

    while (1) {
        // clearing the buffer
        bzero(buffer, 2048);

        // timeout settings
        int selectsize = 0;
        struct timeval timeout;
        timeout.tv_sec = 5;
        timeout.tv_usec = 0;
        fd_set fds;
        FD_ZERO(&fds);
        FD_SET(mysocket, &fds);

        selectsize = select(mysocket + 1, &fds, 0, 0, &timeout);
        std::cout << "size is: " << selectsize << std::endl; // testing :)

        if (selectsize == 1) {
            int recvlength = recvfrom(mysocket, buffer, sizeof(buffer), 0, (sockaddr *) &server, (socklen_t*) &slen);
            if (buffer[7] == 0x00) {
                if (buffer[8] == 0x01) {
                    // password correct
                    isconnected = true;
                    break;
                }
                if (buffer[8] == 0x00) {
                    // password wrong, or sth.
                }
            }
        }
    }
    // tests
    sayhello();
    close(mysocket);
    return 0;
}

When I start the application for the first time and the server is not started, it works as expected: after 5 seconds selectsize returns the value 0. It loops on like that until I start the server, then the return value is 1 and it breaks out of the while loop. Afterwards I quit the application, turn off the server and start the application again. But instead of a return value of 0 after 5 seconds, it returns the value 1 (even though the server is offline and there's no packet to receive), and only after that does it return 0 after 5 seconds. Running the same code (with adjustments) on Windows gave me pretty much the same result: selectsize pretty much always returned 1, even though the server was offline and the value should have been 0. I have read tons of sites about using select() the right way, but none of them helped me so far; select() is not returning reliable results after quitting and restarting the application.

P.S.: There is a lot of confusing information about how to use select() the right way, so I tried pretty much every solution provided on the internet, like:

selectsize = select(mysocket + 1, &fds, 0, 0, &timeout);
selectsize = select(0, &fds, 0, 0, &timeout);
selectsize = select(1, &fds, 0, 0, &timeout);

but none of them gave me a reliable result.

