Richard B. Johnson dixit:
> > /* ****************************** */
> > if (select(getdtablesize(), &socklist, NULL, NULL, NULL) < 0)
> > {
> > if (errno != EINTR) perror("WeapTerrain");
> > continue;
> > }
> select() takes a file-descriptor as its first argument, not the
> return-value of some function that returns the number of file-
> descriptors. You cannot assume that this number is the same
> as the currently open socket. Just use the socket-value. That's
> the file-descriptor.
Not at all. 'select()' takes a *number of file descriptors* as
its first argument, meaning how many descriptors to check: it examines
only the first N descriptors (0 through N-1), where 'N' is that first
argument, so N should be the highest descriptor in your sets plus one.
Often that first argument is just FD_SETSIZE, but the result of any
function returning a number is fine there if you know the return value
covers all the descriptors you put in the sets.
If, for example, FD_SETSIZE is set to UINT_MAX but
getdtablesize() returns 100 ('ulimit' comes to mind), it's a good idea
to use the return value of that function rather than scan descriptors
that can't even exist. Anyway, IMHO it is better to use FD_SETSIZE.
See the glibc info for more references.
Bye and happy coding :)
Raúl Núñez de Arenas Coronado
--
Linux Registered User 88736
http://www.pleyades.net & http://www.pleyades.net/~raulnac