When the server accepts a connection from a client, it must save the new socket that represents that client and use it for all further communication with that client. Each client will eventually correspond to one RTP session, and each client's RTSP requests control only its own RTP session, so it is natural to introduce a session class that represents one client's RTSP session. That class is RTSPServer::RTSPClientSession, and it stores the socket representing the client. Here is how an RTSPClientSession gets created:
void RTSPServer::incomingConnectionHandler(int serverSocket) {
  struct sockaddr_in clientAddr;
  SOCKLEN_T clientAddrLen = sizeof clientAddr;
  // Accept the connection
  int clientSocket = accept(serverSocket, (struct sockaddr*) &clientAddr, &clientAddrLen);
  if (clientSocket < 0) {
    int err = envir().getErrno();
    if (err != EWOULDBLOCK) {
      envir().setResultErrMsg("accept() failed: ");
    }
    return;
  }

  // Configure the socket
  makeSocketNonBlocking(clientSocket);
  increaseSendBufferTo(envir(), clientSocket, 50 * 1024);

#ifdef DEBUG
  envir() << "accept()ed connection from " << our_inet_ntoa(clientAddr.sin_addr) << "\n";
#endif

  // Generate a session id.
  // Create a new object for this RTSP session.
  // (Choose a random 32-bit integer for the session id (it will be encoded as a 8-digit hex number). We don't bother checking for
  // a collision; the probability of two concurrent sessions getting the same session id is very low.)
  // (We do, however, avoid choosing session id 0, because that has a special use (by "OnDemandServerMediaSubsession").)
  unsigned sessionId;
  do {
    sessionId = (unsigned) our_random();
  } while (sessionId == 0);

  // Create the RTSPClientSession; note the arguments that are passed in
  (void) createNewClientSession(sessionId, clientSocket, clientAddr);
}

What functionality does RTSPClientSession need to provide? One can imagine: it must listen for the client's RTSP requests and respond to them, return a description of the requested stream for a DESCRIBE request, set up the RTP session for a SETUP request, close the RTP session for a TEARDOWN request, and so on.
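Conceptually, once a complete request has been read from the client socket, all the session has to do is extract the method name and dispatch to the matching handleCmd_* function; live555 does the equivalent after parsing the full request string. Below is only a minimal sketch of that dispatch idea (the MiniClientSession class and the simplified handler signatures are hypothetical, not live555's real declarations):

#include <cstring>
#include <cstdio>

// Hypothetical, simplified dispatcher showing how an RTSP client session
// might route a parsed request to per-command handlers.
struct MiniClientSession {
  void handleCmd_DESCRIBE(const char* cseq)     { std::printf("DESCRIBE, CSeq %s\n", cseq); }
  void handleCmd_SETUP(const char* cseq)        { std::printf("SETUP, CSeq %s\n", cseq); }
  void handleCmd_TEARDOWN(const char* cseq)     { std::printf("TEARDOWN, CSeq %s\n", cseq); }
  void handleCmd_notSupported(const char* cseq) { std::printf("501 Not Implemented, CSeq %s\n", cseq); }

  void dispatch(const char* method, const char* cseq) {
    if (std::strcmp(method, "DESCRIBE") == 0)      handleCmd_DESCRIBE(cseq);
    else if (std::strcmp(method, "SETUP") == 0)    handleCmd_SETUP(cseq);
    else if (std::strcmp(method, "TEARDOWN") == 0) handleCmd_TEARDOWN(cseq);
    else                                           handleCmd_notSupported(cseq);
  }
};

int main() {
  MiniClientSession s;
  s.dispatch("DESCRIBE", "1");  // would answer with an SDP description
  s.dispatch("SETUP", "2");     // would create the RTP session state
  s.dispatch("TEARDOWN", "3");  // would release the RTP session
  return 0;
}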
To listen for the client's requests, RTSPClientSession must add its own socket handler to the task scheduler. The evidence:

RTSPServer::RTSPClientSession::RTSPClientSession(
    RTSPServer& ourServer, unsigned sessionId,
    int clientSocket, struct sockaddr_in clientAddr) :
  fOurServer(ourServer), fOurSessionId(sessionId),
  fOurServerMediaSession(NULL),
  fClientInputSocket(clientSocket), fClientOutputSocket(clientSocket),
  fClientAddr(clientAddr),
  fSessionCookie(NULL), fLivenessCheckTask(NULL),
  fIsMulticast(False), fSessionIsActive(True), fStreamAfterSETUP(False),
  fTCPStreamIdCount(0), fNumStreamStates(0), fStreamStates(NULL),
  fRecursionCount(0)
{
  // Arrange to handle incoming requests:
  resetRequestBuffer();
  envir().taskScheduler().turnOnBackgroundReadHandling(fClientInputSocket,
      (TaskScheduler::BackgroundHandlerProc*) &incomingRequestHandler, this);
  noteLiveness();
}

Now let us look more closely at how RTSPClientSession responds to a DESCRIBE request:

void RTSPServer::RTSPClientSession::handleCmd_DESCRIBE(
    char const* cseq,
    char const* urlPreSuffix, char const* urlSuffix,
    char const* fullRequestStr)
{
  char* sdpDescription = NULL;
  char* rtspURL = NULL;
  do {
    // Reassemble the RTSP URL suffix
    char urlTotalSuffix[RTSP_PARAM_STRING_MAX];
    if (strlen(urlPreSuffix) + strlen(urlSuffix) + 2 > sizeof urlTotalSuffix) {
      handleCmd_bad(cseq);
      break;
    }
    urlTotalSuffix[0] = '\0';
    if (urlPreSuffix[0] != '\0') {
      strcat(urlTotalSuffix, urlPreSuffix);
      strcat(urlTotalSuffix, "/");
    }
    strcat(urlTotalSuffix, urlSuffix);

    // Check the user name and password
    if (!authenticationOK("DESCRIBE", cseq, urlTotalSuffix, fullRequestStr)) break;

    // We should really check that the request contains an "Accept:" #####
    // for "application/sdp", because that's what we're sending back #####

    // Begin by looking up the "ServerMediaSession" object for the specified "urlTotalSuffix":
    // Look up the ServerMediaSession by stream name; if none exists yet, one is created.
    // Every ServerMediaSession contains at least one ServerMediaSubsession. A ServerMediaSession
    // corresponds to one piece of media -- think of it as a file on the server, or a live capture
    // device -- and each ServerMediaSubsession it contains represents one track of that media.
    // If two clients request the same media name, the existing ServerMediaSession is reused;
    // otherwise a new one is created. Each stream also has a StreamState associated with a
    // ServerMediaSubsession: the StreamState represents the dynamic (per-stream) state, while
    // the ServerMediaSubsession represents the static description.
    ServerMediaSession* session = fOurServer.lookupServerMediaSession(urlTotalSuffix);
    if (session == NULL) {
      handleCmd_notFound(cseq);
      break;
    }

    // Then, assemble a SDP description for this session:
    // Inside this call, each ServerMediaSubsession's SDP lines are obtained in turn and concatenated.
    sdpDescription = session->generateSDPDescription();
    if (sdpDescription == NULL) {
      // This usually means that a file name that was specified for a
      // "ServerMediaSubsession" does not exist.
      snprintf((char*) fResponseBuffer, sizeof fResponseBuffer,
               "RTSP/1.0 404 File Not Found, Or In Incorrect Format\r\n"
               "CSeq: %s\r\n"
               "%s\r\n",
               cseq, dateHeader());
      break;
    }
    unsigned sdpDescriptionSize = strlen(sdpDescription);

    // Also, generate our RTSP URL, for the "Content-Base:" header
    // (which is necessary to ensure that the correct URL gets used in
    // subsequent "SETUP" requests).
    rtspURL = fOurServer.rtspURL(session, fClientInputSocket);

    // Build the RTSP response to the DESCRIBE request.
    snprintf((char*) fResponseBuffer, sizeof fResponseBuffer,
             "RTSP/1.0 200 OK\r\nCSeq: %s\r\n"
             "%s"
             "Content-Base: %s/\r\n"
             "Content-Type: application/sdp\r\n"
             "Content-Length: %d\r\n\r\n"
             "%s",
             cseq, dateHeader(), rtspURL, sdpDescriptionSize, sdpDescription);
  } while (0);

  delete[] sdpDescription;
  delete[] rtspURL;
  // After this function returns, the response is sent right away
  // (the socket write is not queued in the scheduler).
}
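The key step above is session->generateSDPDescription(): it emits a session-level SDP header and then appends each subsession's media-level lines. The following is only a schematic sketch of that idea using simplified, hypothetical types (MiniSession and MiniSubsession are not live555's API):

#include <cstdio>
#include <string>
#include <vector>

// Hypothetical model of SDP assembly: a session-level header followed by
// one media-level block ("m=..." etc.) per track/subsession.
struct MiniSubsession {
  std::string sdpLines;  // e.g. "m=video 0 RTP/AVP 96\r\na=control:track1\r\n"
};

struct MiniSession {
  std::string name;
  std::vector<MiniSubsession> subsessions;

  std::string generateSDPDescription() const {
    std::string sdp =
        "v=0\r\n"
        "o=- 0 0 IN IP4 127.0.0.1\r\n"   // origin line (placeholder values)
        "s=" + name + "\r\n";
    for (const MiniSubsession& ss : subsessions) {
      sdp += ss.sdpLines;                // append each track's media description
    }
    return sdp;
  }
};

int main() {
  MiniSession s;
  s.name = "H.264 Video";
  s.subsessions.push_back({"m=video 0 RTP/AVP 96\r\na=control:track1\r\n"});
  std::printf("%s", s.generateSDPDescription().c_str());
  return 0;
}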
Inside fOurServer.lookupServerMediaSession(urlTotalSuffix), a new ServerMediaSession is created if no session with that name exists yet. The ServerMediaSessions, each representing one RTP stream, are managed by the RTSPServer rather than owned by an RTSPClientSession. Why? Because a ServerMediaSession represents the static side of a stream: you can obtain all kinds of information about the stream from it, but not its transmission state. Different clients may connect to the same stream, so the ServerMediaSessions should belong to the RTSPServer. The creation of a ServerMediaSession is worth a look:

static ServerMediaSession* createNewSMS(UsageEnvironment& env, char const* fileName, FILE* /*fid*/)
{
  // Use the file name extension to determine the type of "ServerMediaSession":
  char const* extension = strrchr(fileName, '.');
  if (extension == NULL) return NULL;

  ServerMediaSession* sms = NULL;
  Boolean const reuseSource = False;
  if (strcmp(extension, ".aac") == 0) {
    // Assumed to be an AAC Audio (ADTS format) file:
    NEW_SMS("AAC Audio");
    sms->addSubsession(ADTSAudioFileServerMediaSubsession::createNew(env, fileName, reuseSource));
  } else if (strcmp(extension, ".amr") == 0) {
    // Assumed to be an AMR Audio file:
    NEW_SMS("AMR Audio");
    sms->addSubsession(AMRAudioFileServerMediaSubsession::createNew(env, fileName, reuseSource));
  } else if (strcmp(extension, ".ac3") == 0) {
    // Assumed to be an AC-3 Audio file:
    NEW_SMS("AC-3 Audio");
    sms->addSubsession(AC3AudioFileServerMediaSubsession::createNew(env, fileName, reuseSource));
  } else if (strcmp(extension, ".m4e") == 0) {
    // Assumed to be a MPEG-4 Video Elementary Stream file:
    NEW_SMS("MPEG-4 Video");
    sms->addSubsession(MPEG4VideoFileServerMediaSubsession::createNew(env, fileName, reuseSource));
  } else if (strcmp(extension, ".264") == 0) {
    // Assumed to be a H.264 Video Elementary Stream file:
    NEW_SMS("H.264 Video");
    OutPacketBuffer::maxSize = 100000; // allow for some possibly large H.264 frames
    sms->addSubsession(H264VideoFileServerMediaSubsession::createNew(env, fileName, reuseSource));
  } else if (strcmp(extension, ".mp3") == 0) {
    // Assumed to be a MPEG-1 or 2 Audio file:
    NEW_SMS("MPEG-1 or 2 Audio");
    // To stream using 'ADUs' rather than raw MP3 frames, uncomment the following:
//#define STREAM_USING_ADUS 1
    // To also reorder ADUs before streaming, uncomment the following:
//#define INTERLEAVE_ADUS 1
    // (For more information about ADUs and interleaving,
    //  see <http://www.live555.com/rtp-mp3/>)
    Boolean useADUs = False;
    Interleaving* interleaving = NULL;
#ifdef STREAM_USING_ADUS
    useADUs = True;
#ifdef INTERLEAVE_ADUS
    unsigned char interleaveCycle[] = {0,2,1,3}; // or choose your own...
    unsigned const interleaveCycleSize = (sizeof interleaveCycle)/(sizeof (unsigned char));
    interleaving = new Interleaving(interleaveCycleSize, interleaveCycle);
#endif
#endif
    sms->addSubsession(MP3AudioFileServerMediaSubsession::createNew(env, fileName, reuseSource, useADUs, interleaving));
  } else if (strcmp(extension, ".mpg") == 0) {
    // Assumed to be a MPEG-1 or 2 Program Stream (audio+video) file:
    NEW_SMS("MPEG-1 or 2 Program Stream");
    MPEG1or2FileServerDemux* demux = MPEG1or2FileServerDemux::createNew(env, fileName, reuseSource);
    sms->addSubsession(demux->newVideoServerMediaSubsession());
    sms->addSubsession(demux->newAudioServerMediaSubsession());
  } else if (strcmp(extension, ".ts") == 0) {
    // Assumed to be a MPEG Transport Stream file:
    // Use an index file name that's the same as the TS file name, except with ".tsx":
    unsigned indexFileNameLen = strlen(fileName) + 2; // allow for trailing "x\0"
    char* indexFileName = new char[indexFileNameLen];
    sprintf(indexFileName, "%sx", fileName);
    NEW_SMS("MPEG Transport Stream");
    sms->addSubsession(MPEG2TransportFileServerMediaSubsession::createNew(env, fileName, indexFileName, reuseSource));
    delete[] indexFileName;
  } else if (strcmp(extension, ".wav") == 0) {
    // Assumed to be a WAV Audio file:
    NEW_SMS("WAV Audio Stream");
    // To convert 16-bit PCM data to 8-bit u-law, prior to streaming,
    // change the following to True:
    Boolean convertToULaw = False;
    sms->addSubsession(WAVAudioFileServerMediaSubsession::createNew(env, fileName, reuseSource, convertToULaw));
  } else if (strcmp(extension, ".dv") == 0) {
    // Assumed to be a DV Video file
    // First, make sure that the RTPSinks' buffers will be large enough to handle the huge size of DV frames (as big as 288000).
    OutPacketBuffer::maxSize = 300000;
    NEW_SMS("DV Video");
    sms->addSubsession(DVVideoFileServerMediaSubsession::createNew(env, fileName, reuseSource));
  } else if (strcmp(extension, ".mkv") == 0) {
    // Assumed to be a Matroska file
    NEW_SMS("Matroska video+audio+(optional)subtitles");
    // Create a Matroska file server demultiplexor for the specified file.
    // (We enter the event loop to wait for this to complete.)
    newMatroskaDemuxWatchVariable = 0;
    MatroskaFileServerDemux::createNew(env, fileName, onMatroskaDemuxCreation, NULL);
    env.taskScheduler().doEventLoop(&newMatroskaDemuxWatchVariable);

    ServerMediaSubsession* smss;
    // "matroskaDemux" is the file-scope pointer filled in by onMatroskaDemuxCreation()
    // (the quoted snippet had "demux" here, which is out of scope at this point):
    while ((smss = matroskaDemux->newServerMediaSubsession()) != NULL) {
      sms->addSubsession(smss);
    }
  }

  return sms;
}
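For reference, NEW_SMS(...) used above is a small helper macro defined in the same file (DynamicRTSPServer.cpp). Its exact text may differ slightly between live555 versions, but it is approximately the following: it creates the ServerMediaSession for fileName and assigns it to the local variable sms:

// Approximate form of the NEW_SMS macro from live555's DynamicRTSPServer.cpp
// (the description string's wording may vary by version):
#define NEW_SMS(description) do {\
  char const* descStr = description\
      ", streamed by the LIVE555 Media Server";\
  sms = ServerMediaSession::createNew(env, fileName, fileName, descStr);\
} while(0)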
As you can see, NEW_SMS("AMR Audio") and its siblings create a new ServerMediaSession, and immediately afterwards sms->addSubsession() adds a ServerMediaSubsession to it. A ServerMediaSession can clearly hold more than one ServerMediaSubsession -- the .mpg branch adds an audio and a video subsession, and the .mkv branch adds one per track -- although for the single-track file types only one is added. Since multiple ServerMediaSubsessions can be attached, the ServerMediaSession itself is not tied to the file named by the stream name; in other words, it does not operate on the file. The file operations live in the ServerMediaSubsession -- specifically, the file should be opened in the ServerMediaSubsession's sdpLines() function.
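To make this "static description vs. file access" split concrete, here is a deliberately simplified, hypothetical subsession class (not live555's API) in which the media file is opened lazily, only the first time its SDP lines are requested -- in the spirit of how a ServerMediaSubsession defers file work until sdpLines() needs it:

#include <cstdio>
#include <string>
#include <utility>

// Hypothetical sketch: the subsession stores only the file name (the static part);
// the file itself is opened lazily inside sdpLines(), and the result is cached.
class SketchFileSubsession {
public:
  explicit SketchFileSubsession(std::string fileName) : fFileName(std::move(fileName)) {}
  ~SketchFileSubsession() { if (fFid != nullptr) std::fclose(fFid); }

  const char* sdpLines() {
    if (fSDPLines.empty()) {
      // Open the file only now, to probe the parameters needed for the SDP:
      fFid = std::fopen(fFileName.c_str(), "rb");
      if (fFid == nullptr) return nullptr;          // missing file => no SDP lines
      fSDPLines = "m=audio 0 RTP/AVP 96\r\n"        // placeholder media description
                  "a=control:track1\r\n";
    }
    return fSDPLines.c_str();
  }

private:
  std::string fFileName;     // which file this track describes
  std::FILE* fFid = nullptr; // opened lazily
  std::string fSDPLines;     // cached once generated
};

int main() {
  SketchFileSubsession sub("test.amr");   // hypothetical file name
  const char* lines = sub.sdpLines();
  std::printf("%s", lines ? lines : "file not found\n");
  return 0;
}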