

live555 Learning Notes 8: RTSPClient Analysis

2019-11-08 18:27:23

8. RTSPClient Analysis

Where there is an RTSPServer, there must of course be an RTSPClient. Following the server-side architecture, we might guess the client side is composed like this: since it must connect to the RTSP server, the RTSPClient needs a TCP socket. After receiving the server's DESCRIBE response, it should create a ClientMediaSession corresponding to the ServerMediaSession, and for each track the ClientMediaSession should create a ClientMediaSubsession. When establishing the RTP session, it should send a SETUP request for each track it owns, create an RTP socket for each track once the responses arrive, then request PLAY, and then the data starts flowing. Is that really how it works? Only the code can tell us.

openRTSP in testProgs is the canonical RTSPClient example, so let's analyze it. main() is in playCommon.cpp. The flow of main() is fairly simple and not much different from the server side: create the task scheduler, create the usage environment, process the user's arguments (the RTSP URL), create an RTSPClient instance, send the first RTSP request (either OPTIONS or DESCRIBE), and enter the event loop.
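To make the "each response handler issues the next request" flow above concrete, here is a small self-contained model of it. This is not live555 code; MiniClient, sendCommand and runSession are hypothetical names, and the event queue stands in for the real TaskScheduler. It only illustrates how OPTIONS, DESCRIBE, per-track SETUP, and PLAY chain through callbacks:

```cpp
#include <functional>
#include <queue>
#include <string>
#include <vector>

// Hypothetical mini-model of openRTSP's control flow: each RTSP step is sent
// asynchronously, and its response handler issues the next step.
struct MiniClient {
    std::vector<std::string> log;                    // commands "sent", in order
    std::queue<std::function<void()>> pendingEvents; // stands in for the event loop

    void sendCommand(const std::string& cmd, std::function<void()> onResponse) {
        log.push_back(cmd);
        pendingEvents.push(onResponse); // the "response" arrives later, in the loop
    }
    void doEventLoop() { // drain events, like TaskScheduler::doEventLoop()
        while (!pendingEvents.empty()) {
            auto handler = pendingEvents.front();
            pendingEvents.pop();
            handler();
        }
    }
};

// The openRTSP sequence: OPTIONS -> DESCRIBE -> SETUP (once per track) -> PLAY.
std::vector<std::string> runSession(MiniClient& c, int numTracks) {
    std::function<void(int)> setupTrack = [&](int i) {
        if (i < numTracks) {
            c.sendCommand("SETUP", [&, i] { setupTrack(i + 1); });
        } else {
            c.sendCommand("PLAY", [] {});
        }
    };
    c.sendCommand("OPTIONS", [&] {
        c.sendCommand("DESCRIBE", [&] { setupTrack(0); });
    });
    c.doEventLoop();
    return c.log;
}
```

Note that no request is sent from inside another send call; each next step runs only when the previous response is delivered by the loop, exactly the asynchronous shape the real client has.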

The RTSP TCP connection is established only when the first RTSP request is sent. The various sendXXXCommand() functions of RTSPClient all ultimately call sendRequest(), which establishes the TCP connection when needed. As soon as the connection is made, a socket handler for receiving data on it, RTSPClient::incomingDataHandler(), is added to the task scheduler. Next comes sending the RTSP requests. OPTIONS is not worth looking at, so we start from DESCRIBE:

```cpp
void getSDPDescription(RTSPClient::responseHandler* afterFunc) {
    ourRTSPClient->sendDescribeCommand(afterFunc, ourAuthenticator);
}

unsigned RTSPClient::sendDescribeCommand(responseHandler* responseHandler,
        Authenticator* authenticator) {
    if (authenticator != NULL)
        fCurrentAuthenticator = *authenticator;
    return sendRequest(new RequestRecord(++fCSeq, "DESCRIBE", responseHandler));
}
```

The parameter responseHandler is a callback provided by the caller; it is invoked after the response to this request has been processed, and it is inside this callback that the next request is issued. All the requests are chained together this way. Callbacks are used mainly because sending and receiving on the socket are not synchronous. The class RequestRecord represents a single request: it holds not only the RTSP request information but also the callback to run when the request completes, namely responseHandler. A request issued before the TCP connection exists cannot be sent immediately, so it is put on the fRequestsAwaitingConnection queue; a request that has been sent and is waiting for the server's reply goes on the fRequestsAwaitingResponse queue and is taken off when the response arrives. RTSPClient::sendRequest() is too involved to list here; in essence it builds the RTSP request string and sends it over the TCP socket.
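The two-queue bookkeeping just described can be sketched as follows. This is an illustrative model, not live555 code; RequestBook and its members are invented names, though the two collections play the roles of fRequestsAwaitingConnection and fRequestsAwaitingResponse:

```cpp
#include <deque>
#include <map>
#include <string>

// Illustrative model of the two queues: requests made before the TCP
// connection exists wait in one queue; sent requests wait in a CSeq-keyed
// collection until their response arrives.
struct Request { int cseq; std::string command; };

struct RequestBook {
    bool connected = false;
    int nextCSeq = 0;
    std::deque<Request> awaitingConnection;  // like fRequestsAwaitingConnection
    std::map<int, Request> awaitingResponse; // like fRequestsAwaitingResponse

    int send(const std::string& command) {
        Request r{++nextCSeq, command};
        if (!connected) awaitingConnection.push_back(r); // can't send yet
        else awaitingResponse[r.cseq] = r;               // sent; wait for reply
        return r.cseq;
    }
    void onConnected() { // flush everything queued before the connection was up
        connected = true;
        while (!awaitingConnection.empty()) {
            Request r = awaitingConnection.front();
            awaitingConnection.pop_front();
            awaitingResponse[r.cseq] = r;
        }
    }
    // A response carries the CSeq of the request it answers; pop that request.
    bool onResponse(int cseq) { return awaitingResponse.erase(cseq) == 1; }
};
```

Matching responses to requests by CSeq is what lets several requests be in flight on one TCP connection at once.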

Now let's see how the DESCRIBE response is handled. In theory, a MediaSession should be built from the media information; let's check:

```cpp
void continueAfterDESCRIBE(RTSPClient*, int resultCode, char* resultString) {
    char* sdpDescription = resultString;
    // Create a media session object from this SDP description:
    session = MediaSession::createNew(*env, sdpDescription);
    delete[] sdpDescription;

    // Then, setup the "RTPSource"s for the session:
    MediaSubsessionIterator iter(*session);
    MediaSubsession* subsession;
    Boolean madeProgress = False;
    char const* singleMediumToTest = singleMedium;
    // Loop over all the MediaSubsessions, configuring each one's RTPSource
    while ((subsession = iter.next()) != NULL) {
        // Initiate the subsession; this creates the RTP/RTCP sockets and the RTPSource.
        if (subsession->initiate(simpleRTPoffsetArg)) {
            madeProgress = True;
            if (subsession->rtpSource() != NULL) {
                // Because we're saving the incoming data, rather than playing
                // it in real time, allow an especially large time threshold
                // (1 second) for reordering misordered incoming packets:
                unsigned const thresh = 1000000; // 1 second
                subsession->rtpSource()->setPacketReorderingThresholdTime(thresh);

                // Set the RTP source's OS socket buffer size as appropriate - either if we were explicitly asked (using -B),
                // or if the desired FileSink buffer size happens to be larger than the current OS socket buffer size.
                // (The latter case is a heuristic, on the assumption that if the user asked for a large FileSink buffer size,
                // then the input data rate may be large enough to justify increasing the OS socket buffer size also.)
                int socketNum = subsession->rtpSource()->RTPgs()->socketNum();
                unsigned curBufferSize = getReceiveBufferSize(*env, socketNum);
                if (socketInputBufferSize > 0 || fileSinkBufferSize > curBufferSize) {
                    unsigned newBufferSize = socketInputBufferSize > 0 ?
                            socketInputBufferSize : fileSinkBufferSize;
                    newBufferSize = setReceiveBufferTo(*env, socketNum, newBufferSize);
                    if (socketInputBufferSize > 0) { // The user explicitly asked for the new socket buffer size; announce it:
                        *env << "Changed socket receive buffer size for the \""
                                << subsession->mediumName() << "/"
                                << subsession->codecName()
                                << "\" subsession from " << curBufferSize
                                << " to " << newBufferSize << " bytes\n";
                    }
                }
            }
        }
    }
    if (!madeProgress)
        shutdown();

    // Perform additional 'setup' on each subsession, before playing them:
    // The next step is to send SETUP requests, once per track.
    setupStreams();
}
```

Many branches have been pruned from this function, so don't be shocked if it differs from the original. A MediaSession is indeed created after the DESCRIBE response, and we find that the client-side session class is not called ClientMediaSession, nor is the subsession class ClientMediaSubsession. Now I'd like to see how MediaSession and MediaSubsession are created:

```cpp
MediaSession* MediaSession::createNew(UsageEnvironment& env, char const* sdpDescription) {
    MediaSession* newSession = new MediaSession(env);
    if (newSession != NULL) {
        if (!newSession->initializeWithSDP(sdpDescription)) {
            delete newSession;
            return NULL;
        }
    }
    return newSession;
}
```

I can tell you that MediaSession's constructor holds nothing of interest, so let's look at initializeWithSDP(). It is too long to list, so roughly: it walks the SDP line by line, initializing member variables from each one. Whenever it meets an "m=" line it creates a MediaSubsession, then processes the lines between this "m=" line and the next one, initializing the MediaSubsession's variables from them, and so on until the end. However, no RTP socket is created there. We saw that continueAfterDESCRIBE() calls subsession->initiate(simpleRTPoffsetArg) after creating the MediaSession; is the socket created inside it? Look:
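The "m="-line splitting just described can be sketched as a toy parser. This is not the live555 parser; splitSdpMediaSections and SubsessionDesc are invented names, and it only shows how every "m=" line opens a new subsession that owns the lines beneath it:

```cpp
#include <sstream>
#include <string>
#include <vector>

// Toy version of what initializeWithSDP() does with "m=" lines: each "m="
// starts a new subsession, and the lines up to the next "m=" belong to it.
struct SubsessionDesc {
    std::string mediaLine;               // the "m=..." line itself
    std::vector<std::string> attributes; // the lines underneath it
};

std::vector<SubsessionDesc> splitSdpMediaSections(const std::string& sdp) {
    std::vector<SubsessionDesc> result;
    std::istringstream in(sdp);
    std::string line;
    while (std::getline(in, line)) {
        if (line.rfind("m=", 0) == 0) {
            result.push_back({line, {}});             // a new subsession starts here
        } else if (!result.empty()) {
            result.back().attributes.push_back(line); // belongs to the current one
        } // session-level lines before the first "m=" are ignored in this sketch
    }
    return result;
}
```

The real code additionally interprets each attribute line (a=rtpmap, a=fmtp, c=, b= and so on) to fill in the MediaSubsession's fields.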

```cpp
Boolean MediaSubsession::initiate(int useSpecialRTPoffset) {
    if (fReadSource != NULL)
        return True; // has already been initiated

    do {
        if (fCodecName == NULL) {
            env().setResultMsg("Codec is unspecified");
            break;
        }

        // Create the RTP/RTCP sockets
        // Create RTP and RTCP 'Groupsocks' on which to receive incoming data.
        // (Groupsocks will work even for unicast addresses)
        struct in_addr tempAddr;
        tempAddr.s_addr = connectionEndpointAddress();
        // This could get changed later, as a result of a RTSP "SETUP"

        if (fClientPortNum != 0) {
            // The server suggested a client port.
            // The sockets' port numbers were specified for us.  Use these:
            fClientPortNum = fClientPortNum & ~1; // even
            if (isSSM()) {
                fRTPSocket = new Groupsock(env(), tempAddr, fSourceFilterAddr, fClientPortNum);
            } else {
                fRTPSocket = new Groupsock(env(), tempAddr, fClientPortNum, 255);
            }
            if (fRTPSocket == NULL) {
                env().setResultMsg("Failed to create RTP socket");
                break;
            }

            // Set our RTCP port to be the RTP port + 1
            portNumBits const rtcpPortNum = fClientPortNum | 1;
            if (isSSM()) {
                fRTCPSocket = new Groupsock(env(), tempAddr, fSourceFilterAddr, rtcpPortNum);
            } else {
                fRTCPSocket = new Groupsock(env(), tempAddr, rtcpPortNum, 255);
            }
            if (fRTCPSocket == NULL) {
                char tmpBuf[100];
                sprintf(tmpBuf, "Failed to create RTCP socket (port %d)", rtcpPortNum);
                env().setResultMsg(tmpBuf);
                break;
            }
        } else {
            // The server didn't suggest a client port, so we find one ourselves. It is
            // this complicated because we need two consecutive ports (RTP/RTCP ports
            // must be consecutive, remember?).
            // Port numbers were not specified in advance, so we use ephemeral port numbers.
            // Create sockets until we get a port-number pair (even: RTP; even+1: RTCP).
            // We need to make sure that we don't keep trying to use the same bad port
            // numbers over and over again, so we store bad sockets in a table, and
            // delete them all when we're done.
            HashTable* socketHashTable = HashTable::create(ONE_WORD_HASH_KEYS);
            if (socketHashTable == NULL)
                break;
            Boolean success = False;
            NoReuse dummy; // ensures that our new ephemeral port number won't be one that's already in use

            while (1) {
                // Create a new socket:
                if (isSSM()) {
                    fRTPSocket = new Groupsock(env(), tempAddr, fSourceFilterAddr, 0);
                } else {
                    fRTPSocket = new Groupsock(env(), tempAddr, 0, 255);
                }
                if (fRTPSocket == NULL) {
                    env().setResultMsg("MediaSession::initiate(): unable to create RTP and RTCP sockets");
                    break;
                }

                // Get the client port number, and check whether it's even (for RTP):
                Port clientPort(0);
                if (!getSourcePort(env(), fRTPSocket->socketNum(), clientPort)) {
                    break;
                }
                fClientPortNum = ntohs(clientPort.num());
                if ((fClientPortNum & 1) != 0) { // it's odd
                    // Record this socket in our table, and keep trying:
                    unsigned key = (unsigned) fClientPortNum;
                    Groupsock* existing = (Groupsock*) socketHashTable->Add((char const*) key, fRTPSocket);
                    delete existing; // in case it wasn't NULL
                    continue;
                }

                // Make sure we can use the next (i.e., odd) port number, for RTCP:
                portNumBits rtcpPortNum = fClientPortNum | 1;
                if (isSSM()) {
                    fRTCPSocket = new Groupsock(env(), tempAddr, fSourceFilterAddr, rtcpPortNum);
                } else {
                    fRTCPSocket = new Groupsock(env(), tempAddr, rtcpPortNum, 255);
                }
                if (fRTCPSocket != NULL && fRTCPSocket->socketNum() >= 0) {
                    // Success! Use these two sockets.
                    success = True;
                    break;
                } else {
                    // We couldn't create the RTCP socket (perhaps that port number's already in use elsewhere?).
                    delete fRTCPSocket;

                    // Record the first socket in our table, and keep trying:
                    unsigned key = (unsigned) fClientPortNum;
                    Groupsock* existing = (Groupsock*) socketHashTable->Add((char const*) key, fRTPSocket);
                    delete existing; // in case it wasn't NULL
                    continue;
                }
            }

            // Clean up the socket hash table (and contents):
            Groupsock* oldGS;
            while ((oldGS = (Groupsock*) socketHashTable->RemoveNext()) != NULL) {
                delete oldGS;
            }
            delete socketHashTable;

            if (!success)
                break; // a fatal error occurred trying to create the RTP and RTCP sockets; we can't continue
        }

        // Try to use a big receive buffer for RTP - at least 0.1 second of
        // specified bandwidth and at least 50 KB
        unsigned rtpBufSize = fBandwidth * 25 / 2; // 1 kbps * 0.1 s = 12.5 bytes
        if (rtpBufSize < 50 * 1024)
            rtpBufSize = 50 * 1024;
        increaseReceiveBufferTo(env(), fRTPSocket->socketNum(), rtpBufSize);

        // ASSERT: fRTPSocket != NULL && fRTCPSocket != NULL
        if (isSSM()) {
            // Special case for RTCP SSM: Send RTCP packets back to the source via unicast:
            fRTCPSocket->changeDestinationParameters(fSourceFilterAddr, 0, ~0);
        }

        // This is where the RTPSource gets created
        // Create "fRTPSource" and "fReadSource":
        if (!createSourceObjects(useSpecialRTPoffset))
            break;

        if (fReadSource == NULL) {
            env().setResultMsg("Failed to create read source");
            break;
        }

        // Finally, create our RTCP instance. (It starts running automatically)
        if (fRTPSource != NULL) {
            // If bandwidth is specified, use it and add 5% for RTCP overhead.
            // Otherwise make a guess at 500 kbps.
            unsigned totSessionBandwidth = fBandwidth ? fBandwidth + fBandwidth / 20 : 500;
            fRTCPInstance = RTCPInstance::createNew(env(), fRTCPSocket,
                    totSessionBandwidth, (unsigned char const*) fParent.CNAME(),
                    NULL /* we're a client */, fRTPSource);
            if (fRTCPInstance == NULL) {
                env().setResultMsg("Failed to create RTCP instance");
                break;
            }
        }

        return True;
    } while (0);

    // We reach here only on failure
    delete fRTPSocket; fRTPSocket = NULL;
    delete fRTCPSocket; fRTCPSocket = NULL;
    Medium::close(fRTCPInstance); fRTCPInstance = NULL;
    Medium::close(fReadSource); fReadSource = fRTPSource = NULL;
    fClientPortNum = 0;
    return False;
}
```

Yes, the RTP/RTCP sockets are created here, and so is the RTPSource. The RTPSource is created in createSourceObjects(); let's take a look:
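The even/odd port hunt in the listing above is the part most worth internalizing, so here is the same loop as a self-contained sketch with fake "bind" callbacks instead of real sockets. pickRtpRtcpPair is an invented name; the rejected set plays the role of the bad-socket hash table in the real code:

```cpp
#include <functional>
#include <set>
#include <utility>

// Simplified model of initiate()'s port hunt: keep grabbing ephemeral ports
// until we land on an even one whose odd neighbor is also free; the even port
// becomes RTP, even+1 becomes RTCP. Rejected ports are remembered so we never
// retry them.
std::pair<int, int> pickRtpRtcpPair(std::function<int(const std::set<int>&)> nextFreePort,
                                    std::function<bool(int)> portIsFree) {
    std::set<int> rejected; // plays the role of the bad-socket hash table
    while (true) {
        int p = nextFreePort(rejected); // like binding to port 0
        if (p < 0) return {-1, -1};     // nothing left to try
        if ((p & 1) != 0) {             // odd: useless for RTP; remember and retry
            rejected.insert(p);
            continue;
        }
        int rtcp = p | 1;               // the next (odd) port, for RTCP
        if (portIsFree(rtcp)) return {p, rtcp};
        rejected.insert(p);             // RTCP port taken; reject this RTP port too
    }
}
```

The real code must keep the rejected sockets open until the search ends (the NoReuse guard plus the hash table), otherwise the OS could hand back the same bad port on the next attempt.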

```cpp
Boolean MediaSubsession::createSourceObjects(int useSpecialRTPoffset) {
    do {
        // First, check "fProtocolName"
        if (strcmp(fProtocolName, "UDP") == 0) {
            // A UDP-packetized stream (*not* a RTP stream)
            fReadSource = BasicUDPSource::createNew(env(), fRTPSocket);
            fRTPSource = NULL; // Note!

            if (strcmp(fCodecName, "MP2T") == 0) { // MPEG-2 Transport Stream
                fReadSource = MPEG2TransportStreamFramer::createNew(env(), fReadSource);
                // this sets "durationInMicroseconds" correctly, based on the PCR values
            }
        } else {
            // Check "fCodecName" against the set of codecs that we support,
            // and create our RTP source accordingly
            // (Later make this code more efficient, as this set grows #####)
            // (Also, add more fmts that can be implemented by SimpleRTPSource#####)
            Boolean createSimpleRTPSource = False; // by default; can be changed below
            Boolean doNormalMBitRule = False; // default behavior if "createSimpleRTPSource" is True
            if (strcmp(fCodecName, "QCELP") == 0) { // QCELP audio
                fReadSource = QCELPAudioRTPSource::createNew(env(), fRTPSocket,
                        fRTPSource, fRTPPayloadFormat, fRTPTimestampFrequency);
                // Note that fReadSource will differ from fRTPSource in this case
            } else if (strcmp(fCodecName, "AMR") == 0) { // AMR audio (narrowband)
                fReadSource = AMRAudioRTPSource::createNew(env(), fRTPSocket,
                        fRTPSource, fRTPPayloadFormat, 0 /*isWideband*/,
                        fNumChannels, fOctetalign, fInterleaving,
                        fRobustsorting, fCRC);
                // Note that fReadSource will differ from fRTPSource in this case
            } else if (strcmp(fCodecName, "AMR-WB") == 0) { // AMR audio (wideband)
                fReadSource = AMRAudioRTPSource::createNew(env(), fRTPSocket,
                        fRTPSource, fRTPPayloadFormat, 1 /*isWideband*/,
                        fNumChannels, fOctetalign, fInterleaving,
                        fRobustsorting, fCRC);
                // Note that fReadSource will differ from fRTPSource in this case
            } else if (strcmp(fCodecName, "MPA") == 0) { // MPEG-1 or 2 audio
                fReadSource = fRTPSource = MPEG1or2AudioRTPSource::createNew(
                        env(), fRTPSocket, fRTPPayloadFormat, fRTPTimestampFrequency);
            } else if (strcmp(fCodecName, "MPA-ROBUST") == 0) { // robust MP3 audio
                fRTPSource = MP3ADURTPSource::createNew(env(), fRTPSocket,
                        fRTPPayloadFormat, fRTPTimestampFrequency);
                if (fRTPSource == NULL)
                    break;

                // Add a filter that deinterleaves the ADUs after depacketizing them:
                MP3ADUdeinterleaver* deinterleaver = MP3ADUdeinterleaver::createNew(env(), fRTPSource);
                if (deinterleaver == NULL)
                    break;

                // Add another filter that converts these ADUs to MP3 frames:
                fReadSource = MP3FromADUSource::createNew(env(), deinterleaver);
            } else if (strcmp(fCodecName, "X-MP3-DRAFT-00") == 0) {
                // a non-standard variant of "MPA-ROBUST" used by RealNetworks
                // (one 'ADU'ized MP3 frame per packet; no headers)
                fRTPSource = SimpleRTPSource::createNew(env(), fRTPSocket,
                        fRTPPayloadFormat, fRTPTimestampFrequency,
                        "audio/MPA-ROBUST" /*hack*/);
                if (fRTPSource == NULL)
                    break;

                // Add a filter that converts these ADUs to MP3 frames:
                fReadSource = MP3FromADUSource::createNew(env(), fRTPSource, False /*no ADU header*/);
            } else if (strcmp(fCodecName, "MP4A-LATM") == 0) { // MPEG-4 LATM audio
                fReadSource = fRTPSource = MPEG4LATMAudioRTPSource::createNew(
                        env(), fRTPSocket, fRTPPayloadFormat, fRTPTimestampFrequency);
            } else if (strcmp(fCodecName, "AC3") == 0
                    || strcmp(fCodecName, "EAC3") == 0) { // AC3 audio
                fReadSource = fRTPSource = AC3AudioRTPSource::createNew(env(),
                        fRTPSocket, fRTPPayloadFormat, fRTPTimestampFrequency);
            } else if (strcmp(fCodecName, "MP4V-ES") == 0) { // MPEG-4 Elem Str vid
                fReadSource = fRTPSource = MPEG4ESVideoRTPSource::createNew(
                        env(), fRTPSocket, fRTPPayloadFormat, fRTPTimestampFrequency);
            } else if (strcmp(fCodecName, "MPEG4-GENERIC") == 0) {
                fReadSource = fRTPSource = MPEG4GenericRTPSource::createNew(
                        env(), fRTPSocket, fRTPPayloadFormat,
                        fRTPTimestampFrequency, fMediumName, fMode,
                        fSizelength, fIndexlength, fIndexdeltalength);
            } else if (strcmp(fCodecName, "MPV") == 0) { // MPEG-1 or 2 video
                fReadSource = fRTPSource = MPEG1or2VideoRTPSource::createNew(
                        env(), fRTPSocket, fRTPPayloadFormat, fRTPTimestampFrequency);
            } else if (strcmp(fCodecName, "MP2T") == 0) { // MPEG-2 Transport Stream
                fRTPSource = SimpleRTPSource::createNew(env(), fRTPSocket,
                        fRTPPayloadFormat, fRTPTimestampFrequency, "video/MP2T", 0, False);
                fReadSource = MPEG2TransportStreamFramer::createNew(env(), fRTPSource);
                // this sets "durationInMicroseconds" correctly, based on the PCR values
            } else if (strcmp(fCodecName, "H261") == 0) { // H.261
                fReadSource = fRTPSource = H261VideoRTPSource::createNew(env(),
                        fRTPSocket, fRTPPayloadFormat, fRTPTimestampFrequency);
            } else if (strcmp(fCodecName, "H263-1998") == 0
                    || strcmp(fCodecName, "H263-2000") == 0) { // H.263+
                fReadSource = fRTPSource = H263plusVideoRTPSource::createNew(
                        env(), fRTPSocket, fRTPPayloadFormat, fRTPTimestampFrequency);
            } else if (strcmp(fCodecName, "H264") == 0) {
                fReadSource = fRTPSource = H264VideoRTPSource::createNew(env(),
                        fRTPSocket, fRTPPayloadFormat, fRTPTimestampFrequency);
            } else if (strcmp(fCodecName, "DV") == 0) {
                fReadSource = fRTPSource = DVVideoRTPSource::createNew(env(),
                        fRTPSocket, fRTPPayloadFormat, fRTPTimestampFrequency);
            } else if (strcmp(fCodecName, "JPEG") == 0) { // motion JPEG
                fReadSource = fRTPSource = JPEGVideoRTPSource::createNew(env(),
                        fRTPSocket, fRTPPayloadFormat, fRTPTimestampFrequency,
                        videoWidth(), videoHeight());
            } else if (strcmp(fCodecName, "X-QT") == 0
                    || strcmp(fCodecName, "X-QUICKTIME") == 0) {
                // Generic QuickTime streams, as defined in
                // <http://developer.apple.com/quicktime/icefloe/dispatch026.html>
                char* mimeType = new char[strlen(mediumName()) + strlen(codecName()) + 2];
                sprintf(mimeType, "%s/%s", mediumName(), codecName());
                fReadSource = fRTPSource = QuickTimeGenericRTPSource::createNew(
                        env(), fRTPSocket, fRTPPayloadFormat,
                        fRTPTimestampFrequency, mimeType);
                delete[] mimeType;
            } else if (strcmp(fCodecName, "PCMU") == 0 // PCM u-law audio
                    || strcmp(fCodecName, "GSM") == 0 // GSM audio
                    || strcmp(fCodecName, "DVI4") == 0 // DVI4 (IMA ADPCM) audio
                    || strcmp(fCodecName, "PCMA") == 0 // PCM a-law audio
                    || strcmp(fCodecName, "MP1S") == 0 // MPEG-1 System Stream
                    || strcmp(fCodecName, "MP2P") == 0 // MPEG-2 Program Stream
                    || strcmp(fCodecName, "L8") == 0 // 8-bit linear audio
                    || strcmp(fCodecName, "L16") == 0 // 16-bit linear audio
                    || strcmp(fCodecName, "L20") == 0 // 20-bit linear audio (RFC 3190)
                    || strcmp(fCodecName, "L24") == 0 // 24-bit linear audio (RFC 3190)
                    || strcmp(fCodecName, "G726-16") == 0 // G.726, 16 kbps
                    || strcmp(fCodecName, "G726-24") == 0 // G.726, 24 kbps
                    || strcmp(fCodecName, "G726-32") == 0 // G.726, 32 kbps
                    || strcmp(fCodecName, "G726-40") == 0 // G.726, 40 kbps
                    || strcmp(fCodecName, "SPEEX") == 0 // SPEEX audio
                    || strcmp(fCodecName, "T140") == 0 // T.140 text (RFC 4103)
                    || strcmp(fCodecName, "DAT12") == 0 // 12-bit nonlinear audio (RFC 3190)
                    ) {
                createSimpleRTPSource = True;
                useSpecialRTPoffset = 0;
            } else if (useSpecialRTPoffset >= 0) {
                // We don't know this RTP payload format, but try to receive
                // it using a 'SimpleRTPSource' with the specified header offset:
                createSimpleRTPSource = True;
            } else {
                env().setResultMsg("RTP payload format unknown or not supported");
                break;
            }

            if (createSimpleRTPSource) {
                char* mimeType = new char[strlen(mediumName()) + strlen(codecName()) + 2];
                sprintf(mimeType, "%s/%s", mediumName(), codecName());
                fReadSource = fRTPSource = SimpleRTPSource::createNew(env(),
                        fRTPSocket, fRTPPayloadFormat, fRTPTimestampFrequency,
                        mimeType, (unsigned) useSpecialRTPoffset, doNormalMBitRule);
                delete[] mimeType;
            }
        }

        return True;
    } while (0);

    return False; // an error occurred
}
```

As you can see, this function mainly creates the appropriate source for the media and transport information parsed earlier.
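The long if/else chain above is, at heart, a codec-name lookup with a SimpleRTPSource fallback. Restated as a table for illustration (chooseSourceClass is an invented helper; live555 itself keeps the chain, and only a few entries are reproduced here):

```cpp
#include <map>
#include <set>
#include <string>

// Codec name -> RTPSource class name, mirroring a few branches of
// createSourceObjects(). "SimpleRTPSource" covers the formats that need no
// codec-specific depacketization.
std::string chooseSourceClass(const std::string& codecName) {
    static const std::map<std::string, std::string> table = {
        {"MPA",       "MPEG1or2AudioRTPSource"},
        {"MP4A-LATM", "MPEG4LATMAudioRTPSource"},
        {"MP4V-ES",   "MPEG4ESVideoRTPSource"},
        {"MPV",       "MPEG1or2VideoRTPSource"},
        {"H261",      "H261VideoRTPSource"},
        {"H264",      "H264VideoRTPSource"},
        {"JPEG",      "JPEGVideoRTPSource"},
    };
    static const std::set<std::string> simple = { // handled by SimpleRTPSource
        "PCMU", "PCMA", "GSM", "L8", "L16", "MP1S", "MP2P",
    };
    auto it = table.find(codecName);
    if (it != table.end()) return it->second;
    if (simple.count(codecName)) return "SimpleRTPSource";
    return "unsupported";
}
```

The point of the fallback is worth noting: even for an unknown payload format, the real code can still receive packets via SimpleRTPSource with a caller-supplied header offset, rather than failing outright.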

The sockets exist and the source has been created; the next step should be to connect a sink and form a complete stream. So far no sink has appeared; presumably it is created in the next step, SETUP. We saw that continueAfterDESCRIBE() ends by calling setupStreams(), so let's explore it:

```cpp
void setupStreams() {
    static MediaSubsessionIterator* setupIter = NULL;
    if (setupIter == NULL)
        setupIter = new MediaSubsessionIterator(*session);

    // Each call of this function issues a SETUP request for just one subsession.
    while ((subsession = setupIter->next()) != NULL) {
        // We have another subsession left to set up:
        if (subsession->clientPortNum() == 0)
            continue; // port # was not set

        // Send a SETUP request for this subsession. When the request completes,
        // continueAfterSETUP() is called, which in turn calls setupStreams() again
        // to issue the SETUP request for the next subsession, until all of them
        // have been handled.
        setupSubsession(subsession, streamUsingTCP, continueAfterSETUP);
        return;
    }

    // When we get here, every subsession has been looped through.
    // We're done setting up subsessions.
    delete setupIter;
    if (!madeProgress)
        shutdown();

    // Create the output files; so this is where the sinks get created. After a sink
    // is created, we "start playing" it, but that merely adds the socket handler to
    // the task scheduler; no data is sent or received until the PLAY request goes out.
    // Create output files:
    if (createReceivers) {
        if (outputQuickTimeFile) {
            // Create a "QuickTimeFileSink", to write to 'stdout':
            qtOut = QuickTimeFileSink::createNew(*env, *session, "stdout",
                    fileSinkBufferSize, movieWidth, movieHeight, movieFPS,
                    packetLossCompensate, syncStreams, generateHintTracks,
                    generateMP4Format);
            if (qtOut == NULL) {
                *env << "Failed to create QuickTime file sink for stdout: "
                        << env->getResultMsg();
                shutdown();
            }

            qtOut->startPlaying(sessionAfterPlaying, NULL);
        } else if (outputAVIFile) {
            // Create an "AVIFileSink", to write to 'stdout':
            aviOut = AVIFileSink::createNew(*env, *session, "stdout",
                    fileSinkBufferSize, movieWidth, movieHeight, movieFPS,
                    packetLossCompensate);
            if (aviOut == NULL) {
                *env << "Failed to create AVI file sink for stdout: "
                        << env->getResultMsg();
                shutdown();
            }

            aviOut->startPlaying(sessionAfterPlaying, NULL);
        } else {
            // Create and start "FileSink"s for each subsession:
            madeProgress = False;
            MediaSubsessionIterator iter(*session);
            while ((subsession = iter.next()) != NULL) {
                if (subsession->readSource() == NULL)
                    continue; // was not initiated

                // Create an output file for each desired stream:
                char outFileName[1000];
                if (singleMedium == NULL) {
                    // Output file name is
                    //     "<filename-prefix><medium_name>-<codec_name>-<counter>"
                    static unsigned streamCounter = 0;
                    snprintf(outFileName, sizeof outFileName, "%s%s-%s-%d",
                            fileNamePrefix, subsession->mediumName(),
                            subsession->codecName(), ++streamCounter);
                } else {
                    sprintf(outFileName, "stdout");
                }
                FileSink* fileSink;
                if (strcmp(subsession->mediumName(), "audio") == 0
                        && (strcmp(subsession->codecName(), "AMR") == 0
                                || strcmp(subsession->codecName(), "AMR-WB") == 0)) {
                    // For AMR audio streams, we use a special sink that inserts AMR frame hdrs:
                    fileSink = AMRAudioFileSink::createNew(*env, outFileName,
                            fileSinkBufferSize, oneFilePerFrame);
                } else if (strcmp(subsession->mediumName(), "video") == 0
                        && (strcmp(subsession->codecName(), "H264") == 0)) {
                    // For H.264 video stream, we use a special sink that insert start_codes:
                    fileSink = H264VideoFileSink::createNew(*env, outFileName,
                            subsession->fmtp_spropparametersets(),
                            fileSinkBufferSize, oneFilePerFrame);
                } else {
                    // Normal case:
                    fileSink = FileSink::createNew(*env, outFileName,
                            fileSinkBufferSize, oneFilePerFrame);
                }
                subsession->sink = fileSink;
                if (subsession->sink == NULL) {
                    *env << "Failed to create FileSink for \"" << outFileName
                            << "\": " << env->getResultMsg() << "\n";
                } else {
                    if (singleMedium == NULL) {
                        *env << "Created output file: \"" << outFileName << "\"\n";
                    } else {
                        *env << "Outputting data from the \""
                                << subsession->mediumName() << "/"
                                << subsession->codecName()
                                << "\" subsession to 'stdout'\n";
                    }

                    if (strcmp(subsession->mediumName(), "video") == 0
                            && strcmp(subsession->codecName(), "MP4V-ES") == 0
                            && subsession->fmtp_config() != NULL) {
                        // For MPEG-4 video RTP streams, the 'config' information
                        // from the SDP description contains useful VOL etc. headers.
                        // Insert this data at the front of the output file:
                        unsigned configLen;
                        unsigned char* configData
                                = parseGeneralConfigStr(subsession->fmtp_config(), configLen);
                        struct timeval timeNow;
                        gettimeofday(&timeNow, NULL);
                        fileSink->addData(configData, configLen, timeNow);
                        delete[] configData;
                    }

                    // Start the transfer
                    subsession->sink->startPlaying(*(subsession->readSource()),
                            subsessionAfterPlaying, subsession);

                    // Also set a handler to be called if a RTCP "BYE" arrives
                    // for this subsession:
                    if (subsession->rtcpInstance() != NULL) {
                        subsession->rtcpInstance()->setByeHandler(subsessionByeHandler, subsession);
                    }

                    madeProgress = True;
                }
            }
            if (!madeProgress)
                shutdown();
        }
    }

    // Finally, start playing each subsession, to start the data flow:
    if (duration == 0) {
        if (scale > 0)
            duration = session->playEndTime() - initialSeekTime; // use SDP end time
        else if (scale < 0)
            duration = initialSeekTime;
    }
    if (duration < 0)
        duration = 0.0;

    endTime = initialSeekTime;
    if (scale > 0) {
        if (duration <= 0)
            endTime = -1.0f;
        else
            endTime = initialSeekTime + duration;
    } else {
        endTime = initialSeekTime - duration;
        if (endTime < 0)
            endTime = 0.0f;
    }

    // Send the PLAY request; only after it does data arrive from the server
    startPlayingSession(session, initialSeekTime, endTime, scale, continueAfterPLAY);
}
```
Read the comments carefully and this function should be easy to follow.