cocoartmp.com

iOS framework to easily build your broadcasting app


Version 4.2:

  • iOS 9 and Xcode 7 compatibility
  • Adobe user / password authentication
  • Swift support
  • RtmpSettings.h is deprecated; configuration is now applied at runtime:
    • bufferLength of video playback can be set on the RtmpMediaStream object; 3 seconds by default.
    • The chunk size of video packets (transfer size) can be set directly on the RtmpMediaStream object; the default value is 1024. Make sure to use larger values (up to 100 KB) for video resolutions like Full HD or 4K.
    • The keep-alive timeoutInterval is accessible on the RtmpClient object. Keep this value a bit longer than the server's keep-alive ping period; 60 seconds by default.
    • adaptiveFrameRate can be enabled on the RtmpCaptureSession object; enabled by default.
    • criticalFrameRate is the threshold at which lower / higher bandwidth detection is triggered. It can be set on the RtmpCaptureSession object; 10 FPS by default.
    • frameOverdueAge is the maximum age of frames that should still be transferred; older frames are dropped. This prevents the video delay from growing and can be set on the RtmpCaptureSession object; 2 seconds by default.
    • fragmentLength of the video file buffer can be set on the RtmpCaptureSession object; 240 seconds by default.
    • objectEncoding of the RTMP communication can be switched between AMF3 and the older AMF0 on the RtmpPacket class using its class method [RtmpPacket setObjectEncoding:TYPE_AMF3]. AMF3 encoding is configured by default.
  • Auto-adaptation of video quality to the current internet bandwidth
    • Adaptive frame rate in video; no frames are dropped.
    • New delegate methods on RtmpCaptureSession are called when lower or higher bandwidth is detected.
    • Auto-adjustment of the video resolution (sample code can be found in the Stream Bus application, ViewController.m; see the decreaseResolutionToAdaptBandwidth method).
    • How it works: the frame rate is adjusted every 3 seconds. The resolution is decreased when the critical frame rate is detected for more than 9 seconds and increased after 30 seconds of good connection bandwidth.
    • The behaviour of the auto-adaptation can be controlled at runtime using criticalFrameRate and kCriticalTreshold on the RtmpCaptureSession object.
  • Better statistics for video transfer, such as bandwidth and bytes transferred, accessible on the RtmpCaptureSession object (globalStats)
  • Doubled battery life while broadcasting video
  • Faster disconnect by revoking the larger packets immediately
  • Low disk space detection delegate on RtmpCaptureSession
  • Multiplayer example
  • Bug fixes
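Taken together, the runtime configuration introduced in 4.2 might be applied as in the following sketch. This is only an illustration: the property names are those listed in this changelog, but the header name, the object creation, and the chunkSize property name are assumptions, not the framework's confirmed API.

```objc
// Sketch of the Version 4.2 runtime configuration (replaces RtmpSettings.h).
// Property names follow the changelog above; object creation and the
// chunkSize property name are assumptions.
#import "CocoaRTMP.h" // hypothetical umbrella header

RtmpClient *client = /* your connected client */;
client.timeoutInterval = 90;          // a bit longer than the server keep-alive period (default 60 s)

RtmpMediaStream *stream = /* your playback stream */;
stream.bufferLength = 3;              // seconds of playback buffer (default 3 s)
stream.chunkSize = 64 * 1024;         // hypothetical name; use larger values for Full HD / 4K (default 1024)

RtmpCaptureSession *capture = /* your capture session */;
capture.adaptiveFrameRate = YES;      // enabled by default
capture.criticalFrameRate = 10;       // FPS threshold for bandwidth detection (default 10)
capture.frameOverdueAge = 2;          // drop frames older than 2 s (default)
capture.fragmentLength = 240;         // video file buffer fragment length (default 240 s)

[RtmpPacket setObjectEncoding:TYPE_AMF3]; // AMF3 is the default; TYPE_AMF0 for legacy servers
```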

 

Version 4.1:

  • AMF3 encoding and mixed encoding; new constant in RtmpSettings.h: int RTMP_OBJECT_ENCODING = TYPE_AMF3;
  • Support for SharedObjects encoded in AMF3
  • Improved inactivity detection while broadcasting
  • Disconnect immediately when the server closes the connection
  • Improved reaction time of the seek and pause functions during video playback
  • Unified play function for all media servers (FMS, Wowza, Red5)
  • Query metadata directly on RtmpMediaStream (e.g. FPS, width, height, audioCodecID, videoCodecID, …)
  • Unified metadata elements between video file format v10 and v10.1 (e.g. videoCodecID is a double in v10 vs. an NSString in v10.1)
  • Optional use of GPUImage to render the video using OpenGL (does not affect performance but enables special effects). New switch #define USE_GPU in RtmpSettings.h; uncomment #pod 'GPUImage', '~> 0.1.0' in the Podfile and run pod install.