V4L2 Brainstorming meeting notes (day 3)

* Buffer pools (related to Linaro activities)

Who is going to push this effort for V4L2? Laurent, Samsung, Hans?
Samsung will continue to work on the CMA allocator.

There are three building blocks:

1. contiguous memory allocator
2. IOMMU memory allocator
3. user interface for allocating and passing buffers

We need:
- requirements
- an evaluation of the existing 'solutions'

Action: make a list of requirements by March 30th. Post it on the
linux-media mailing list. Main platforms: Intel, ARM, SH. Also: what are
the Xserver-specific requirements?

Action: after that, discuss the various solutions in view of the
requirements. It would be good if we could have most of this done before
the ELC starts (April 11).

* Multiple buffer queues

From Guennadi's RFC:

1. multiple video-buffer queues per device / filehandle: their allocation
   and switching between them.
2. either the above queue-alloc method, or the VIDIOC_REQBUFS ioctl() has
   to accept an explicit "buffer-size" parameter.
3. a "skip-cache-invalidate" flag for the above queue-alloc method, or
   for the VIDIOC_REQBUFS ioctl().
4. a "persistent" flag for VIDIOC_REQBUFS for buffer re-use.

Note: item 4 will be postponed until we have a real use case for it.

Memory handling was discussed during the 2010 Oslo brainstorming meeting.
A summary of the discussions has been posted to the linux-media mailing
list (http://www.spinics.net/lists/linux-media/msg17125.html). One or two
flags to influence cache handling were defined, but we need to dig up the
old meeting notes to find them.

Options:
1) have two vb2_queue objects
2) have one vb2_queue, but with an internal 'prepared' list of buffers

In both cases you need to be able to request buffers of a specific size
(either explicitly specified, or by passing a v4l2_format and letting the
driver calculate the size based on the format). For option 1 you need to
switch between the queues. For option 2 you need to be able to pre-queue
buffers.

Possible ideas:

VIDIOC_CREATE_BUFS(bufcnt, size or fmt)
VIDIOC_DESTROY_BUFS(start_index, bufcnt)

To prepare a buffer: PREPARE_BUF(v4l2_buffer).

The skip-cache-invalidate flag should be per plane.
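Below is a rough userspace sketch of how the proposed ioctls could be
used. It is only an illustration: the argument structure and the ioctl
numbers are placeholders invented here, not a defined API; only
struct v4l2_format, struct v4l2_buffer and V4L2_MEMORY_MMAP are existing
V4L2 definitions.

  #include <stdint.h>
  #include <sys/ioctl.h>
  #include <linux/videodev2.h>

  /* Placeholder argument struct for the proposed VIDIOC_CREATE_BUFS:
   * "bufcnt" maps to count, "size or fmt" to format. */
  struct v4l2_create_bufs_proposal {
      uint32_t           count;   /* number of buffers to create */
      uint32_t           memory;  /* V4L2_MEMORY_MMAP or USERPTR */
      struct v4l2_format format;  /* buffer size derived from this */
      uint32_t           index;   /* output: index of first new buffer */
  };

  /* Placeholder ioctl numbers, not assigned anywhere. */
  #define VIDIOC_CREATE_BUFS_P _IOWR('V', 0xf0, struct v4l2_create_bufs_proposal)
  #define VIDIOC_PREPARE_BUF_P _IOWR('V', 0xf1, struct v4l2_buffer)

  static int create_and_prepare(int fd, struct v4l2_format *fmt, unsigned count)
  {
      struct v4l2_create_bufs_proposal create = {
          .count  = count,
          .memory = V4L2_MEMORY_MMAP,
          .format = *fmt,
      };
      unsigned i;

      /* Allocate 'count' buffers sized for the given format. */
      if (ioctl(fd, VIDIOC_CREATE_BUFS_P, &create) < 0)
          return -1;

      /* Prepare each buffer (cache sync, pinning) without queuing it yet. */
      for (i = 0; i < count; i++) {
          struct v4l2_buffer buf = {
              .index  = create.index + i,
              .type   = fmt->type,
              .memory = V4L2_MEMORY_MMAP,
          };

          if (ioctl(fd, VIDIOC_PREPARE_BUF_P, &buf) < 0)
              return -1;
      }
      return 0;
  }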
* Entity information ioctl

Applications can need more information than what is provided by the media
controller entity enumeration to identify entities correctly. For
instance, a UVC entity is identified by a 16-byte GUID, which is not
reported by entity enumeration. Another issue arises when the subdev type
needs to be reported: the current types are mutually exclusive and can't
handle an entity that is, for instance, both a sensor and a lens
controller.

To solve those problems, an entity information ioctl should be added to
report static information to userspace. That ioctl should report a list
of properties (standardized by the media controller framework) in an
easily extensible way.

Action: Laurent will send an RFC.

* Subdev pool: to be continued on IRC.

* Pipeline configuration

Instead of the blanking width and height, give the total width and
height. That way the timing doesn't change when setting the output pad
format, and the output pad fills in the width and height of the active
area.

To solve cropping after scaling we need a new 'S_SCALER' ioctl to
explicitly set up the scaler output. The s_fmt on the output pad would
then not set the resolution, but instead receive it from the subdev based
on the subdev's configuration. It is not clear what the implications are,
though; this needs more thought.

Hans thinks the API to set up cropping sizes based on the scaler output
is the wrong way round. Laurent disagrees. A suggested option is to have
a command to calculate the crop based on a scaler output size. Perhaps a
userspace library? Much discussion, but it is too late and the topic is
too complex to finish.

* Action points

Item 1: compressed format API

* Controls for decoding/encoding. Action: Kamil will make a list of
  controls and send it to linux-media. Also prepare an RFC with proposed
  fourcc values for compressed streams. For H.264 and VC1 there should be
  two separate fourccs - one for raw frames and one for ES. Check the
  DivX formats and how the particular versions differ. Short RFC about
  joining the codec and effect classes.

* UVC and 'H.264 inside MJPEG'. Action: Laurent will check if the 'H.264
  inside MJPEG' format (width/height) of the embedded streams is
  identical to the main stream.

* Do we need a flag to signal m2m devices? Action: Hans will look at this
  in relation to the VIDIOC_S/G_PRIORITY ioctls since those are
  meaningless with m2m devices.

Item 2: Small architecture enhancements

* Acquiring subdevs from other devices using the subdev pool. No
  conclusion reached. To be continued on IRC with Tomasz.

* Introducing a subdev hierarchy. Approved. Action: Tomasz can proceed
  with this. If this needs to be exposed to the MC, then the MC needs to
  be modified to know about the parent-children relationship.

* Allow per-filehandle control handlers. Approved. Action: Hans will
  implement this as part of the control events implementation.

* How to control a framebuffer device as a V4L2 sub-device? Fixed with
  the container_of construction (see the sketch after this list).

* Which interface is better for the Mixer of Exynos, frame buffer or
  V4L2? Action: Marek will investigate whether it is possible to make a
  generic 'FB on top of V4L2' solution using vb2.

* Entity information ioctl. Action: Laurent will make an RFC with a
  proposed solution. The idea is to provide a list of read-only
  'properties' or 'attributes' for an entity.
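A minimal sketch of the container_of construction referenced in the
framebuffer sub-device item above. The mydrv_* names are invented for the
example and do not refer to an existing driver; only struct v4l2_subdev,
the s_stream() video op and container_of() itself are existing kernel
API.

  #include <linux/kernel.h>
  #include <linux/io.h>
  #include <media/v4l2-subdev.h>

  /* The driver state embeds the subdev, so subdev callbacks can recover
   * the driver structure with container_of(). */
  struct mydrv_fb_device {
      struct v4l2_subdev sd;      /* embedded sub-device */
      void __iomem *regs;         /* framebuffer registers, etc. */
  };

  static inline struct mydrv_fb_device *to_mydrv(struct v4l2_subdev *sd)
  {
      return container_of(sd, struct mydrv_fb_device, sd);
  }

  static int mydrv_s_stream(struct v4l2_subdev *sd, int enable)
  {
      struct mydrv_fb_device *fb = to_mydrv(sd);

      /* Illustrative only: poke a register to start or stop scanout. */
      writel(enable ? 1 : 0, fb->regs);
      return 0;
  }

  static const struct v4l2_subdev_video_ops mydrv_video_ops = {
      .s_stream = mydrv_s_stream,
  };

  static const struct v4l2_subdev_ops mydrv_subdev_ops = {
      .video = &mydrv_video_ops,
  };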
Item 3: Pipeline configuration, cropping and scaling

* Cropping/Composing. Action: RFC from Samsung suggesting a
  VIDIOC_S_EXTCROP and VIDIOC_S_COMPOSE.

* Pipeline configuration. No resolution. The suggestion to let the subdev
  set the width + height on output pads did not reach a final conclusion.
  For scaling + clipping an S_SCALER (or something like that) was
  proposed. Other option: a 'transaction' type configuration. Also not
  concluded is how to validate/try configurations of a subdev with
  multiple inputs/outputs/transformations that all depend on one another.
  Action: Hans will think about this more. To be discussed on the
  list/IRC.

Item 4: HDMI receiver/transmitter API support

* Overall approval of the EDID/hotplug/etc. handling. Samsung agrees that
  the CEC data should be handled in userspace. Action: Martin/Hans:
  update the APIs to the subdev pad API, incorporate the comments made,
  and send a new RFC. HDMI colorspace handling needs to be discussed
  further on the list.

Item 5: Sensor/Flash/Snapshot functionality

* Sensor blanking/pixel-clock/frame-rate settings. Use the total
  width/height instead of the blanking margins (see the timing sketch at
  the end of these notes). Action: Laurent can incorporate this when he
  adds this support.

* Synchronising parameters (e.g. exposure time and gain) on given frames.
  Will use private ioctls for this until we know how common this is.
  Tentative Action: RFC from Nokia.

* Flash synchronisation. Flash triggering and flash-frame metadata (see
  also below) are considered hw-specific. Each hw should have its own
  implementation. Common flash attributes, however, can be designed as
  common controls. Action: RFC from Sakari.

* Frame metadata. Up to the driver, based on hw capabilities. Where
  possible the source of the metadata should parse it (since only the
  source knows how to handle the contents). Tentative Action: RFC from
  Nokia.

* Multiple video buffer queues per device. The proposed solution is to
  keep one queue, and instead add three new ioctls: CREATE_BUFS,
  DESTROY_BUFS and PREPARE_BUF. The first two make it possible to pass
  buffer sizes/formats when creating buffers and to release them when no
  longer needed. The last prepares the buffer memory for use by the
  driver without actually queuing it. Note: userptrs will be passed to
  CREATE_BUFS, which will also pin them in memory where possible. Action:
  Guennadi: RFC and a guesstimate of the impact on vb2 and drivers.

Item 6: Buffer pools (related to Linaro activities)

* Action: All: make a list of requirements by March 30th. Post it on the
  linux-media mailing list.

* Action: All: after that, discuss the various solutions in view of the
  requirements. It would be good if we could have most of this done
  before the ELC starts (April 11).
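For reference, a small sketch of the 'total width/height' timing model
mentioned under pipeline configuration and sensor blanking above: with a
fixed pixel clock, the frame period depends only on the totals, so
changing the active area does not change the timing as long as the totals
stay the same. The numbers and names below are made up for illustration.

  #include <stdio.h>

  static double frame_rate(double pixel_rate_hz,
                           unsigned total_width, unsigned total_height)
  {
      /* One frame takes total_width * total_height pixel clock cycles. */
      return pixel_rate_hz / ((double)total_width * total_height);
  }

  int main(void)
  {
      unsigned active_w = 2592, active_h = 1944;  /* 5MP active area */
      unsigned total_w = 2844, total_h = 2059;    /* active + blanking */
      double pixel_rate_hz = 146880000.0;         /* 146.88 MHz pixel clock */

      printf("h blanking: %u, v blanking: %u\n",
             total_w - active_w, total_h - active_h);
      printf("frame rate: %.2f fps\n",
             frame_rate(pixel_rate_hz, total_w, total_h));
      return 0;
  }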