So I asked on #wayland whether it's possible to implement a software renderer, and Kristian Høgsberg (krh) responded that the master branch of the weston repo has a nice renderer abstraction, so it's possible to implement a pixman renderer. So I did :)
|weston with x11-backend and pixman renderer|
krh and pq from #wayland were very helpful and explained where the renderer fits in the Wayland architecture; here are some points:
- the renderer just renders surfaces in the order passed by the compositor (compositor::surfaces_list)
- each surface has an opaque region (surface::opaque) and a non-opaque region; for performance it makes sense to composite them with different operators, PIXMAN_OP_SRC and PIXMAN_OP_OVER respectively
- the surface::opaque region is in surface coordinates, while the damage region is in global coordinates, so the opaque region has to be translated into global coordinates before compositing
I've also added MIT-SHM support to the x11-backend to test the pixman renderer; it's activated by passing the "--use-shm" argument to weston.
I plan to add support for 16bpp formats to wayland (currently it supports only 32bpp) and to implement an fbdev backend. As you may have guessed, I then want to try wayland on my PDAs: the Zipit Z2 and my iPAQs :)
As usual, the code is on github; I'll submit it upstream tomorrow.