[Date Prev][Date Next][Thread Prev][Thread Next][Date Index][Thread Index]

Re: [E-devel] Subtitles in Emotion ?

Something interesting for subtitles is the 'halo' around the text, like this:


It is sometimes necessary to have that halo because the text can be unreadable if the background color is close to the text color.

I don't know if it is easily doable with evas or not.
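If I'm reading the Evas text API right, this should be doable without touching FreeType directly: Evas text objects have style modes (outline, soft outline, glow, shadow, ...) and separate color setters for the outline. A minimal sketch, assuming `evas` is an existing Evas canvas (not tested here):

```c
/* Sketch: a subtitle text object with a soft outline ("halo"),
 * assuming `evas` is an already-created Evas canvas. */
Evas_Object *txt = evas_object_text_add(evas);

evas_object_text_font_set(txt, "Sans", 24);
evas_object_text_text_set(txt, "Hello, subtitles!");

/* White text with a dark halo so it stays readable on any background. */
evas_object_color_set(txt, 255, 255, 255, 255);
evas_object_text_style_set(txt, EVAS_TEXT_STYLE_SOFT_OUTLINE);
evas_object_text_outline_color_set(txt, 0, 0, 0, 255);

evas_object_show(txt);
```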


PS: that example was done with FreeType.

On Sun, 29 Oct 2006, Simon TRENY wrote:

On Sun, 29 Oct 2006 10:13:15 +0900,
Carsten Haitzler (The Rasterman) <raster@rasterman.com> wrote :

On Sat, 28 Oct 2006 22:44:07 +0200 Charles de Noyelle
<mansuetus@spontex.org> babbled:


My post is about .srt subtitles in Emotion.

I tried to look into the code, and I *think* that nothing is done
to read subtitles. What I expected and looked for is something like:

EAPI void emotion_object_{subtitles,spu}_file_set ( Evas_Object
*obj, const char *filename )

I saw "emotion_object_spu_channel_count(Evas_Object *obj)", but
I could not figure out what it was used for... DVD subtitles?

yes - dvd subtitles are handled by libxine as video overlays - they
do work if turned on. there is no way to use external files.

The thing is, that needs some changes in the Emotion EAPI, and maybe
such a thing is not wanted (reading subtitles in Emotion).

What do you think about it? Would that be easy to do? Is it

i suggest first looking at the gstreamer and xine APIs to see if you can
put it there, as the subtitles also need to be timed to the video.

I implemented .srt support for eclair some time ago and it was
working quite well, but it is now broken since it still uses the old
textblock API. I know that xine's API allows loading external
subtitles, and gstreamer's probably does too, but those subtitles
won't look as good as if we used an Evas text object to render them:
xine and gstreamer render the subtitles directly into the YUV data, so
when the video is stretched, the subtitles are stretched too. With an
Evas text object, we could also make the subtitles transparent, glow,
cast a shadow, ... So it would probably be a good idea to make emotion
render its own subtitles.

Now, there are several problems that I met when I implemented that in
eclair. First, the text encoding: Evas requires UTF-8, whereas .srt
files don't seem to follow a standard: sometimes they use 8-bit
ASCII, sometimes UTF-8, ..., and the encoding is not specified in the
files. So we would need to be able to auto-detect the encoding of a
text. Is there a way to do that? Also, I noticed that when you render
incorrectly encoded UTF-8 text with Evas, Evas only renders the text up
to the first invalid character; the following characters are not
rendered. It might be better to just ignore the invalid character and
continue rendering the others (I don't know if it's easy to do, though;
I just know that GTK and Qt do that, so it should be possible :))
Another problem: there are a lot of subtitle formats (srt, ssa, txt, ...),
so we would need a lot of loaders if we want to support them all. .srt
seems to be the most used by far (and it is really easy to load), so we
should probably start with it.

I will try to add .srt support to emotion this weekend.