
Friday, October 2, 2009

The Life Cycle of Multimedia Metadata

During its lifetime, multimedia content undergoes several stages, from production to consumption: content is created, processed or modified in a postproduction stage, delivered to users, and finally consumed. Metadata, or descriptive data about the multimedia content, pass through similar stages but on different time lines.1 Metadata may be produced, modified, and consumed by all actors involved in the content production-consumption chain (see the “Life-Cycle Spaces” sidebar for more information). At each step of the chain, different kinds of metadata may be produced, by widely different methods and with substantially different semantic value.
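The stages of this chain, and the way each stage contributes its own metadata, can be sketched in code. The following is a minimal illustration, not part of the CODAC system; the stage names mirror the chain described above, while the field names and example values are assumptions.

```python
from dataclasses import dataclass, field
from enum import Enum

# Stages of the content production-consumption chain described above.
class Stage(Enum):
    PRODUCTION = "production"
    POSTPRODUCTION = "postproduction"
    DELIVERY = "delivery"
    CONSUMPTION = "consumption"

@dataclass
class MetadataItem:
    stage: Stage   # which stage of the chain produced this record
    kind: str      # e.g. "singer" (production) or "variation" (postproduction)
    value: str

@dataclass
class ContentItem:
    title: str
    metadata: list[MetadataItem] = field(default_factory=list)

    def metadata_for(self, stage: Stage) -> list[MetadataItem]:
        """Return the metadata contributed at a given life-cycle stage."""
        return [m for m in self.metadata if m.stage == stage]

video = ContentItem("music video")
video.metadata.append(MetadataItem(Stage.PRODUCTION, "singer", "Avril Lavigne"))
video.metadata.append(MetadataItem(Stage.POSTPRODUCTION, "variation", "low-bitrate copy"))
print([m.kind for m in video.metadata_for(Stage.PRODUCTION)])  # ['singer']
```

The point of the sketch is that each actor appends records tagged with its own stage, so later stages can query exactly the metadata they need.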
Different metadata let us tie the multimedia processes in a life cycle together. Before they can be employed, however, these metadata must be appropriately generated. The CODAC project, led by Harald Kosch, implements different multimedia processes and ties them together in the life cycle. CODAC uses distributed systems to implement multimedia processes; Figure 1 gives an architectural overview of this system.
The project’s core component is a Multimedia Database Management System (MMDBMS),2 which stores content and MPEG-7-based metadata.3 It communicates with a streaming server for data delivery. The database is realized in the Multimedia Data Cartridge (MDC), an extension of the Oracle Database Management System, to handle multimedia content and MPEG-7 metadata.

Use case scenario

To demonstrate metadata’s life cycle, let’s consider a use case scenario. A user watches an interesting music video on a colleague’s screen and wants to retrieve the same music video. The only information she retained is that the singer in the music video was Avril Lavigne, and she remembers the song’s melody. She can’t ask her colleague directly, so she wants to access it from a multimedia database.
The query service of our Multimedia Data Cartridge (MDC) offers a solution for finding such a music video. First, the user can enter a query by thematic means, specifying that the music video’s singer is Avril Lavigne (see Figure 2a, p. 80, for the query interface). In response to this first query, the service returns several music videos that pertain to the singer. To narrow the search, the user can hum the melody into a query-by-humming service, as Figure 2b shows.
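The two-step search can be sketched as follows. This is a stand-in, not the actual MDC query API: the catalogue, the melody strings, and the use of a generic string-similarity ratio in place of a real melody-matching algorithm are all assumptions for illustration.

```python
from difflib import SequenceMatcher

# Toy catalogue standing in for the MPEG-7 metadata store.
catalogue = [
    {"singer": "Avril Lavigne", "title": "Skater Boy",  "melody": "CDEFGA"},
    {"singer": "Avril Lavigne", "title": "Complicated", "melody": "EGABCD"},
    {"singer": "Other Artist",  "title": "Some Song",   "melody": "CDEFGA"},
]

def thematic_query(singer: str) -> list[dict]:
    """First step: restrict the result set by a descriptive metadata field."""
    return [v for v in catalogue if v["singer"] == singer]

def hum_refine(candidates: list[dict], hummed: str) -> dict:
    """Second step: rank the candidates by similarity to the hummed melody."""
    return max(candidates,
               key=lambda v: SequenceMatcher(None, v["melody"], hummed).ratio())

hits = thematic_query("Avril Lavigne")   # several videos by the singer
best = hum_refine(hits, "CDEFG")         # humming narrows the result down
print(best["title"])                     # Skater Boy
```

The design point is the ordering: the cheap thematic filter runs first over descriptive metadata, and the expensive content-based match only ranks the remaining candidates.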
The multimedia database retrieves information on the music video that meets the query request and delivers it to the user. Such information includes the full title, full information on the singer, the production date, and so on, and finally the address of the media server where the user can obtain the video. Thus, the user finds out that the song she searched for is entitled “Skater Boy” and can now access the music video.
Luckily, the user is registered with the media server storing the video and can request the music video from this server. In the video request, the user also specifies her mobile device’s terminal capabilities. Unfortunately, the media server only has copies of the music video in qualities that don’t meet the terminal constraints. The server therefore examines metadata generated in postprocessing of the video, generates a variation of the video with the best possible quality satisfying the constraints, and delivers this variation.
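The server’s selection step amounts to a constrained maximization: among the variations described by the postprocessing metadata, pick the highest-quality one that still fits the terminal. The sketch below illustrates this; the field names and numbers are assumptions, not MPEG-7 terminology.

```python
# Variations of one video, as they might be described by postprocessing metadata.
variations = [
    {"bitrate_kbps": 2000, "width": 1280, "quality": 0.95},
    {"bitrate_kbps": 800,  "width": 640,  "quality": 0.80},
    {"bitrate_kbps": 300,  "width": 320,  "quality": 0.60},
]

def best_variation(variations, max_bitrate_kbps, max_width):
    """Highest-quality variation that satisfies the terminal capabilities."""
    feasible = [v for v in variations
                if v["bitrate_kbps"] <= max_bitrate_kbps
                and v["width"] <= max_width]
    if not feasible:
        return None  # nothing fits; the server would have to transcode further
    return max(feasible, key=lambda v: v["quality"])

# A mobile terminal that accepts at most 1000 kbps and 640-pixel-wide video:
choice = best_variation(variations, max_bitrate_kbps=1000, max_width=640)
print(choice["width"])  # 640
```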
Let’s further assume that the request goes through an authorized proxy cache that checks whether a cached copy is present. If a copy is there, but not in the appropriate quality, the proxy cache uses delivery-related metadata, which describe how the video can be adapted to resource constraints, to adapt the cached video accordingly.