FAQ

General questions

  1. The digital rights information for an item is incorrect. How can I inform the consortium about the issue?

    If you find an incorrectly stated rights statement, please open a helpdesk ticket or contact disci.byzart@unibo.it.

  2. How do I apply the rights statement to my metadata?

    First, consult the Europeana Rights Statements and the Creative Commons public domain licence information. If you still have questions, open a helpdesk ticket or contact disci.byzart@unibo.it.

  3. I want to use BYZART-sourced material to create a presentation. What should I do?

    Please check the copyright information and make sure the licence allows you to use the content for your intended purpose.

  4. What are the digitization requirements for material to be added to the Byzart collection?

    You can read the digitization overview document under the Helpdesk – How it works section.

  5. I need practical help with the digitization process for video, audio or images. Can you recommend a tutorial?

    The Byzart Helpdesk – How it works section provides video guides to those processes for all media types. Also see later sections of this FAQ.

  6. I want to see how holographic content is displayed on my pyramid-shaped display. Where can I find it?

    Sample holographic videos are available in the Byzart Helpdesk – How it works section. To view them:
    Download the holographic video to avoid streaming delays.
    Make sure the video is displayed in full-screen mode on your device (laptop or phone).
    Place the device screen facing upwards, with the tip of the pyramid pointing at the centre of the screen.
    Position the setup at eye level and press play to watch the video.

  7. What do the metadata contain?

    Within this project we constructed a dedicated metadata format to appropriately model the content requirements. The main goal of the BYZART format is to contain all the information needed to accurately describe an object of cultural heritage from the Byzantine period, while strictly adhering to the rules of the Europeana Data Model (EDM). The format is documented under the Helpdesk -> Metadata modelling section, in the general instructions under the “Byzart metadata format” link. A table-based version is also available under the “table format of the Byzart metadata format” option.
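
    For orientation only, the sketch below shows the kind of fields an EDM-aligned record groups around a cultural heritage object. The property names are generic EDM/Dublin Core terms and the values are invented placeholders; the authoritative field list is the Byzart metadata format documentation mentioned above.

      # Illustrative only: generic EDM/Dublin Core properties with placeholder values,
      # not the official Byzart metadata format. Consult the Helpdesk -> Metadata
      # modelling section for the authoritative field list.
      sample_record = {
          "dc:title": "Mosaic fragment (placeholder item)",
          "dc:description": "Detail of a Byzantine-period wall mosaic.",
          "dc:creator": "Unknown",
          "dcterms:created": "6th century AD",
          "dcterms:medium": "Glass tesserae",
          "edm:type": "IMAGE",
          "edm:rights": "http://creativecommons.org/publicdomain/mark/1.0/",
          "edm:dataProvider": "Example partner institution",
          "edm:isShownAt": "https://example.org/items/12345",
      }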

  8. Can I freely use the BYZART metadata construct?

    Of course, we invite others to use the metadata construct. If you find it useful, please send us an e-mail!

  9. How can I use the communication (ticket) form under Helpdesk?

    You can use it for general enquiries, to get in touch about becoming a partner, to ask questions about the CMC system, and to submit questions about intellectual property rights relating to our content.

Technical & digitization questions

  1. I am a total beginner. I don’t understand how formats, containers and codecs work. What is lossy and lossless compression? Can you explain the basics?

    A media file, like all other files, has a name and an extension. It also has a container format, one or more media streams (also known as tracks) and metadata.

    The container format usually corresponds to the file extension, so for example AVI, MOV, AAC, MP3 and MKV are both file extensions and container formats. The container provides the “box” in which the media data is stored.
    A stream, also known as a track, is a distinct medium stored inside a container using a codec. For example, video files usually have both a video and an audio stream, and some have multiple video streams with different angles of the same scene. The player application is responsible for decoding the content and for letting the viewer access the streams and choose among them.
    The codec, also known as the encoding format, is the algorithm used to encode the (usually digitized analog) video and audio data so computers can store and replay it later. Examples are FFV1, Dolby Digital (AC-3, etc.), the ITU standards (G.719, G.722, etc.), the MPEG audio layers 1 to 3, AAC, etc. (FFMPEG, mentioned later in this FAQ, is a software suite that implements many such codecs, not a codec itself.)
    The metadata is information that pertains to the media data but is not itself part of it. For example, the music genre, language, chapter markers and subtitles in a video file are metadata. Metadata is stored according to specifications defined by the container format.
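
    To see these concepts in practice, you can inspect any media file and list its container, streams, codecs and metadata. A minimal sketch using ffprobe from the FFMPEG suite (assuming it is installed; “sample.mkv” is a placeholder filename):

      # Sketch: list the container format, the streams with their codecs, and the
      # metadata tags of a media file using ffprobe (part of the FFMPEG suite).
      import json
      import subprocess

      result = subprocess.run(
          ["ffprobe", "-v", "quiet", "-print_format", "json",
           "-show_format", "-show_streams", "sample.mkv"],
          capture_output=True, text=True, check=True,
      )
      info = json.loads(result.stdout)

      print("Container:", info["format"]["format_name"])      # e.g. "matroska,webm"
      for stream in info["streams"]:
          print(stream["codec_type"], "stream encoded with:", stream["codec_name"])
      print("Metadata tags:", info["format"].get("tags", {}))  # e.g. title, language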

    A stream in a media file can be compressed or uncompressed. Compression is a method of reducing the size of the media data to make storage easier. Uncompressed media data, especially video, is typically huge, especially if the physical digitizing equipment (camera, scanner, audio card, etc.) is very high fidelity. Most codecs use compression to make media storage practical.
    Compression can be lossless or lossy. Lossless compression means that the compression algorithm preserves the original content exactly as it was after decompression, so lossless codecs lose no quality at all when they are used to store media. Lossy encoding is the opposite: it throws away information that is deemed less important in order to reduce the size of the stored medium. For example, lossy audio codecs try to preserve the audio frequencies that the human ear hears well and discard data for other frequencies. Lossy video codecs reduce the fidelity of each frame, sometimes to the point where a human can see artifacts (e.g. blurring). The choice of lossy codec depends on a multitude of factors, including the projected physical display dimensions, the intended use of the file being created, etc. Lossy codecs tend to have many more settings than lossless ones, since choosing what to “throw away” typically differs for each application.
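
    To make the difference concrete, the sketch below (assuming ffmpeg is installed; “master.wav” is a placeholder for a 16-bit PCM recording of yours) encodes the same audio both losslessly (FLAC) and lossily (AAC), compares the file sizes and checks that only the FLAC copy decodes back to exactly the same samples:

      # Sketch: lossless (FLAC) vs lossy (AAC) encodes of the same 16-bit source.
      import hashlib
      import os
      import subprocess

      def encode(*args):
          subprocess.run(["ffmpeg", "-y", "-v", "quiet", *args], check=True)

      def pcm_hash(path):
          # Decode to raw 16-bit PCM on stdout and hash the samples themselves.
          out = subprocess.run(["ffmpeg", "-v", "quiet", "-i", path, "-f", "s16le", "-"],
                               capture_output=True, check=True)
          return hashlib.sha256(out.stdout).hexdigest()

      encode("-i", "master.wav", "-c:a", "flac", "lossless.flac")
      encode("-i", "master.wav", "-c:a", "aac", "-b:a", "192k", "lossy.m4a")

      print("Sizes (bytes):", os.path.getsize("lossless.flac"), os.path.getsize("lossy.m4a"))
      print("FLAC decodes identically:", pcm_hash("lossless.flac") == pcm_hash("master.wav"))
      print("AAC decodes identically: ", pcm_hash("lossy.m4a") == pcm_hash("master.wav"))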

    The main choice to be made when digitizing a medium is the encoding to use, since this determines the quality, size and fidelity of the stored file compared to the original medium. A secondary consideration is the container format, which must be one that the software used in the project supports. Byzart provides a brief document with recommendations for digitization in the helpdesk how-to section.

  2. What are the Byzart recommendations for codecs and containers?

    Byzart recommendations are the following:
    video: the Matroska container (file extension: MKV) with the FFV1 lossless codec or the h265/HEVC lossy codec
    audio: the Matroska container (file extension: MKV) with the FLAC lossless codec or the AAC lossy codec
    images: the WebP or TIFF containers (file extensions: WEBP, TIFF). For lossless encoding, use the TIFF deflate compression setting or lossless WebP. For lossy encoding, use the TIFF deflate compression setting or (preferred) lossy WebP with a quality setting between 60% and 90%, depending on the dimensions of the image. Large images may allow for lower quality settings, depending on the actual content. If in doubt, use 90%.

    Byzart provides a brief document with recommendations for digitization in the helpdesk how-to section.
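
    As a hedged sketch of what these recommendations look like with the command-line FFMPEG suite (XMedia Recode exposes equivalent options in its interface; the filenames and the quality/bitrate values are illustrative):

      # Sketch: encode placeholder source files into the recommended Byzart
      # codecs/containers with the ffmpeg command-line tool.
      import subprocess

      def ffmpeg(*args):
          subprocess.run(["ffmpeg", "-y", *args], check=True)

      # Video: Matroska (MKV) with lossless FFV1, or lossy h265/HEVC.
      ffmpeg("-i", "capture.avi", "-c:v", "ffv1", "-level", "3", "-c:a", "flac", "video_master.mkv")
      ffmpeg("-i", "video_master.mkv", "-c:v", "libx265", "-crf", "20", "-c:a", "aac", "video_access.mkv")

      # Audio: Matroska (MKV) with lossless FLAC, or lossy AAC.
      ffmpeg("-i", "recording.wav", "-c:a", "flac", "audio_master.mkv")
      ffmpeg("-i", "recording.wav", "-c:a", "aac", "-b:a", "192k", "audio_access.mkv")

      # Images: lossless WebP, or lossy WebP with a quality setting of 60-90%.
      ffmpeg("-i", "scan.tiff", "-c:v", "libwebp", "-lossless", "1", "image_master.webp")
      ffmpeg("-i", "scan.tiff", "-c:v", "libwebp", "-quality", "90", "image_access.webp")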

  3. What software should I use to encode media files for Byzart?

    Assuming you have a file captured from a hardware device and wish to prepare it for upload to Byzart, the recommended software is XMedia Recode, mainly for its FFV1 support. Any frontend to FFMPEG should be fine, as long as it can encode/decode FFV1, h265/HEVC, FLAC and AAC into/from the Matroska (MKV) container. For image acquisition from scanner devices it’s best to use GIMP, the GNU Image Manipulation Program.

    There are video tutorials available for the installation and use of these programs in the Byzart Helpdesk how-to section.

  4. I am overwhelmed by the settings in my scanner/digitizer/video camera/audio card. What are sample rate, resolution, FPS and DPI?

    When converting a medium from analog to digital, hardware like an audio capture card needs to break the analog information (think of it as a wave) down into samples that can be stored in a computer (think of this as converting the smooth curves of a wave into jagged edges). Obviously this changes the shape of the analog data, i.e. data is lost when digitizing.

    The sample rate is how often the device takes a “snapshot” of the analog medium. The more often this happens, the higher the sample rate and the better the fidelity (and the larger the size) of the digital copy of the analog data. High sample rates are therefore preferred; however, there is a limit to how fast our senses can perceive the analog world around us. Since our eyes sample the world around us at a rate of roughly 60-80 samples per second (60 Hz-80 Hz), it makes little sense to store video data at sample rates higher than 80 Hz. As an additional example, audio on CDs is sampled at 44.1 kHz, while studio recordings rarely use less than 96 kHz.
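
    As a quick worked example of what the sample rate means for data size, uncompressed CD-quality audio works out to roughly 10 MB per minute before any codec is applied:

      # Worked example: raw data rate of CD-quality audio
      # (44.1 kHz sample rate, 16 bits per sample, 2 channels).
      sample_rate_hz = 44_100
      bits_per_sample = 16
      channels = 2

      bytes_per_second = sample_rate_hz * (bits_per_sample // 8) * channels
      print(bytes_per_second)                    # 176400 bytes/s (~172 KiB/s)
      print(bytes_per_second * 60 / 1_000_000)   # ~10.6 MB per minute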

    The resolution is how detailed the “snapshot” each device takes of the analog medium is. So, a “4K camera” provides less information about what its lens is looking at than an “8K camera”. “4K” refers to the horizontal resolution of the image taken by a camera. “HD” content has a resolution of 1920 points horizontally and 1080 points vertically (1920×1080) and is also described as “1080p”. Clearly, we prefer higher resolutions since they allow us to zoom into the image after digitization and make out more of the detail in the original.

    The FPS is the “frames per second” of a video file. Without getting into the complexity of film frame rates, we can safely say that the FPS of a video determines how smooth its playback is. A low-FPS video will look like a series of photographs instead of a video. Typically, 25 FPS is the absolute minimum that should be used for video, and 30 FPS is a safer practical minimum. FPS is directly related to the sample rate of the camera, so a 60 Hz camera capture should never be encoded at more than 60 FPS: it would be pointless, since there is no extra information beyond 60 frames per second to encode. Conversely, encoding it at 30 FPS will halve the “smoothness” of the video compared to the original.
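
    Resolution and FPS together determine how much raw data a video contains, which is why compression is unavoidable. A quick worked example for uncompressed HD video at 25 FPS, assuming 24-bit color (3 bytes per pixel):

      # Worked example: raw (uncompressed) data rate of 1080p HD video at 25 FPS,
      # assuming 3 bytes (24-bit color) per pixel.
      width, height = 1920, 1080
      fps = 25
      bytes_per_pixel = 3

      bytes_per_second = width * height * bytes_per_pixel * fps
      print(bytes_per_second / 1_000_000)    # ~155.5 MB per second
      print(bytes_per_second * 3600 / 1e9)   # ~560 GB per hour of footage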

    The DPI is the “dots per inch” of an image file. It is also used for video and physical displays, as it describes the number of dots (pixels, points) per inch of the video/device. Obviously, this is directly related to resolution: a media file containing very high resolution video displayed on a low-DPI monitor will look much worse than on a high-DPI monitor. In practical use, DPI matters most when scanning images. Typical scanner settings are 150, 300, 600 and 1200 DPI. Generally, 150 is recommended for computer viewing without zoom, 300 for general use, 600 if printing the digitized image is required, and 1200 for high-fidelity storage of complex content (e.g. wide-angle photos, 360° content, etc.).
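
    The DPI setting translates directly into pixel dimensions when scanning. A worked example for an A4 page (8.27 × 11.69 inches) at the typical settings listed above:

      # Worked example: pixel dimensions produced by scanning an A4 page
      # (8.27 x 11.69 inches) at the typical scanner DPI settings.
      width_in, height_in = 8.27, 11.69
      for dpi in (150, 300, 600, 1200):
          px_w, px_h = round(width_in * dpi), round(height_in * dpi)
          megapixels = px_w * px_h / 1_000_000
          print(f"{dpi} DPI: {px_w} x {px_h} pixels (~{megapixels:.1f} MP)")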

    The fidelity of the digitized content to the original analog medium is a combination of both resolution and sample rate. Byzart provides a brief document with recommendations for digitization in the helpdesk how-to section.

  5. I want to touch up my files before uploading them to Byzart. Do you have any recommendations?

    For images, typical touch-up includes histogram, color and brightness adjustments, as well as possibly actual editing of the image. This could be, for example, blurring the areas that surround the main theme of a photo, such as an icon. For these tasks any image editing program will do, as long as it can save/export to lossless/lossy WebP and TIFF. If your favourite program does not support WebP, you can save to PNG and then convert with a separate converter program. We recommend GIMP as a free, open-source solution.

    For video, touch-up is a much more complicated proposition. If it only involves cropping or resizing, most video encoder/transcoder programs will do this as part of the encoding, and some also provide common filters such as deinterlacing, brightness adjustment or letterboxing. However, to actually modify the content of a video you need a video editing suite, and you must decide on an intermediate format if FFV1 is not supported natively by your software. We do not offer recommendations on editing software, but make sure that your intermediate format does not resize your video (including upscaling). Some intermediate formats do not allow custom resolutions, so you must take this into account before actually capturing information (taking photos or video) and adjust your device resolution and sample rates accordingly beforehand.

    For audio, things are simpler. Touch-up is usually required for most sources when digitizing, because normalization has to take place so that the items in a library/collection share a common volume level. We recommend Audacity for audio editing as it’s free and open source.

    Also remember that encoders allow you to choose which streams to mux into the final media file. This means that if you have a video file and only want to modify/touch up the audio track, you can split the audio out into a separate file, edit it using your audio editing program, and then remux (“repackage”) it with the original video. For the Matroska container format that we recommend, MKVToolNix provides a muxing/demuxing tool that lets you easily split/join streams in MKV files.
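
    A sketch of that split/edit/remux round trip using the ffmpeg command line (MKVToolNix offers the same operations through its own tools); all filenames are placeholders:

      # Sketch: extract the audio from a video for separate editing, then remux the
      # edited audio with the untouched video stream (the video is not re-encoded).
      import subprocess

      def ffmpeg(*args):
          subprocess.run(["ffmpeg", "-y", *args], check=True)

      # 1. Demux: decode the audio track to WAV so it can be edited (e.g. in Audacity).
      ffmpeg("-i", "video.mkv", "-map", "0:a:0", "-vn", "audio.wav")

      # 2. ...edit/normalize audio.wav in your audio editor, save as audio_edited.wav...

      # 3. Remux: copy the original video stream as-is and add the edited audio as FLAC.
      ffmpeg("-i", "video.mkv", "-i", "audio_edited.wav",
             "-map", "0:v", "-map", "1:a", "-c:v", "copy", "-c:a", "flac",
             "video_remuxed.mkv")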

    Batch conversion/processing is also very practical. If you need to normalize the audio of hundreds of video files, it is best to use command-line tools, typically from the FFMPEG suite. There are also batch conversion programs for images, audio and video available online for all experience levels, many of them free.
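
    A minimal batch-processing sketch along those lines, assuming ffmpeg is installed: it walks a folder of MKV files (folder names are placeholders) and normalizes each audio track with the FFMPEG loudnorm filter while copying the video stream untouched:

      # Sketch: batch-normalize the audio of every MKV file in a folder with the
      # FFMPEG "loudnorm" filter, leaving the video stream untouched.
      import pathlib
      import subprocess

      src = pathlib.Path("originals")
      dst = pathlib.Path("normalized")
      dst.mkdir(exist_ok=True)

      for path in sorted(src.glob("*.mkv")):
          subprocess.run(
              ["ffmpeg", "-y", "-i", str(path),
               "-c:v", "copy",        # do not re-encode the video
               "-af", "loudnorm",     # EBU R128 loudness normalization
               "-c:a", "flac",        # store the adjusted audio losslessly
               str(dst / path.name)],
              check=True,
          )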

  6. I have digitized my files, now what? How do I access this content in the best way?

    You will need a media player application. We recommend VLC, an open-source and free multiplatform media player that plays practically any kind of media file available.

    Beyond being able to access the media, you must take into account how you will project/display/play back the content. For example, watching a UHD (4K) video on a 22-inch HD (1920×1080) computer monitor will not do it justice; you should be using a higher-resolution (and probably also bigger) screen. Similarly, if you are projecting onto a 10×10 meter wall, you should be using a high-resolution projector (at least 4K) for the same video, and possibly reduce the area of your projection if artifacts are visible during playback. Conversely, if you play back the same 4K video on a small 15-inch laptop screen with a very high resolution, all the detail will be displayed but your eyes will not be able to make it out since the screen is so small.

    VLC and other media player programs usually provide post-processing filters during playback, such as upscalers/downscalers, deinterlacers, color adjustment filters, audio normalization, etc. Make sure to use those to adapt your playback to the environment and equipment you have available. For example, if you are playing audio on a set of speakers with a very high bass response or a set of speakers with a subwoofer you should consider an audio equalizer filter to reduce the low frequencies while playing back the audio. Similarly, if your video projector is projecting onto a non-white wall, you should modify your playback color filters to regain color fidelity.

    It is best to digitize your content in multiple ways: for example, create a lossless video encoding of your content, then use that file to create a lossy, smaller and more manageable video file for presentations or quick online viewing. Always avoid re-encoding from a lossy format, as this significantly reduces quality; always re-encode from your lossless master files. Store your master files in long-term storage and keep acceptable-quality copies around for playback, especially for video. Lossily encoded videos at lower resolutions are orders of magnitude smaller than the losslessly encoded versions, especially when using the h265/HEVC codec we recommend.
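
    For example, a smaller access copy can be derived from an FFV1 master roughly like this (a sketch: the CRF value, target resolution and filenames are illustrative):

      # Sketch: derive a smaller lossy access copy (h265/HEVC, downscaled to 1080p)
      # from a lossless FFV1 master file.
      import subprocess

      subprocess.run(
          ["ffmpeg", "-y", "-i", "master_ffv1.mkv",
           "-vf", "scale=-2:1080",            # downscale to 1080p, keep aspect ratio
           "-c:v", "libx265", "-crf", "23",   # lossy, but visually good quality
           "-c:a", "aac", "-b:a", "192k",
           "access_1080p.mkv"],
          check=True,
      )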

  7. What about holographic and 360° content?

    For 360° content you should use 5K or higher resolution video as the source, since each frame will be projected very close to the viewer’s eye in a VR headset, or across a huge area around the viewer in a room setup. This means that details will be noticed, and a lack of them even more so. For holographic content (4-sided pyramid projection) you should adapt your video to the hardware available; we recommend HD video (1920×1080) at least. These recommendations are usually standard and already set as defaults in any capture device (e.g. 360° cameras). We have sample videos in the how-to section of the Byzart helpdesk.