Search results for: analog signal processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5056

3316 Hybridization of Mathematical Transforms for Robust Video Watermarking Technique

Authors: Harpal Singh, Sakshi Batra

Abstract:

The widespread and easy access to multimedia content, and the possibility of making numerous copies without significant loss of fidelity, have raised the need for digital rights management. Digital watermarking technology can effectively address this problem. The idea is to embed some data or special pattern (a watermark) in the multimedia content; this information can later prove ownership in case of a dispute, trace the marked document's dissemination, identify a misappropriating person, or simply inform the user about the rights-holder. The primary motive of digital watermarking is to embed the data imperceptibly and robustly in the host information. A large number of watermarking techniques have been developed to embed copyright marks or data in digital images, video, audio, and other multimedia objects. With the development of digital video-based innovations, copyright problems for the multimedia industry increase. Video watermarking has been proposed in recent years to address the issue of illicit copying and distribution of videos. It is the process of embedding copyright information in video bit streams. In practice, video watermarking schemes have to address serious challenges not faced by image watermarking schemes, such as real-time requirements in video broadcasting, the large volume of inherently redundant data between frames, and the imbalance between motion and motionless regions; they are also particularly vulnerable to attacks such as frame swapping, statistical analysis, rotation, noise, median filtering, and cropping. In this paper, an effective, robust, and imperceptible video watermarking algorithm is proposed based on the hybridization of powerful mathematical transforms: the Fractional Fourier Transform (FrFT), the Discrete Wavelet Transform (DWT), and Singular Value Decomposition (SVD) using a redundant wavelet. This scheme utilizes the various transforms to embed watermarks on different layers of a hybrid system.
For this purpose, the video frames are partitioned into layers (RGB), and the watermark is embedded in two forms in the video frames, using SVD partitioning of the watermark and DWT sub-band decomposition of the host video, to provide both copyright protection and reliability. The FrFT orders are used as the encryption key, which makes the watermarking method more robust against various attacks. The fidelity of the scheme is enhanced by introducing key generation and a wavelet-based key-embedding watermarking scheme. The same key is thus required for both watermark embedding and extraction, so the key must be shared between the owner and the verifier over a secure channel. The paper evaluates performance using several quantitative metrics, namely peak signal-to-noise ratio, structural similarity index, and correlation values, and also applies a range of attacks to assess robustness. Experimental results demonstrate that the proposed scheme can withstand a variety of video processing attacks while remaining imperceptible.
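The DWT-plus-SVD embedding step described in this abstract can be sketched in a few lines of NumPy. This is a minimal illustration only: it uses a hand-rolled one-level Haar DWT on a single colour layer, embeds the watermark's singular values additively with a hypothetical strength alpha, and omits the FrFT encryption stage entirely; all sizes and parameter values are assumptions, not the authors' settings.

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D Haar DWT: returns the LL, LH, HL, HH subbands."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # vertical average
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # vertical detail
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def embed_svd(subband, watermark, alpha=0.05):
    """Embed the watermark's singular values additively into a subband."""
    u, s, vt = np.linalg.svd(subband, full_matrices=False)
    sw = np.linalg.svd(watermark, compute_uv=False)
    marked = u @ np.diag(s + alpha * sw) @ vt
    return marked, s                          # original s is kept as the key

rng = np.random.default_rng(0)
frame_layer = rng.uniform(0, 255, (64, 64))   # one RGB layer of a video frame
wm = rng.uniform(0, 255, (32, 32))            # watermark, sized to match LL
ll, lh, hl, hh = haar_dwt2(frame_layer)
ll_marked, key = embed_svd(ll, wm)

# extraction: recover the watermark's singular values from the marked subband
sw_rec = (np.linalg.svd(ll_marked, compute_uv=False) - key) / 0.05
```

Keeping the original singular values as a key is what makes this a non-blind scheme, matching the abstract's statement that the same key must be shared between owner and verifier.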

Keywords: discrete wavelet transform, robustness, video watermarking, watermark

Procedia PDF Downloads 213
3315 Economic Assessment of the Fish Solar Tent Dryers

Authors: Collen Kawiya

Abstract:

In an effort to reduce post-harvest losses and improve the supply of quality fish products in Malawi, fish solar tent dryers have been designed in the southern part of Lake Malawi for processing small fish species under the Cultivate Africa's Future (CultiAF) project. This study was carried out to promote the adoption of fish solar tent dryers by the many small-scale fish processors in Malawi by assessing the economic viability of these dryers. Using the project's baseline survey data, a business model for a constructed 'ready for use' solar tent dryer was developed, investment appraisal techniques were applied, and a sensitivity analysis was conducted. The study also conducted a risk analysis using the Monte Carlo simulation technique and derived a probabilistic net present value. The investment appraisal results showed that the net present value was US$8,756.85, the internal rate of return was 62%, well above the 16.32% cost of capital, and the payback period was 1.64 years. The sensitivity analysis showed that only two input variables influenced the investment's net present value: the dried-fish selling price, which correlated positively with the net present value, and the fresh-fish buying price, which correlated negatively with it. The risk analysis showed that the chance that fish processors will make a loss from this type of investment is 17.56%, and that there is only a 0.20 probability of experiencing a negative net present value. Lastly, the study found that the net present value of the fish solar tent dryer investment remains robust across different levels of investors' risk preferences.
With these results, it is concluded that fish solar tent dryers in Malawi are an economically viable investment because they improve the returns from fish processing. Fish processors should therefore adopt them by investing in their construction and use.
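The appraisal pipeline described above (NPV, payback period, and a Monte Carlo loss probability driven by random selling and buying prices) can be sketched as follows. Every figure here is hypothetical, chosen only to show the mechanics; the study's actual cash flows, price distributions, and volumes are not reproduced.

```python
import numpy as np

def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs at time zero."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def payback_period(cashflows):
    """Years until cumulative cash flow turns positive (linear within the year)."""
    cum = np.cumsum(cashflows)
    year = np.argmax(cum >= 0)
    prev = cum[year - 1]
    return (year - 1) + (-prev) / cashflows[year]

rng = np.random.default_rng(42)
invest, years, rate = -5000.0, 5, 0.1632     # hypothetical outlay; 16.32% cost of capital

def simulate():
    sell = rng.normal(2.0, 0.3)              # dried-fish selling price (assumed)
    buy = rng.normal(0.8, 0.15)              # fresh-fish buying price (assumed)
    annual = 3000.0 * sell - 3000.0 * buy - 1200.0   # assumed volume and overhead
    return npv(rate, [invest] + [annual] * years)

sims = np.array([simulate() for _ in range(10_000)])
p_loss = (sims < 0).mean()                   # probability of a negative NPV
```

The positive dependence on the selling price and negative dependence on the buying price fall directly out of the `annual` cash-flow line, mirroring the study's sensitivity-analysis finding.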

Keywords: investment appraisal, risk analysis, sensitivity analysis, solar tent drying

Procedia PDF Downloads 260
3314 Strength Evaluation by Finite Element Analysis of Mesoscale Concrete Models Developed from CT Scan Images of Concrete Cube

Authors: Nirjhar Dhang, S. Vinay Kumar

Abstract:

Concrete is a non-homogeneous mix of coarse aggregates, sand, cement, air voids, and the interfacial transition zone (ITZ) around aggregates. Adopting these complex structures and material properties in numerical simulation leads to a better understanding and design of concrete. In this work, a mesoscale model of concrete has been prepared from X-ray computed tomography (CT) images. These images are converted into a computer model and numerically simulated using commercially available finite element software. The mesoscale models are simulated under compressive displacement. The effects of the shape and distribution of aggregates, continuous and discrete ITZ thickness, voids, and variation of mortar strength have been investigated. The CT scan of a concrete cube consists of a series of two-dimensional slices. A total of 49 slices are obtained from a 150 mm cube, giving a slice interval of approximately 3 mm. Because CT scanning is non-destructive, the same cube can be scanned and later tested in compression in a universal testing machine (UTM) to find its strength. The image processing and extraction of mortar and aggregates from the CT scan slices are performed in Python. The digital colour image consists of red, green, and blue (RGB) pixels. The RGB image is converted to a black-and-white (BW) image, and the mesoscale constituents are identified from the 0-255 pixel values. A pixel matrix is created for modeling the mortar, aggregates, and ITZ. Pixels are normalized to a 0-9 scale reflecting relative strength: zero is assigned to voids, 4-6 to mortar, and 7-9 to aggregates, while values between 1-3 identify the boundary between aggregates and mortar. In the next step, triangular and quadrilateral elements for plane stress and plane strain models are generated, depending on the option chosen.
Material properties, boundary conditions, and the analysis scheme are specified in this module. Responses such as displacement, stresses, and damage are evaluated in ABAQUS by importing the input file. This simulation evaluates the compressive strengths of the 49 slices of the cube. The model is meshed with more than sixty thousand elements. The effects of the shape and distribution of aggregates, the inclusion of voids, and variation of ITZ layer thickness on load-carrying capacity, stress-strain response, and strain localization of concrete have been studied. The plane strain condition carried more load than the plane stress condition due to confinement. The CT scan technique can be used to obtain slices from concrete cores taken from an actual structure, and digital image processing can be used to determine the shape and content of aggregates in the concrete. This may then be compared with test results of concrete cores and used as an important tool for strength evaluation of concrete.
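The pixel-classification step of the abstract (normalize grayscale values to a 0-9 scale, then assign voids, ITZ, mortar, and aggregate) can be sketched directly in NumPy. The thresholds follow the scale stated in the abstract; the tiny example slice and the direct grayscale-to-strength mapping are illustrative assumptions, not the authors' calibration.

```python
import numpy as np

def phase_map(gray):
    """Normalise 0-255 grayscale CT pixels to the 0-9 scale and label phases:
    0 = void, 1-3 = ITZ boundary, 4-6 = mortar, 7-9 = aggregate."""
    scale = np.rint(gray.astype(float) / 255.0 * 9).astype(int)
    labels = np.full(scale.shape, "mortar", dtype=object)   # default: 4-6
    labels[scale == 0] = "void"
    labels[(scale >= 1) & (scale <= 3)] = "itz"
    labels[scale >= 7] = "aggregate"
    return scale, labels

# toy 2x3 "slice": black void, dark ITZ, mid-grey mortar, bright aggregate
slice_px = np.array([[0, 40, 120],
                     [200, 255, 90]], dtype=np.uint8)
scale, labels = phase_map(slice_px)
```

In a real pipeline the `labels` array would drive element-type assignment when the triangular or quadrilateral mesh is generated for the plane stress or plane strain model.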

Keywords: concrete, image processing, plane strain, interfacial transition zone

Procedia PDF Downloads 229
3313 A Hybrid Watermarking Scheme Using Discrete and Discrete Stationary Wavelet Transformation For Color Images

Authors: Bülent Kantar, Numan Ünaldı

Abstract:

This paper presents a new method for robust and invisible digital watermarking of color images, with color images used as the watermark. The watermark is embedded in the frequency domain, using the discrete wavelet transform (DWT) and the discrete stationary wavelet transform (DSWT) for the frequency-domain transformation. Low, medium, and high frequency coefficients are obtained by applying a two-level DWT to the original image. A one-level DSWT is then applied separately to each frequency subband of the two-level DWT of the original image, and a watermark is added to each of the resulting low frequency coefficients. In this way, watermarks are added for all frequency subbands of the two-level DWT, so four watermarks in total are embedded in the original image. To recover the watermark, both the original and watermarked images undergo the two-level DWT and the one-level DSWT, and each watermark is obtained from the difference of the DSWT low frequency coefficients. Four watermarks are thus recovered, one per subband of the two-level DWT. The recovered watermarks are compared with the original watermark, and a similarity score is computed; the final watermark is taken from the highest similarity values. The proposed watermarking method is tested against geometric and image-processing attacks, and the results show that it is robust and invisible. All frequency subbands of the two-level DWT are combined to recover the watermark from the watermarked image. Watermarks are added to the image after conversion to a binary image.
These operations give better results in recovering the watermark from the watermarked image under geometric and image-processing attacks.
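The recover-by-difference and pick-by-similarity logic of this scheme can be sketched without any wavelet library: embed the same watermark additively in several coefficient subbands, extract each candidate as a scaled difference against the original coefficients, and keep the candidate with the highest normalized correlation. The random "subbands" stand in for the real DWT/DSWT coefficients, and alpha is a hypothetical embedding strength.

```python
import numpy as np

def embed(coeffs, wm, alpha=0.1):
    """Additively embed a watermark into one coefficient subband."""
    return coeffs + alpha * wm

def extract(marked, original, alpha=0.1):
    """Recover the watermark as the scaled difference of subbands."""
    return (marked - original) / alpha

def similarity(a, b):
    """Normalised correlation between two watermarks (1.0 = identical)."""
    a, b = a.ravel() - a.mean(), b.ravel() - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(1)
subbands = [rng.normal(size=(16, 16)) for _ in range(4)]  # stand-ins for LL/LH/HL/HH
wm = rng.normal(size=(16, 16))
marked = [embed(c, wm) for c in subbands]

# one candidate watermark per subband; keep the most similar to the reference
scores = [similarity(extract(m, c), wm) for m, c in zip(marked, subbands)]
best = int(np.argmax(scores))
```

Under an attack, the four candidates would degrade differently, which is exactly why the scheme keeps all four and selects on similarity.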

Keywords: watermarking, DWT, DSWT, copyright protection, RGB

Procedia PDF Downloads 516
3312 Nano-Enhanced In-Situ and Field Up-Gradation of Heavy Oil

Authors: Devesh Motwani, Ranjana S. Baruah

Abstract:

The prime incentive behind upgrading heavy oil is to increase its API gravity for ease of transportation to refineries, thus expanding the market access of bitumen-based crude. There has always been demand for an integrated approach that simplifies the upgrading scheme, making it adaptable to the production site in terms of economics, environment, and personnel safety. Recent advances in nanotechnology have facilitated the development of two lines of heavy oil upgrading processes that use nano-catalysts to produce upgraded oil: in-situ upgrading and field upgrading. The in-situ upgrading scheme uses the hot fluid injection (HFI) technique, in which heavy fractions separated from the produced oil are injected into the formations to reintroduce heat into the reservoir along with suspended nano-catalysts and hydrogen. In the presence of hydrogen, catalytic exothermic hydro-processing reactions occur, producing light gases and volatile hydrocarbons that contribute to increased oil detachment from the rock and thus enhanced recovery. The process therefore combines enhanced heavy oil recovery with upgrading: it effectively handles the heat load within the reservoirs, reduces hydrocarbon waste generation, and minimizes the need for diluents. By eliminating most of the residual oil, the synthetic crude oil (SCO) is much easier to transport and more amenable to processing in refineries. For heavy oil reservoirs seriously affected by the presence of aquifers, the nano-catalytic technology can still be implemented in the field, though with some additional investment and reduced synergies; it still significantly serves the purpose of producing transportable oil, with substantial benefits over both large-scale upgrading and the commercial field upgrading technologies currently on the market. The paper delves deeper into the technology discussed and its future compatibility.

Keywords: upgrading, synthetic crude oil, nano-catalytic technology, compatibility

Procedia PDF Downloads 393
3311 CDM-Based Controller Design for High-Frequency Induction Heating System with LLC Tank

Authors: M. Helaimi, R. Taleb, D. Benyoucef, B. Belmadani

Abstract:

This paper presents the design of a polynomial controller based on the coefficient diagram method (CDM). The controller is used to regulate the output power of a high-frequency resonant inverter with an LLC tank. One of the most important problems with the proposed inverter is achieving ZVS operation during the induction heating process. To overcome this problem, an asymmetrical voltage cancellation (AVC) control technique is proposed. A phase-locked loop (PLL) is used to track the natural frequency of the system. The small-signal model of the system with the proposed control is obtained using the extended describing function method (EDM). The validity of the proposed control is verified by simulation results.

Keywords: induction heating, AVC control, CDM, PLL, resonant inverter

Procedia PDF Downloads 650
3310 Valorisation of Mango Seed: Response Surface Methodology Based Optimization of Starch Extraction from Mango Seeds

Authors: Tamrat Tesfaye, Bruce Sithole

Abstract:

Box-Behnken response surface methodology was used to determine the optimum processing conditions that give maximum extraction yield and whiteness index from mango seed. The steeping time ranged from 2 to 12 hours, and slurrying of the steeped seed was carried out in sodium metabisulphite solution (0.1 to 0.5 w/v). Experiments were designed according to a Box-Behnken design with these three factors, and a total of 15 experimental runs were analyzed. At the linear level, the concentration of sodium metabisulphite had a significant positive influence on percentage yield and whiteness index at p<0.05. At the quadratic level, the sodium metabisulphite concentration and its squared term had a significant negative influence on starch yield, while the sodium metabisulphite concentration and the steeping time × temperature interaction had a significant (p<0.05) positive influence on whiteness index. Adjusted R² values above 0.8 for starch yield (0.906) and whiteness index (0.909) showed a good fit of the model to the experimental data. The optimum sodium metabisulphite concentration, steeping time, and temperature for starch isolation with maximum starch yield (66.428%) and whiteness index (85%), set as optimization goals with a desirability of 0.919, were 0.255 w/v, 2 h, and 50 °C, respectively. The experimentally determined value of each response under the optimal conditions was statistically in accordance with the predicted levels at p<0.05. Mango seeds are a by-product of mango processing and pose a disposal problem if not handled properly. Substituting food-based sizing agents with mango seed starch can contribute to value-added product manufacturing and waste utilization, which might play a significant role in food security in Ethiopia.
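The core of the response-surface analysis (fit a quadratic model with linear, squared, and interaction terms, then locate its stationary point) can be sketched with an ordinary least-squares fit. For brevity this uses only two of the three factors, synthetic data, and made-up coefficients; it is not the study's design matrix or fitted model.

```python
import numpy as np

def design_matrix(x1, x2):
    """Quadratic RSM model terms: 1, x1, x2, x1^2, x2^2, x1*x2 (coded units)."""
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

rng = np.random.default_rng(7)
x1 = rng.uniform(-1, 1, 15)                  # coded metabisulphite concentration
x2 = rng.uniform(-1, 1, 15)                  # coded steeping time
true_beta = np.array([60.0, 3.0, 1.5, -4.0, -2.0, 0.5])   # hypothetical surface
y = design_matrix(x1, x2) @ true_beta + rng.normal(0, 0.2, 15)  # "starch yield"

# least-squares fit of the quadratic model (15 runs, 6 coefficients)
beta, *_ = np.linalg.lstsq(design_matrix(x1, x2), y, rcond=None)

# stationary point of the fitted surface: set the gradient to zero
A = np.array([[2 * beta[3], beta[5]], [beta[5], 2 * beta[4]]])
optimum = np.linalg.solve(A, -beta[1:3])
```

In the study, the equivalent step is performed with desirability-based optimization over all three factors rather than a plain stationary-point solve.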

Keywords: mango, synthetic sizing agent, starch, extraction, textile, sizing

Procedia PDF Downloads 216
3309 Boron Nitride Nanoparticle Enhanced Prepreg Composite Laminates

Authors: Qiong Tian, Lifeng Zhang, Demei Yu, Ajit D. Kelkar

Abstract:

Low specific weight and high strength are basic requirements for aerospace materials, and fiber-reinforced epoxy resin composites are attractive materials for this purpose. Boron nitride nanoparticles (BNNPs) have good radiation shielding capacity, which is very important for aerospace materials. Herein, a processing route for an advanced hybrid composite material is demonstrated by introducing dispersed BNNPs into standard prepreg manufacturing. The hybrid material contains three parts: E-glass fiber, an aerospace-grade epoxy resin system, and BNNPs. Vacuum-assisted resin transfer molding (VARTM) was utilized in this processing. Two BNNP functionalization approaches are presented in this study: (a) covalent functionalization with 3-aminopropyltriethoxysilane (KH-550); (b) non-covalent functionalization with cetyltrimethylammonium bromide (CTAB). The functionalized BNNPs were characterized by Fourier-transform infrared spectroscopy (FT-IR), X-ray diffraction (XRD), and scanning electron microscopy (SEM). The results showed that BN powder was successfully functionalized via both the covalent and non-covalent approaches without any change in crystal structure, and large agglomerates were broken into platelet-like nanoparticles (BNNPs) after functionalization. Compared to pristine BN powder, the surface-modified BNNPs produced significant improvements in mechanical properties such as tensile, flexural, and compressive strength and modulus. CTAB-functionalized BNNPs (CTAB-BNNPs) showed higher tensile and flexural strength but lower compressive strength than KH-550-functionalized BNNPs (KH550-BNNPs). These reinforcements are mainly attributed to good BNNP dispersion and interfacial adhesion between the epoxy matrix and the BNNPs. This study reveals the potential for improving the mechanical properties of BNNP-containing composite laminates through surface functionalization of the BNNPs.

Keywords: boron nitride, epoxy, functionalization, prepreg, composite

Procedia PDF Downloads 420
3308 Characterization of Onboard Reliable Error Correction Code for SDRAM Controller

Authors: Pitcheswara Rao Nelapati

Abstract:

In the process of conveying information, there is a chance of the signal being corrupted, which leads to erroneous bits in the message. The message may contain single, double, or multiple bit errors. In high-reliability applications, memory can sustain multiple soft errors due to single or multiple event upsets caused by environmental factors. A traditional Hamming code with SEC-DED capability cannot address these types of errors. It is possible to use a powerful non-binary BCH code, such as a Reed-Solomon code, to address multiple errors; however, it can take at least a couple dozen cycles of latency to complete the first correction, and it runs at a relatively slow speed. To overcome this drawback, i.e., to increase speed and reduce latency, a Reed-Muller code is used.
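To make the SEC baseline concrete, here is a Hamming(7,4) encode/correct sketch: syndrome decoding matches the syndrome to a column of the parity-check matrix and flips that bit. This illustrates exactly the single-error-correcting capability the abstract says is insufficient for multi-bit upsets; it is textbook material, not the paper's Reed-Muller design.

```python
import numpy as np

# Hamming(7,4): systematic generator G and parity-check H (G @ H.T = 0 mod 2)
G = np.array([[1, 0, 0, 0, 0, 1, 1],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 1, 1, 0],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[0, 1, 1, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [1, 1, 0, 1, 0, 0, 1]])

def encode(data4):
    """Encode 4 data bits into a 7-bit codeword (data bits come first)."""
    return (np.array(data4) @ G) % 2

def correct(received):
    """Correct at most one flipped bit via syndrome decoding."""
    s = (H @ received) % 2
    if s.any():
        # the syndrome equals the H-column of the erroneous position
        err = next(i for i in range(7) if np.array_equal(H[:, i], s))
        received = received.copy()
        received[err] ^= 1
    return received

cw = encode([1, 0, 1, 1])
bad = cw.copy()
bad[2] ^= 1              # inject a single-bit soft error
fixed = correct(bad)
```

With two flipped bits the syndrome points at the wrong column and miscorrects, which is precisely why multi-bit upsets push designs toward Reed-Solomon or, as here, Reed-Muller codes.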

Keywords: SEC-DED, BCH code, Reed-Solomon code, Reed-Muller code

Procedia PDF Downloads 413
3307 Object Detection in Digital Images under Non-Standardized Conditions Using Illumination and Shadow Filtering

Authors: Waqqas-ur-Rehman Butt, Martin Servin, Marion Pause

Abstract:

In recent years, object detection has gained much attention and become an encouraging research area in the field of computer vision. Robust detection of object boundaries in an image is demanded in numerous applications of human-computer interaction and automated surveillance systems. Many methods and approaches have been developed for automatic object detection in various fields, such as automotive, quality control management, and environmental services. Unfortunately, to the best of our knowledge, object detection under illumination with shadow consideration has not yet been well solved; this problem is also one of the major hurdles keeping object detection methods from practical application. This paper presents an approach to automatic object detection in images under non-standardized environmental conditions. A key challenge is how to detect the object, particularly under uneven illumination. Because image capture conditions vary, the algorithms need to account for a variety of possible environmental factors, as the colour information, lighting, and shadows vary from image to image. Existing methods mostly fail to produce appropriate results due to variation in colour information, lighting effects, threshold specifications, histogram dependencies, and colour ranges. To overcome these limitations, we propose an object detection algorithm with pre-processing methods to reduce the interference caused by shadow and illumination effects without fixed parameters. We use the YCrCb colour model without any specific colour ranges or predefined threshold values. The segmented object regions are further classified using morphological operations (erosion and dilation) and contours. The proposed approach was applied to a large image data set acquired under various environmental conditions for wood stack detection. Experiments show promising results for the proposed approach in comparison with existing methods.
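The two building blocks named in this abstract, the YCrCb colour model and the erosion/dilation clean-up, can be sketched in plain NumPy. The BT.601 conversion coefficients are standard; the fixed Cr threshold and the toy image below are illustrative assumptions only, since the paper's point is precisely to avoid fixed thresholds.

```python
import numpy as np

def rgb_to_ycrcb(img):
    """ITU-R BT.601 RGB -> YCrCb conversion (float arrays, 0-255 range)."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cr = (r - y) * 0.713 + 128.0
    cb = (b - y) * 0.564 + 128.0
    return np.stack([y, cr, cb], axis=-1)

def binary_open(mask):
    """Erosion then dilation with a 3x3 structuring element (removes specks).
    Uses np.roll, so borders wrap; fine for objects away from the edge."""
    def shiftset(m, op):
        out = m.copy()
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                out = op(out, np.roll(np.roll(m, dy, 0), dx, 1))
        return out
    return shiftset(shiftset(mask, np.logical_and), np.logical_or)

img = np.zeros((8, 8, 3))
img[2:6, 2:6] = [200, 60, 60]        # a reddish 4x4 "object" block
ycc = rgb_to_ycrcb(img)
mask = ycc[..., 1] > 140             # illustrative fixed Cr threshold
clean = binary_open(mask)            # opening removes isolated noise pixels
```

Separating luminance (Y) from chrominance (Cr, Cb) is what lets shadow filtering work: a shadow darkens Y strongly while shifting Cr/Cb comparatively little.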

Keywords: image processing, illumination equalization, shadow filtering, object detection

Procedia PDF Downloads 203
3306 Seafloor and Sea Surface Modelling in the East Coast Region of North America

Authors: Magdalena Idzikowska, Katarzyna Pająk, Kamil Kowalczyk

Abstract:

Seafloor topography is a fundamental issue in geological, geophysical, and oceanographic studies. Single-beam or multibeam sonars attached to the hulls of ships emit a hydroacoustic signal from transducers and reproduce the topography of the seabed. This solution provides good accuracy and spatial resolution. Bathymetric data from ship surveys are provided by the National Centers for Environmental Information of the National Oceanic and Atmospheric Administration. Unfortunately, most of the seabed is still unidentified, as there are many gaps to be explored between ship survey tracks; moreover, such measurements are very expensive and time-consuming. One solution is the raster bathymetric models shared by the General Bathymetric Chart of the Oceans. The offered products are compilations of different data sets, raw or processed. Measurements of gravity anomalies also serve as indirect data for developing bathymetric models: some forms of seafloor relief (e.g., seamounts) increase the force of the Earth's pull, leading to changes in the sea surface. Based on satellite altimetry data, sea surface height and marine gravity anomalies can be estimated, and from the anomalies it is possible to infer the structure of the seabed. The main goal of this work is to create regional bathymetric models and models of the sea surface in the area of the east coast of North America, a region of seamounts and undulating seafloor. The research includes an analysis of the methods and techniques used, an evaluation of the interpolation algorithms, model densification, and the creation of grid models. The input data are raster bathymetric models in NetCDF format, survey data from multibeam soundings in MB-System format, and satellite altimetry data from the Copernicus Marine Environment Monitoring Service. The methodology includes data extraction, processing, mapping, and spatial analysis.
Visualization of the results was carried out with Geographic Information System tools. The result extends the state of knowledge of the quality and usefulness of the data used for seabed and sea surface modeling, and of the accuracy of the generated models. Sea level is averaged over time and space (excluding waves, tides, etc.); its changes, together with knowledge of the topography of the ocean floor, indirectly inform us about the volume of the entire ocean. The true shape of the ocean surface is further varied by phenomena such as tides, differences in atmospheric pressure, wind systems, thermal expansion of water, and phases of ocean circulation. Depending on the location of the point, the greater the depth, the lower the trend of sea level change. The study shows that combining data sets from different sources, with different accuracies, can affect the quality of sea surface and seafloor topography models.

Keywords: seafloor, sea surface height, bathymetry, satellite altimetry

Procedia PDF Downloads 64
3305 Computation of Induction Currents in a Set of Dendrites

Authors: R. B. Mishra, Sudhakar Tripathi

Abstract:

In this paper, a cable model of dendrites is considered. The dendrites are cylindrical cables of various segments with variable length and radius decreasing from the starting point at the synapse to the end points. For a particular event signal received by a neuron, only some dendrites are active at a given instant. Initial current signals with different current flows in the dendrites are assumed. Due to overlapping and coupling of active dendrites, they induce currents in each other's segments at a given instant. However, how these currents are induced in the various segments of active dendrites due to coupling between them has not been presented in the literature. This paper presents a model for the currents induced in active dendrite segments due to mutual coupling at the starting instant of activity in a dendrite. The model is discussed below.

Keywords: currents, dendrites, induction, simulation

Procedia PDF Downloads 379
3304 The Quantum Theory of Music and Human Languages

Authors: Mballa Abanda Luc Aurelien Serge, Henda Gnakate Biba, Kuate Guemo Romaric, Akono Rufine Nicole, Zabotom Yaya Fadel Biba, Petfiang Sidonie, Bella Suzane Jenifer

Abstract:

The main hypotheses proposed around the definition of the syllable and of music, and the common origin of music and language, should lead the reader to reflect on the cross-cutting questions raised by the debate on the notion of universals in linguistics and musicology. These are objects of controversy, and therein lies the interest: the debate raises questions that are at the heart of theories of language. This is an inventive, original, and innovative research thesis: a contribution to the theoretical, musicological, ethnomusicological, and linguistic conceptualization of languages, giving rise to interlocution between the social and cognitive sciences, the activities of artistic creation, and the question of modeling in the human sciences: mathematics, computer science, machine translation, and artificial intelligence. When this theory is applied to any text of a folk song in a tone language, one does not only reconstruct the exact melody, rhythm, and harmonies of that song as if it were known in advance, but also the exact speech of the language. The author believes that the issue of the disappearance of tonal languages and their preservation has been structurally resolved, as well as one of the greatest cultural equations related to the composition and creation of tonal, polytonal, and random music. To confirm the theory experimentally, the author designed a semi-digital, semi-analog application that translates the tonal languages of Africa (about 2,100 languages) into blues, jazz, world music, polyphonic music, tonal and atonal music, and deterministic and random music. To test this application, music reading and writing software is used to collect data extracted from the author's mother tongue, already modeled in the musical staves saved in the ethnographic (semiotic) dictionary for automatic translation (volume 2 of the book). The translation is done from writing to writing, from writing to speech, and from writing to music. Mode of operation: the user types a structured song text (chorus-verse) on a computer and requests a melody in blues, jazz, world music, variety, etc. The software runs, offering a choice of harmonies, and the user then selects a melody.

Keywords: language, music, sciences, quantum entanglement

Procedia PDF Downloads 61
3303 A Simple and Empirical Refraction Correction Method for UAV-Based Shallow-Water Photogrammetry

Authors: I GD Yudha Partama, A. Kanno, Y. Akamatsu, R. Inui, M. Goto, M. Sekine

Abstract:

Aerial photogrammetry of shallow-water bottoms has the potential to be an efficient high-resolution survey technique for shallow-water topography, thanks to the advent of convenient UAVs and automatic image processing techniques (Structure-from-Motion (SfM) and Multi-View Stereo (MVS)). However, it suffers from systematic overestimation of the bottom elevation due to light refraction at the air-water interface. In this study, we present an empirical method to correct for the effect of refraction after the usual SfM-MVS processing, using common software. The presented method uses the empirical relation between the measured true depth and the estimated apparent depth to generate an empirical correction factor, which is then used to convert the apparent water depth into a refraction-corrected (real-scale) water depth. To examine its effectiveness, we applied the method to two river sites and compared the RMS errors in the corrected bottom elevations with those obtained by three existing methods. The results show that the presented method is more effective than two of the existing methods: the method that applies no correction factor, and the method that uses the refractive index of water (1.34) as the correction factor. In comparison with the remaining existing method, which adds an offset term after calculating the correction factor, the presented method performs better at Site 2 and worse at Site 1. However, we found this linear regression method to be unstable when the training data used for calibration are limited; it also suffers from a large negative bias in the correction factor when the estimated apparent water depth is affected by noise, according to our numerical experiment. Overall, the accuracy of a refraction correction method depends on various factors, such as the location and the image acquisition and GPS measurement conditions. The most effective method can be selected by statistical model selection (e.g., leave-one-out cross-validation).
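The correction-factor idea, and the leave-one-out selection mentioned at the end, can be sketched as follows. The calibration pairs below are fabricated to sit near the 1.34 refractive-index ratio; they are not the study's measurements, and the through-the-origin regression is only one plausible reading of "empirical correction factor".

```python
import numpy as np

# hypothetical calibration data: apparent SfM-MVS depths vs. measured true depths (m)
apparent = np.array([0.20, 0.35, 0.50, 0.72, 0.90, 1.10])
true = np.array([0.27, 0.47, 0.66, 0.97, 1.21, 1.48])

# empirical correction factor through the origin: true ~= k * apparent
k = float(apparent @ true / (apparent @ apparent))

def correct_depth(d_apparent):
    """Convert apparent depth to refraction-corrected (real-scale) depth."""
    return k * d_apparent

def loocv_rmse(x, y, factor=None):
    """Leave-one-out RMSE; factor=None refits k on each training fold,
    while a fixed factor (e.g. 1.34) is evaluated as-is."""
    errs = []
    for i in range(len(x)):
        xt, yt = np.delete(x, i), np.delete(y, i)
        f = factor if factor is not None else float(xt @ yt / (xt @ xt))
        errs.append((f * x[i] - y[i]) ** 2)
    return float(np.sqrt(np.mean(errs)))

rmse_empirical = loocv_rmse(apparent, true)
rmse_fixed = loocv_rmse(apparent, true, factor=1.34)
```

Comparing `rmse_empirical` against `rmse_fixed` is the statistical-selection step: whichever method generalizes better on held-out calibration points wins for that site.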

Keywords: bottom elevation, MVS, river, SfM

Procedia PDF Downloads 292
3302 A Crystallization Kinetic Model for Long Fiber-Based Composite with Thermoplastic Semicrystalline Polymer Matrix

Authors: Nicolas Bigot, M'hamed Boutaous, Nahiene Hamila, Shihe Xin

Abstract:

Composite materials with polymer matrices are widely used in most industrial areas, particularly aeronautics and automotive. Thanks to the development of high-performance thermoplastic semicrystalline polymer matrices, these materials exhibit increasingly good properties. The polymer matrix in composite materials can develop a specific crystalline structure characteristic of crystallization in a fibrous medium. In order to guarantee good mechanical behavior of structures and to optimize their performance, it is necessary to define realistic mechanical constitutive laws for such materials that take their physical structure into account. The interaction between fibers and matrix is a key factor in the mechanical behavior of composite materials. The transcrystallization phenomenon that develops in the matrix around the fibers constitutes the interphase, which greatly affects and governs the nature of the fiber-matrix interaction. It is therefore fundamental to quantify its impact on the thermo-mechanical behavior of composite materials in relation to the processing conditions. In this work, we propose a numerical model coupling the thermal and crystallization kinetics in long fiber-based composite materials, considering both the spherulitic and transcrystalline types of induced structures. After validating the model against results from the literature, with good correlation, a parametric study was conducted on the effects of the thermal kinetics, the fiber volume fraction, the deformation, and the pressure on the crystallization rate in the material under processing conditions. The transcrystallinity ratio is highlighted and analyzed with regard to the thermal kinetics and gradients in the material.
Experimental results on the process are foreseen and pave the way to establish a mechanical constitutive law describing, with the introduction of the role on the crystallization rates and types on the thermo-mechanical behavior of composites materials.
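The abstract does not give the kinetic equations used, but a common starting point for such crystallization-kinetics models is the Avrami law for relative crystallinity; a minimal sketch, with hypothetical rate constant and exponent, is:

```python
import numpy as np

def avrami_crystallinity(t, k, n):
    """Relative crystallinity X(t) = 1 - exp(-k * t**n) (isothermal Avrami law)."""
    return 1.0 - np.exp(-k * np.power(t, n))

# Hypothetical values: rate constant k (in s^-n) and Avrami exponent n.
t = np.linspace(0.0, 60.0, 601)              # time in seconds
X = avrami_crystallinity(t, k=1e-3, n=2.5)   # monotonically approaches 1
```

A model of the kind described in the abstract would couple such kinetics to the heat equation and add a separate contribution for transcrystalline growth at the fiber surfaces.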

Keywords: composite materials, crystallization, heat transfer, modeling, transcrystallization

Procedia PDF Downloads 181
3301 The Impact of Anxiety on the Access to Phonological Representations in Beginning Readers and Writers

Authors: Regis Pochon, Nicolas Stefaniak, Veronique Baltazart, Pamela Gobin

Abstract:

Anxiety is known to have an impact on working memory. In reasoning or memory tasks, individuals with anxiety tend to show longer response times and poorer performance. Furthermore, there is a memory bias for negative information in anxiety. Given the crucial role of working memory in lexical learning, anxious students may encounter more difficulties in learning to read and spell. Anxiety could even affect an earlier learning, namely the activation of phonological representations, which is decisive for learning to read and write. The aim of this study is to compare the access to phonological representations of beginning readers and writers according to their level of anxiety, using an auditory lexical decision task. Eighty students aged 6 to 9 years completed the French version of the Revised Children's Manifest Anxiety Scale and were then divided into four anxiety groups according to their total score (Low, Median-Low, Median-High and High). Two sets of eighty-one stimuli (words and non-words) were presented auditorily to these students by means of a laptop computer. Stimulus words were selected according to their emotional valence (positive, negative, neutral). Students had to decide as quickly and accurately as possible whether the presented stimulus was a real word or not (lexical decision). Response times and accuracy were recorded automatically on each trial. We anticipated (a) longer response times for the Median-High and High anxiety groups compared with the two other groups, (b) faster response times for negative-valence words compared with positive- and neutral-valence words only for the Median-High and High anxiety groups, (c) lower response accuracy for the Median-High and High anxiety groups compared with the two other groups, and (d) better response accuracy for negative-valence words compared with positive- and neutral-valence words only for the Median-High and High anxiety groups. 
Concerning response times, our results showed no difference between the four groups. Furthermore, within each group, the average response times were very similar regardless of emotional valence. However, group differences appeared when considering the error rates. The Median-High and High anxiety groups made significantly more errors in lexical decision than the Median-Low and Low groups. Better response accuracy, however, was not found for negative-valence words compared with positive- and neutral-valence words in the Median-High and High anxiety groups. Thus, these results showed lower response accuracy for the above-median anxiety groups than for the below-median groups, but without specificity for negative-valence words. This study suggests that anxiety can negatively impact lexical processing in young students. Although lexical processing speed seems preserved, the accuracy of this processing may be altered in students with a moderate or high level of anxiety. This finding has important implications for the prevention of reading and spelling difficulties. Indeed, if anxiety affects the access to phonological representations during these learning processes, anxious students could be disturbed when they have to match phonological representations with new orthographic representations, because of less efficient lexical representations. This study should be continued in order to specify the impact of anxiety on basic school learning.

Keywords: anxiety, emotional valence, childhood, lexical access

Procedia PDF Downloads 276
3300 All-Optical Function Based on Self-Similar Spectral Broadening for 2R Regeneration in High-Bit-Rate Optical Transmission Systems

Authors: Leila Graini

Abstract:

In this paper, we demonstrate basic all-optical functions for 2R regeneration (re-amplification and re-shaping) based on self-similar spectral broadening in a low-normal-dispersion, highly nonlinear fiber (ND-HNLF), regenerating the signal through optical filtering; the transfer function characteristics and output extinction ratio are analyzed. Our approach to all-optical 2R regeneration follows that of Mamyshev. The numerical study reveals that self-similar spectral broadening is very effective for all-optical 2R regeneration; compared to a conventional regenerator using SPM broadening, the proposed design presents high stability, with reduced intensity fluctuations and an improved extinction ratio.
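The spectral broadening at the heart of a Mamyshev-type regenerator can be illustrated with a standard split-step Fourier integration of the nonlinear Schrödinger equation; the normalized pulse and fiber parameters below are hypothetical and not taken from the paper:

```python
import numpy as np

# Normalized split-step Fourier sketch: SPM plus normal dispersion broadens the
# spectrum of a sech pulse, the effect exploited before the regenerator's filter.
N = 1024
t = np.linspace(-20.0, 20.0, N, endpoint=False)   # normalized time grid
dt = t[1] - t[0]
w = 2 * np.pi * np.fft.fftfreq(N, d=dt)           # angular frequency grid

A = 1.0 / np.cosh(t)        # input pulse envelope
beta2 = 0.5                 # normal dispersion (normalized, hypothetical)
gamma = 10.0                # nonlinear coefficient (normalized, hypothetical)
dz, steps = 0.01, 100       # step size and number of propagation steps

def rms_width(field):
    S = np.abs(np.fft.fft(field)) ** 2
    return np.sqrt(np.sum(S * w**2) / np.sum(S))

w_in = rms_width(A)
disp = np.exp(0.5j * beta2 * w**2 * dz)           # half-step linear operator
for _ in range(steps):
    A = np.fft.ifft(disp * np.fft.fft(A))         # half dispersion step
    A *= np.exp(1j * gamma * np.abs(A)**2 * dz)   # full nonlinear (SPM) step
    A = np.fft.ifft(disp * np.fft.fft(A))         # half dispersion step
w_out = rms_width(A)        # RMS spectral width after propagation
```

In a full regenerator model, a band-pass filter offset from the carrier would then carve the regenerated pulse out of the broadened spectrum.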

Keywords: all-optical function, 2R optical regeneration, self-similar broadening, Mamyshev regenerator

Procedia PDF Downloads 173
3299 Development of a Software System for Management and Genetic Analysis of Biological Samples for Forensic Laboratories

Authors: Mariana Lima, Rodrigo Silva, Victor Stange, Teodiano Bastos

Abstract:

Due to the high reliability reached by DNA tests, since the 1980s this kind of test has allowed the identification of a growing number of criminal cases, including old unsolved cases that now have a chance to be solved with this technology. Currently, the use of genetic profiling databases is a typical method to increase the scope of genetic comparison. Forensic laboratories must process, analyze, and generate genetic profiles of a growing number of samples, which requires time and great storage capacity. Therefore, it is essential to develop methodologies capable of organizing, and minimizing the time spent on, both biological sample processing and the analysis of genetic profiles, using software tools. Thus, the present work aims at the development of a software system for forensic genetics laboratories that allows sample, criminal case and local database management, minimizes the time spent in the workflow, and helps to compare genetic profiles. For the development of this software system, all data related to the storage and processing of samples, the workflows, and the requirements of the system have been considered. The system uses the following Web technologies: HTML, CSS, and JavaScript, with the NodeJS platform as server, which offers great efficiency in data input and output. In addition, the data are stored in a relational database (MySQL), which is free, favoring user acceptance. The software system developed here brings more agility to the workflow and analysis of samples, contributing to the rapid insertion of genetic profiles into the national database and to the resolution of more crimes. The next step of this research is its validation, so that it operates in accordance with current Brazilian national legislation.
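The paper's system is built on Node.js and MySQL; the relational design it describes (cases holding samples, samples holding genetic profiles) can be sketched in SQLite for illustration. Table and column names here are hypothetical, not the authors' schema:

```python
import sqlite3

# Minimal relational sketch: criminal_case -> sample -> genetic_profile.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE criminal_case (
    id INTEGER PRIMARY KEY,
    case_number TEXT UNIQUE NOT NULL,
    opened_on TEXT
);
CREATE TABLE sample (
    id INTEGER PRIMARY KEY,
    case_id INTEGER NOT NULL REFERENCES criminal_case(id),
    sample_type TEXT,
    collected_on TEXT
);
CREATE TABLE genetic_profile (
    id INTEGER PRIMARY KEY,
    sample_id INTEGER NOT NULL REFERENCES sample(id),
    locus TEXT NOT NULL,
    alleles TEXT NOT NULL   -- e.g. '15,17' for an STR locus
);
""")
con.execute("INSERT INTO criminal_case (case_number, opened_on) "
            "VALUES ('C-2024-001', '2024-01-10')")
con.execute("INSERT INTO sample (case_id, sample_type, collected_on) "
            "VALUES (1, 'blood', '2024-01-11')")
con.execute("INSERT INTO genetic_profile (sample_id, locus, alleles) "
            "VALUES (1, 'D3S1358', '15,17')")

# Case-centric join: every typed locus traced back to its sample and case.
rows = con.execute("""
    SELECT c.case_number, s.sample_type, p.locus, p.alleles
    FROM genetic_profile p
    JOIN sample s ON s.id = p.sample_id
    JOIN criminal_case c ON c.id = s.case_id
""").fetchall()
```

Such joins are what allow the workflow views and profile comparisons the abstract describes, whatever the concrete schema.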

Keywords: database, forensic genetics, genetic analysis, sample management, software solution

Procedia PDF Downloads 356
3298 Cognitive Deficits and Association with Autism Spectrum Disorder and Attention Deficit Hyperactivity Disorder in 22q11.2 Deletion Syndrome

Authors: Sinead Morrison, Ann Swillen, Therese Van Amelsvoort, Samuel Chawner, Elfi Vergaelen, Michael Owen, Marianne Van Den Bree

Abstract:

22q11.2 Deletion Syndrome (22q11.2DS) is caused by the deletion of approximately 60 genes on chromosome 22 and is associated with high rates of neurodevelopmental disorders such as Attention Deficit Hyperactivity Disorder (ADHD) and Autism Spectrum Disorder (ASD). The presentation of these disorders in 22q11.2DS is reported to be comparable to idiopathic forms and therefore presents a valuable model for understanding mechanisms of neurodevelopmental disorders. Cognitive deficits are thought to be a core feature of neurodevelopmental disorders, possibly manifesting in behavioural and emotional problems. Findings in 22q11.2DS have been mixed on whether the presence of ADHD or ASD is associated with greater cognitive deficits. Furthermore, the influence of developmental stage has never been taken into account. The aim was therefore to examine whether the presence of ADHD or ASD was associated with cognitive deficits in childhood and/or adolescence in 22q11.2DS. We conducted the largest study of this kind in 22q11.2DS to date. The same battery of tasks measuring processing speed, attention and spatial working memory was completed by 135 participants with 22q11.2DS. Wechsler IQ tests were completed, yielding Full Scale (FSIQ), Verbal (VIQ) and Performance IQ (PIQ). Age-standardised difference scores were produced for each participant. Developmental stages were defined as childhood (6-10 years) and adolescence (10-18 years). ADHD diagnosis was ascertained from a semi-structured interview with a parent. ASD status was ascertained from a questionnaire completed by a parent. Interactions and main effects of cognitive performance for those with or without a diagnosis of ADHD or ASD in childhood or adolescence were tested with 2x2 ANOVAs. Significant interactions were followed up with t-tests of simple effects. 
Adolescents with ASD displayed greater deficits on all measures (processing speed, p = 0.022; sustained attention, p = 0.016; working memory, p = 0.006) than adolescents without ASD; there was no difference between children with and without ASD. There were no significant differences on IQ measures. Both children and adolescents with ADHD displayed greater deficits on sustained attention (p = 0.002) than those without ADHD. There were no significant differences on any other measures for ADHD. The magnitude of cognitive deficit in individuals with 22q11.2DS varied by cognitive domain, developmental stage and presence of a neurodevelopmental disorder. Adolescents with 22q11.2DS and ASD showed greater deficits on all measures, which may point to a sensitive period in childhood for acquiring these domains, or may reflect increasing social and academic demands in adolescence. The finding of poorer sustained attention in children and adolescents with ADHD supports previous research and suggests a specific deficit that can be separated from processing speed and working memory. This research provides unique insights into the association of ASD and ADHD with cognitive deficits in a group at high genomic risk of neurodevelopmental disorders.
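The follow-up analysis described above can be illustrated on synthetic data (not the study's): after a significant group-by-developmental-stage interaction, a simple-effects t-test compares age-standardised scores of adolescents with vs. without ASD on one measure. Group sizes and effect size here are hypothetical:

```python
import numpy as np
from scipy import stats

# Synthetic age-standardised working-memory scores (mean 0, SD 1 in the norm group).
rng = np.random.default_rng(0)
asd = rng.normal(loc=-1.0, scale=1.0, size=60)     # hypothetical deficit of 1 SD
no_asd = rng.normal(loc=0.0, scale=1.0, size=60)

t_stat, p_value = stats.ttest_ind(asd, no_asd)     # two-sample t-test of the simple effect
```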

Keywords: 22q11.2 deletion syndrome, attention deficit hyperactivity disorder, autism spectrum disorder, cognitive development

Procedia PDF Downloads 136
3297 Integrated Life Skill Training and Executive Function Strategies in Children with Autism Spectrum Disorder in Qatar: A Study Protocol for a Randomized Controlled Trial

Authors: Bara M Yousef, Naresh B Raj, Nadiah W Arfah, Brightlin N Dhas

Abstract:

Background: Executive function (EF) impairment is common in children with autism spectrum disorder (ASD). EF strategies are considered effective in improving the therapeutic outcomes of children with ASD. Aims: This study primarily aims to explore whether integrating EF strategies with regular occupational therapy intervention is more effective in improving daily life skills (DLS) and sensory integration/processing (SI/SP) skills than regular occupational therapy alone in children with ASD; it secondarily aims to assess treatment outcomes in improving visual motor integration (VMI) skills. Procedures: A total of 92 children with ASD will be recruited and, following baseline assessments, randomly assigned to the treatment group (45-min once-weekly individual occupational therapy plus EF strategies) or the control group (45-min once-weekly individual therapy sessions alone). Results and Outcomes: All children will be evaluated systematically by assessing SI/SP, DLS, and VMI skills at baseline, 7 weeks, and 14 weeks of treatment. Data will be analyzed using ANCOVA and t-tests. Conclusions and Implications: This single-blind, randomized controlled trial will provide empirical evidence for the effectiveness of EF strategies when combined with regular occupational therapy programs. Based on the trial results, EF strategies could be recommended in multidisciplinary programs for children with ASD. Trial Registration: The trial has been registered in the clinicaltrials.gov registry, protocol ID: MRC-01-22-509; ClinicalTrials.gov identifier: NCT05829577, registered 25th April 2023.

Keywords: autism spectrum disorder, executive function strategies, daily life skills, sensory integration/processing, visual motor integration, occupational therapy, effectiveness

Procedia PDF Downloads 95
3296 Entrepreneurial Orientation and Business Performance: The Case of Micro Scale Food Processors Operating in a War-Recovery Environment

Authors: V. Suganya, V. Balasuriya

Abstract:

The functioning of Micro and Small Scale (MSS) businesses in the northern part of Sri Lanka was vulnerable due to three decades of internal conflict, and the subsequent post-war economic opening has created new market prospects for MSS businesses. MSS businesses survive and operate with limited resources and struggle to access finance, raw materials, markets, and technology. This study attempts to identify the manner in which entrepreneurial orientation is put into practice by business operators to overcome these business challenges. Business operators in the traditional food processing sector were taken for this study, as this sub-sector of the food industry is developing at a rapid pace. A review of the literature was done to clarify the concept of entrepreneurial orientation, the definition of MSS businesses, and the manner in which business performance is measured. A direct interview method supported by a structured questionnaire was used to collect data from 80 respondents, selected using a fixed-interval random sampling technique. This study reveals that more than half of the business operators opted to commence their business ventures as a result of identifying a market opportunity. On a scale of 1 to 5, 41 per cent of the business operators are highly entrepreneurially oriented. Entrepreneurial orientation shows a significant relationship with, and is strongly correlated to, business performance. Pro-activeness, innovativeness and competitive aggressiveness show a significant relationship with business performance, while risk-taking is negatively related and autonomy is not significantly related to business performance. It is evident that entrepreneurially oriented business practices contribute to better business performance, even though 70 per cent prefer the ideas/views of the support agencies to those of the stakeholders when making business decisions. 
It is recommended that appropriate training be introduced to develop entrepreneurial skills, focusing on improving business networks so that new business opportunities and innovative business practices are identified.

Keywords: Micro and Small Scale (MSS) businesses, entrepreneurial orientation (EO), food processing, business operators

Procedia PDF Downloads 476
3295 A Study of Classification Models to Predict Drill-Bit Breakage Using Degradation Signals

Authors: Bharatendra Rai

Abstract:

Cutting tools are widely used in manufacturing processes, and drilling is the most commonly used machining process. Although the drill-bits used in drilling may not be expensive, their breakage can cause damage to the expensive workpiece being drilled and, at the same time, has a major impact on productivity. Predicting drill-bit breakage, therefore, is important in reducing cost and improving productivity. This study uses twenty features extracted from two degradation signals, viz., thrust force and torque. The methodology involves developing and comparing decision tree, random forest, and multinomial logistic regression models for classifying and predicting drill-bit breakage using the degradation signals.
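The model comparison described above can be sketched with scikit-learn on synthetic data. The two features below (mean thrust force, mean torque) and their values are hypothetical stand-ins for the paper's twenty degradation features, and with only two classes the multinomial logistic regression reduces to ordinary logistic regression:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

# Synthetic degradation features: healthy bits vs. bits that later broke.
rng = np.random.default_rng(1)
n = 400
healthy = rng.normal([10.0, 2.0], [1.0, 0.3], size=(n // 2, 2))
failing = rng.normal([14.0, 3.2], [1.5, 0.4], size=(n // 2, 2))
X = np.vstack([healthy, failing])
y = np.array([0] * (n // 2) + [1] * (n // 2))   # 1 = breakage

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
models = {
    "decision tree": DecisionTreeClassifier(max_depth=3, random_state=1),
    "random forest": RandomForestClassifier(n_estimators=100, random_state=1),
    "logistic regression": LogisticRegression(),
}
# Held-out accuracy per model, the kind of comparison the study performs.
accuracy = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
```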

Keywords: degradation signal, drill-bit breakage, random forest, multinomial logistic regression

Procedia PDF Downloads 340
3294 Feature Location Restoration for Under-Sampled Photoplethysmogram Using Spline Interpolation

Authors: Hangsik Shin

Abstract:

The purpose of this research is to restore the feature locations of an under-sampled photoplethysmogram using spline interpolation and to investigate the feasibility of feature shape restoration. We obtained a 10 kHz-sampled photoplethysmogram and decimated it to generate under-sampled datasets with sampling frequencies of 5 kHz, 2.5 kHz, 1 kHz, 500 Hz, 250 Hz, 25 Hz and 10 Hz. To investigate the restoration performance, we interpolated the under-sampled signals back to 10 kHz and then compared their feature locations with those of the 10 kHz-sampled photoplethysmogram. The features were the upper and lower peaks of the photoplethysmography waveform. Results showed that the time differences were dramatically decreased by interpolation: the location error was less than 1 ms for both feature types. In the 10 Hz-sampled cases, the location error was also decreased considerably; however, it remained above 10 ms.
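The restoration idea can be demonstrated on a synthetic pulse-like wave (a sine surrogate, not real PPG): decimate a densely sampled signal to 25 Hz, spline-interpolate back to the dense grid, and compare peak locations before and after. This is a sketch of the method, not the authors' pipeline:

```python
import numpy as np
from scipy.interpolate import CubicSpline

fs_high, fs_low = 10_000, 25              # Hz, two rates from the experiment
t_high = np.arange(0, 1.0, 1 / fs_high)
signal = np.sin(2 * np.pi * 1.2 * t_high)   # 1.2 Hz ~ 72 bpm surrogate

step = fs_high // fs_low                    # decimation factor
t_low = t_high[::step]
cs = CubicSpline(t_low, signal[::step])     # spline fit on under-sampled data
restored = cs(t_high)                       # evaluate back on the 10 kHz grid

peak_true = t_high[np.argmax(signal)]
peak_low = t_low[np.argmax(signal[::step])]        # best the raw 25 Hz data can do
peak_restored = t_high[np.argmax(restored)]        # after interpolation

err_low = abs(peak_low - peak_true)                # error from under-sampling alone
err_restored = abs(peak_restored - peak_true)      # error after restoration
```

As in the paper's results, interpolation shrinks the peak-location error well below the raw sampling-interval error.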

Keywords: peak detection, photoplethysmography, sampling, signal reconstruction

Procedia PDF Downloads 351
3293 VIAN-DH: Computational Multimodal Conversation Analysis Software and Infrastructure

Authors: Teodora Vukovic, Christoph Hottiger, Noah Bubenhofer

Abstract:

The development of VIAN-DH aims at bridging two linguistic approaches: conversation analysis/interactional linguistics (IL), so far a dominantly qualitative field, and computational/corpus linguistics with its quantitative and automated methods. Contemporary IL investigates the systematic organization of conversations and interactions composed of speech, gaze, gestures, and body positioning, among others. This highly integrated multimodal behaviour is analysed based on video data, aiming at uncovering so-called “multimodal gestalts”: patterns of linguistic and embodied conduct that recur in specific sequential positions and are employed for specific purposes. Multimodal analyses (and other disciplines using videos) have so far depended on time- and resource-intensive processes of manually transcribing each component from video materials. Automating these tasks requires advanced programming skills, which are often outside the scope of IL. Moreover, the use of different tools makes the integration and analysis of different formats challenging. Consequently, IL research often deals with relatively small samples of annotated data, which are suitable for qualitative analysis but not sufficient for making generalized empirical claims derived quantitatively. VIAN-DH aims to create a workspace where the many annotation layers required for the multimodal analysis of videos can be created, processed, and correlated in one platform. VIAN-DH will provide a graphical interface that operates state-of-the-art tools for automating parts of the data processing. The integration of tools that already exist in computational linguistics and computer vision facilitates data processing for researchers lacking programming skills, speeds up the overall research process, and enables the processing of large amounts of data. 
The main features to be introduced are automatic speech recognition for the transcription of language, automatic image recognition for the extraction of gestures and other visual cues, and grammatical annotation for adding morphological and syntactic information to the verbal content. In the ongoing instance of VIAN-DH, we focus on gesture extraction (pointing gestures, in particular), making use of existing models created for sign language and adapting them for this specific purpose. In order to view and search the data, VIAN-DH will provide a unified format, enable the import of the main existing formats of annotated video data and the export to other formats used in the field, and integrate different data source formats in a way that allows them to be combined in research. VIAN-DH will adapt querying methods from corpus linguistics to enable parallel search across many annotation levels, combining token-level and chronological search for various types of data. VIAN-DH strives to bring crucial and potentially revolutionary innovation to the field of IL (which can also extend to other fields using video materials). It will allow the automatic processing of large amounts of data and the implementation of quantitative analyses, combining them with the qualitative approach. It will facilitate the investigation of correlations between linguistic patterns (lexical or grammatical) and conversational aspects (turn-taking or gestures). Users will be able to automatically transcribe and annotate visual, spoken and grammatical information from videos, to correlate those different levels, and to perform queries and analyses.
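The kind of cross-layer query described above, combining token-level and chronological search, amounts to finding overlaps between time-aligned annotation layers. A minimal sketch with a hypothetical data model (not VIAN-DH's actual format) finds spoken tokens that co-occur with a pointing gesture:

```python
# Two annotation layers as (start_s, end_s, label) intervals.
tokens = [
    (0.0, 0.4, "look"), (0.4, 0.7, "at"), (0.7, 1.1, "that"), (1.1, 1.5, "tower"),
]
gestures = [
    (0.9, 1.6, "pointing"),
]

def overlaps(a_start, a_end, b_start, b_end):
    """True if the two time intervals share any span."""
    return a_start < b_end and b_start < a_end

# Tokens whose time span overlaps any gesture interval.
hits = [tok for (ts, te, tok) in tokens
        for (gs, ge, _) in gestures if overlaps(ts, te, gs, ge)]
```

Scaled up with an interval index, the same overlap test supports parallel search over many layers at once.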

Keywords: multimodal analysis, corpus linguistics, computational linguistics, image recognition, speech recognition

Procedia PDF Downloads 92
3292 Influence of Processing Parameters in Selective Laser Melting on the Microstructure and Mechanical Properties of Ti/TiN Composites with In-situ and Ex-situ Reinforcement

Authors: C. Sánchez de Rojas Candela, A. Riquelme, P. Rodrigo, M. D. Escalera-Rodríguez, B. Torres, J. Rams

Abstract:

Selective laser melting (SLM) is one of the most commonly used additive manufacturing (AM) techniques. In it, a thin layer of metallic powder is deposited, and a laser is used to melt selected zones. The accumulation of layers, each one molten in the preselected zones, gives rise to the formation of a 3D sample with a nearly arbitrary design. To ensure that the properties of the final parts match those of the powder, the whole process is carried out in an inert atmosphere, preferentially Ar, although this gas can be substituted. The Ti6Al4V alloy is widely used in multiple industrial applications, such as aerospace, maritime transport and biomedical devices, due to its properties. However, the demanding requirements of these applications call for greater hardness and wear resistance, together with better machinability, which currently limits its commercialization. To improve these properties, in this study, selective laser melting is used to manufacture Ti/TiN metal matrix composites with in-situ and ex-situ titanium nitride reinforcement, where the scanning speed is modified (from 28.5 up to 65 mm/s) to study the influence of the processing parameters in SLM. A one-step method of nitriding the Ti6Al4V alloy in a reactive atmosphere is carried out to create in-situ TiN reinforcement, and it is compared with ex-situ composites manufactured by prior mixing of the titanium alloy powder and the ceramic reinforcement particles. The microstructure and mechanical properties of the different Ti/TiN composite materials have been analyzed. As a result, the existence of a similar matrix has been confirmed in the in-situ and ex-situ fabrications, and the growth mechanisms of the nitrides have been studied. An increase in the mechanical properties with respect to the initial alloy has been observed in both cases and related to changes in their microstructure. 
Specifically, a greater improvement (around 30.65%) has been identified in those manufactured by the in-situ method at low speeds, although other properties, such as porosity, must be improved for future industrial applicability.

Keywords: in-situ reinforcement, nitriding reaction, selective laser melting, titanium nitride

Procedia PDF Downloads 69
3291 Examining the Influence of Firm Internal Level Factors on Performance Variations among Micro and Small Enterprises: Evidence from Tanzanian Agri-Food Processing Firms

Authors: Pulkeria Pascoe, Hawa P. Tundui, Marcia Dutra de Barcellos, Hans de Steur, Xavier Gellynck

Abstract:

A majority of Micro and Small Enterprises (MSEs) experience low or no growth. Understanding of their performance remains incomplete and fragmented, as there is no consensus on the factors influencing it, especially in developing countries. Using the Resource-Based View (RBV) as the theoretical background, this cross-sectional study employed four regression models to examine the influence of firm-level factors (firm-specific characteristics, firm resources, manager socio-demographic characteristics, and selected management practices) on the overall performance variations among 442 Tanzanian micro and small agri-food processing firms. The study results confirmed the RBV argument that intangible resources make a larger contribution to overall performance variations among firms than tangible resources. Firms' tangible and intangible resources together explained 34.5% of overall performance variations (intangible resources accounted for 19.4% of the overall performance variability compared to 15.1% for tangible resources), ranking first in explaining the overall performance variance. Firm-specific characteristics ranked second, influencing variations in overall performance by 29.0%. Selected management practices ranked third (6.3%), while the manager's socio-demographic factors were last on the list, influencing the overall performance variability among firms by only 5.1%. The study also found that firms that focus on the proper utilization of tangible resources (financial and physical), set targets, and undertake better working capital management practices performed better than their counterparts (low and average performers). 
Furthermore, the accumulation and proper utilization of intangible resources (relational, organizational, and reputational), the undertaking of performance monitoring practices, the age of the manager, and the choice of firm location and activity were the dominant significant factors influencing the variations between average and high performers, relative to low performers. Entrepreneurial background was a significant factor influencing variations between average and low-performing firms, indicating that entrepreneurial skills are crucial to achieving average levels of performance. Firm age, size, legal status, source of start-up capital, and the gender, education level, and total business experience of the manager were not statistically significant variables influencing the overall performance variations among the agri-food processors under study. The study has identified both significant and non-significant factors influencing performance variations among low-, average-, and high-performing micro and small agri-food processing firms in Tanzania. Therefore, the results from this study will help managers, policymakers and researchers to identify areas where more attention should be placed in order to improve the overall performance of MSEs in the agri-food industry.
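The variance-decomposition logic behind the four regression models can be sketched as hierarchical OLS on synthetic data: fit one block of predictors, add a second block, and attribute the R-squared increment to the new block. Block names and coefficients below are hypothetical, not the study's estimates:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 442                                    # same sample size as the study
firm_chars = rng.normal(size=(n, 2))       # block 1: firm-specific characteristics
resources = rng.normal(size=(n, 2))        # block 2: firm resources
performance = (0.5 * firm_chars[:, 0] + 0.8 * resources[:, 1]
               + rng.normal(scale=1.0, size=n))

def r_squared(X, y):
    """OLS R^2 with an intercept, via least squares."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

r2_block1 = r_squared(firm_chars, performance)
r2_both = r_squared(np.column_stack([firm_chars, resources]), performance)
delta_r2 = r2_both - r2_block1   # share of variance attributed to the second block
```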

Keywords: firm-level factors, micro and small enterprises, performance, regression analysis, resource-based-view

Procedia PDF Downloads 68
3290 CARE: A Cluster-Based Approach for Reliable and Efficient Routing Protocol in Wireless Sensor Networks

Authors: K. Prasanth, S. Hafeezullah Khan, B. Haribalakrishnan, D. Arun, S. Jayapriya, S. Dhivya, N. Vijayarangan

Abstract:

The main goal of our approach is to find the optimum positions for the sensor nodes, reinforcing the communications at points where a certain lack of connectivity is found. Routing is a major problem in data transfer between sensor network nodes. We provide an efficient routing technique so that data signals reach the base station quickly and without interruption. Clustering and routing are the two key factors to be considered in a WSN. To carry out the communication from the nodes to their cluster head, we propose a parameterizable protocol, so that the developer can indicate whether the routing has to be sensitive to the link quality of the nodes or to their battery levels.
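The parameterizable idea can be sketched as a weighted next-hop score (a hypothetical API, not the protocol's actual messages): a single weight lets the developer bias cluster-head routing toward link quality, battery level, or any mix of the two.

```python
def best_next_hop(neighbors, alpha=0.5):
    """Pick the next hop toward the cluster head.

    neighbors: list of (node_id, link_quality, battery), both metrics in [0, 1].
    alpha = 1.0 -> routing driven purely by link quality;
    alpha = 0.0 -> routing driven purely by remaining battery.
    """
    return max(neighbors,
               key=lambda n: alpha * n[1] + (1.0 - alpha) * n[2])[0]

neighbors = [("A", 0.9, 0.2), ("B", 0.5, 0.9), ("C", 0.6, 0.6)]
link_choice = best_next_hop(neighbors, alpha=1.0)     # best link, weak battery
battery_choice = best_next_hop(neighbors, alpha=0.0)  # best battery, weak link
```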

Keywords: clusters, routing, wireless sensor networks, three phases, sensor networks

Procedia PDF Downloads 489
3289 Processing Big Data: An Approach Using Feature Selection

Authors: Nikat Parveen, M. Ananthi

Abstract:

Big data is one of the emerging technologies; it collects data from various sensors, and those data are used in many fields. Data retrieval is one of the major issues, as there is a need to extract exactly the data required. In this paper, a large dataset is processed by using feature selection. Feature selection helps to choose the data that are actually needed to process and execute the task. The key value is the one that helps to point out the exact data available in the storage space. Here, the available data is streamed, and R-Center is proposed to achieve this task.
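The abstract does not detail its selection criterion, but a minimal sketch of the general idea, on synthetic data, is to keep only the top-k features most correlated with the target, so downstream processing touches a fraction of the columns:

```python
import numpy as np

rng = np.random.default_rng(7)
n, d, k = 1000, 20, 3
X = rng.normal(size=(n, d))                 # 20 candidate features
# Hypothetical target driven by features 4, 11 and 17 only.
y = 2.0 * X[:, 4] - 1.5 * X[:, 11] + 0.5 * X[:, 17] + rng.normal(size=n)

# Score each feature by absolute correlation with the target, keep the top k.
scores = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(d)])
selected = np.argsort(scores)[-k:]          # indices of the k strongest features
X_reduced = X[:, selected]                  # only these columns are processed further
```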

Keywords: big data, key value, feature selection, retrieval, performance

Procedia PDF Downloads 323
3288 Assessment of the Occupancy’s Effect on Speech Intelligibility in Al-Madinah Holy Mosque

Authors: Wasim Orfali, Hesham Tolba

Abstract:

This research investigates the acoustical characteristics of Al-Madinah Holy Mosque. Extensive field measurements were conducted at different locations of Al-Madinah Holy Mosque to characterize its acoustics. Acoustical characteristics are usually evaluated by the use of objective parameters in unoccupied rooms due to practical considerations. However, under normal conditions, room occupancy can alter such characteristics through the additional sound absorption present in the room or through the change in signal-to-noise ratio. Based on the acoustic measurements carried out in Al-Madinah Holy Mosque with and without occupancy, and the analysis of such measurements, the existence of acoustical deficiencies has been confirmed.
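One standard objective parameter behind such assessments is the reverberation time T60, commonly estimated from a measured impulse response via Schroeder's backward-integrated energy decay curve. A sketch on a synthetic impulse response (not the mosque data) is:

```python
import numpy as np

rng = np.random.default_rng(3)
fs, rt60_true = 8000, 2.0                        # sample rate (Hz), true decay time (s)
t = np.arange(0, 3.0, 1 / fs)
# Exponentially decaying noise model of an impulse response: -60 dB at t = rt60.
ir = rng.normal(size=t.size) * 10 ** (-3 * t / rt60_true)

edc = np.cumsum(ir[::-1] ** 2)[::-1]             # Schroeder backward integration
edc_db = 10 * np.log10(edc / edc[0])             # energy decay curve in dB

# Fit the -5 dB to -25 dB portion and extrapolate to -60 dB (a T20-style estimate).
mask = (edc_db <= -5) & (edc_db >= -25)
slope, _ = np.polyfit(t[mask], edc_db[mask], 1)  # decay rate in dB per second
rt60_est = -60.0 / slope
```

Occupancy adds absorption, steepening the decay slope and shortening the estimated T60, which is the effect the measurements above quantify.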

Keywords: Al-Madinah Holy Mosque, mosque acoustics, speech intelligibility, worship sound

Procedia PDF Downloads 158
3287 Comparison Analysis of Multi-Channel Echo Cancellation Using Adaptive Filters

Authors: Sahar Mobeen, Anam Rafique, Irum Baig

Abstract:

Multichannel acoustic echo cancellation is a system identification application. In a real-time environment, the signal changes very rapidly, which requires adaptive algorithms such as Least Mean Square (LMS), Leaky Least Mean Square (LLMS), Normalized Least Mean Square (NLMS) and the averaging algorithm (AFA), which offer high convergence rates and stability. LMS and NLMS are widely used adaptive algorithms due to their low computational complexity, while AFA is used for its high convergence rate. This research is based on a comparison of the cancellation of acoustic echo (generated in a room) through the LMS, LLMS, NLMS, AFA and newly proposed average normalized leaky least mean square (ANLLMS) adaptive filters.
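The system-identification core of echo cancellation can be sketched with a single-channel NLMS filter on a synthetic echo path (a simplification of the paper's multichannel setup, and not its ANLLMS variant): the filter learns the room's echo path so the echo estimate can be subtracted from the microphone signal.

```python
import numpy as np

rng = np.random.default_rng(5)
n_samples, taps = 20_000, 16
x = rng.normal(size=n_samples)              # far-end (loudspeaker) signal
true_path = 0.5 * rng.normal(size=taps)     # unknown room echo path (hypothetical)
d = np.convolve(x, true_path)[:n_samples]   # microphone signal (echo only, no noise)

w = np.zeros(taps)                          # adaptive filter weights
mu, eps = 0.5, 1e-8                         # step size and regularizer
err = np.zeros(n_samples)
for i in range(taps, n_samples):
    x_vec = x[i - taps + 1:i + 1][::-1]     # newest sample first
    e = d[i] - w @ x_vec                    # residual echo after cancellation
    w += mu * e * x_vec / (x_vec @ x_vec + eps)   # NLMS weight update
    err[i] = e

early = np.mean(err[taps:1000] ** 2)        # residual power while still adapting
late = np.mean(err[-1000:] ** 2)            # residual power after convergence
```

LMS uses a fixed step instead of the input-power normalization, and the leaky variants shrink `w` slightly each update; swapping the update line is enough to compare the whole family as the paper does.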

Keywords: LMS, LLMS, NLMS, AFA, ANLLMS

Procedia PDF Downloads 543