Search results for: GLCM texture features
3140 Thermal Hysteresis Activity of Ice Binding Proteins during Ice Crystal Growth in Sucrose Solution
Authors: Bercem Kiran-Yildirim, Volker Gaukel
Abstract:
Ice recrystallization (IR), which occurs especially during frozen storage, is an undesired process because of its possible influence on product quality. As a result of recrystallization, the total volume of ice remains constant, but the size, number, and shape of ice crystals change. For instance, as reported in the literature, the size of ice crystals in ice cream increases due to recrystallization, which results in texture deterioration. The inhibition of ice recrystallization is therefore of great importance, not only for the food industry but also for several other areas where sensitive products are stored frozen, such as pharmaceutical products or organs and blood in medicine. Ice-binding proteins (IBPs) have the unique ability to inhibit ice growth and consequently inhibit recrystallization. This effect is based on their ice-binding affinity. In the presence of IBPs in a solution, ice crystal growth is inhibited during temperature decrease until a certain temperature is reached; melting during temperature increase is not influenced. The gap between the melting and freezing points is known as thermal hysteresis (TH). In the literature, TH activity is usually investigated under laboratory conditions in IBP buffer solutions. In product applications (e.g., food), many other solutes are present that may influence the TH activity. In this study, a subset of IBPs, the so-called antifreeze proteins (AFPs), is used to investigate the influence of sucrose solution concentration on TH activity. For the investigation, a polarization microscope (Nikon Eclipse LV100ND) equipped with a digital camera (Nikon DS-Ri1) and a cold stage (Linkam LTS420) was used. As a first step, the equipment was set up and validated for the accuracy of TH measurements against literature data.
Keywords: ice binding proteins, ice crystals, sucrose solution, thermal hysteresis
Procedia PDF Downloads 186

3139 Contribution to the Study of Automatic Epileptiform Pattern Recognition in Long Term EEG Signals
Authors: Christine F. Boos, Fernando M. Azevedo
Abstract:
Electroencephalogram (EEG) is a record of the electrical activity of the brain that has many applications, such as monitoring alertness, coma, and brain death; locating damaged areas of the brain after head injury, stroke, and tumor; monitoring anesthesia depth; researching physiology and sleep disorders; and researching epilepsy and localizing the seizure focus. Epilepsy is a chronic condition, or a group of diseases of high prevalence, still poorly explained by science and whose diagnosis is still predominantly clinical. The EEG recording is considered an important test for epilepsy investigation, and its visual analysis is very often applied for clinical confirmation of an epilepsy diagnosis. Moreover, this EEG analysis can also be used to help define the type of epileptic syndrome, determine the epileptiform zone, assist in the planning of drug treatment, and provide additional information about the feasibility of surgical intervention. In the context of diagnosis confirmation, the analysis is made using long-term EEG recordings at least 24 hours long, acquired with a minimum of 24 electrodes, in which the neurophysiologists perform a thorough visual evaluation of EEG screens in search of specific electrographic patterns called epileptiform discharges. Considering that the EEG screens usually display 10 seconds of the recording, the neurophysiologist has to evaluate 360 screens per hour of EEG, or a minimum of 8,640 screens per long-term EEG recording. Analyzing thousands of EEG screens in search of patterns that have a maximum duration of 200 ms is a very time-consuming, complex, and exhaustive task. Because of this, over the years several studies have proposed automated methodologies that could facilitate the neurophysiologists' task of identifying epileptiform discharges, and a large number of these methodologies used neural networks for the pattern classification.
One of the differences between all of these methodologies is the type of input stimuli presented to the networks, i.e., how the EEG signal is introduced into the network. Five types of input stimuli are commonly found in the literature: the raw EEG signal, morphological descriptors (i.e., parameters related to the signal's morphology), the Fast Fourier Transform (FFT) spectrum, Short-Time Fourier Transform (STFT) spectrograms, and Wavelet Transform features. This study evaluates the application of these five types of input stimuli and compares the classification results of neural networks that were implemented using each of these inputs. The performance using the raw signal varied between 43 and 84% efficiency. The results of the FFT spectrum and STFT spectrograms were quite similar, with average efficiencies of 73 and 77%, respectively. The efficiency of Wavelet Transform features varied between 57 and 81%, while the morphological descriptors presented efficiency values between 62 and 93%. After the simulations, we could observe that the best results were achieved when either morphological descriptors or Wavelet features were used as input stimuli.
Keywords: artificial neural network, electroencephalogram signal, pattern recognition, signal processing
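The first three input representations discussed above (raw signal, morphological descriptors, FFT spectrum) can be sketched for a single epoch. This is a minimal illustration with synthetic data; the sampling rate, epoch length, and choice of descriptors are assumptions for the example, not the authors' settings:

```python
import numpy as np

# Hypothetical 1-second EEG epoch sampled at 256 Hz: a 10 Hz rhythm plus a
# narrow spike-like transient at t = 0.5 s.
fs = 256
t = np.arange(fs) / fs
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.exp(-((t - 0.5) ** 2) / (2 * 0.005 ** 2))

# Input stimulus 1: the raw signal, used as-is.
raw = eeg

# Input stimulus 2: morphological descriptors (illustrative choices).
descriptors = np.array([
    eeg.max() - eeg.min(),          # peak-to-peak amplitude
    np.mean(np.abs(np.diff(eeg))),  # mean absolute slope (sharpness proxy)
    np.std(eeg),                    # dispersion
])

# Input stimulus 3: the FFT magnitude spectrum.
spectrum = np.abs(np.fft.rfft(eeg))

print(descriptors.shape, spectrum.shape)
```

Each representation would then be fed to a separate network, as in the comparison the abstract describes.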
Procedia PDF Downloads 530

3138 Integrated Geotechnical and Geophysical Investigation of a Proposed Construction Site at Mowe, Southwestern Nigeria
Authors: Kayode Festus Oyedele, Sunday Oladele, Adaora Chibundu Nduka
Abstract:
The subsurface of a proposed site for building development in Mowe, Nigeria, was investigated using the Standard Penetration Test (SPT) and Cone Penetrometer Test (CPT), supplemented with Horizontal Electrical Profiling (HEP), with the aim of evaluating the suitability of the strata as foundation materials. Four SPT and CPT soundings were carried out using a 10-tonne hammer. HEP with the Wenner array was performed with inter-electrode spacings of 10 – 60 m along four traverses coincident with each of the SPT and CPT locations. The HEP data were processed using DIPRO software, and textural filtering of the resulting resistivity sections was implemented to enable delineation of hidden layers. Sandy lateritic clay, silty lateritic clay, clay, clayey sand, and sand horizons were delineated. The SPT "N" values defined very soft to soft sandy lateritic clay (<4), stiff silty lateritic clay (7 – 12), very stiff silty clay (12 – 15), clayey sand (15 – 20), and sand (27 – 37). Sandy lateritic clay (5 – 40 kg/cm2) and silty lateritic clay (25 – 65 kg/cm2) were defined from the CPT response. Sandy lateritic clay (220 – 750 Ωm), clay (<50 Ωm), and sand (415 – 5359 Ωm) were delineated from the resistivity sections, with two thin layers of silty lateritic clay and clayey sand defined in the texturally filtered resistivity sections. This study concluded that the presence of incompetent thick clayey materials (18 m) beneath the study area makes it unsuitable for a shallow foundation. A deep foundation involving piling through the clayey layers to the competent sand at 20 m depth was recommended.
Keywords: cone penetrometer, foundation, lithologic texture, resistivity section, standard penetration test
Procedia PDF Downloads 265

3137 Behavioral Response of Dogs to Interior Environment: An Exploratory Study on Design Parameters for Designing Dog Boarding Centers in Indian Context
Authors: M. R. Akshaya, Veena Rao
Abstract:
The pet population in India is increasing phenomenally owing to changes in urban lifestyle, with an increasing number of single professionals, single parents, delayed parenthood, etc. Animal companionship as a means of reducing stress levels and deriving emotional support, and the unconditional love provided by dogs, are a few reasons attributed to increasing pet ownership. The consequence is a boom in pet care products and dog care centers catering to the different requirements of rearing pets. Dog care centers, quite popular in the tier 1 metros of India, cater to the requirements of dog owners by providing space for the dogs in the absence of the owner. However, it is often reported that the absence of the owner leads to destructive and exploratory behavior issues, chief among them anxiety disorders. In this context, it becomes imperative for a designer to design dog boarding centers that help reduce separation anxiety in dogs, keeping in mind the different interior design parameters. An exploratory study with focus group discussions is employed, involving a group of dog owners, behaviorists, proprietors of day care as well as boarding centers, and veterinarians, to understand their perception of the significance of the interior parameters of color, texture, ventilation, aroma therapy, and acoustics as means of reducing stress levels in dogs sent to boarding centers. The data collected are organized as thematic networks, thus enabling the listing of the interior design parameters that need to be considered in designing dog boarding centers.
Keywords: behavioral response, design parameters, dog boarding centers, interior environment
Procedia PDF Downloads 205

3136 Reduced Lung Volume: A Possible Cause of Stuttering
Authors: Shantanu Arya, Sachin Sakhuja, Gunjan Mehta, Sanjay Munjal
Abstract:
Stuttering may be defined as a speech disorder affecting the fluency domain of speech, characterized by covert features like word substitution, omission, and circumlocution, and overt features like prolongation of sounds and syllables, blocks, etc. Many etiologies have been postulated to explain stuttering based on various experiments and research. Moreover, breathlessness has been reported by many individuals with stuttering, for which breathing exercises are generally advised. However, no studies reporting objective evaluation of pulmonary capacity, and further objective assessment of the efficacy of breathing exercises, have been conducted. The Pulmonary Function Test (PFT), which evaluates parameters like Forced Vital Capacity, Peak Expiratory Flow Rate, and Forced Expiratory Flow Rate, can be used to study the pulmonary behavior of individuals with stuttering. The study aimed: a) to identify speech motor and physiologic behaviors associated with stuttering by administering the PFT; b) to recognize possible reasons for an association between speech motor behavior and stuttering severity. In this regard, PFT tests were administered to individuals who reported signs and symptoms of stuttering and showed abnormal scores on the Stuttering Severity Index. Parameters like Forced Vital Capacity, Forced Expiratory Volume, Peak Expiratory Flow Rate (L/min), and Forced Expiratory Flow Rate (L/min) were evaluated and correlated with scores on the Stuttering Severity Index. The results showed a significant decrease in these parameters (lower than normal scores) in individuals with established stuttering. A strong correlation was also found between the degree of stuttering and the degree of decrease in pulmonary volumes. Thus, it is evident that fluent speech requires strong support of lung pressure and the requisite volumes.
Further research demonstrating the efficacy of abdominal breathing exercises in this regard is needed.
Keywords: forced expiratory flow rate, forced expiratory volume, forced vital capacity, peak expiratory flow rate, stuttering
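The correlation step the abstract reports can be sketched as follows. The severity scores and pulmonary values below are invented for illustration only; the study's actual data are not published in the abstract:

```python
import numpy as np

# Hypothetical data: Stuttering Severity Index scores and Forced Vital Capacity
# (% of predicted) for eight participants; illustrative values only.
ssi = np.array([12, 18, 22, 25, 28, 31, 34, 38], dtype=float)
fvc = np.array([95, 90, 86, 82, 78, 74, 69, 63], dtype=float)

# Pearson correlation: a strongly negative r would indicate that higher
# stuttering severity goes with lower pulmonary volume, as the abstract reports.
r = np.corrcoef(ssi, fvc)[0, 1]
print(round(r, 3))
```

The same computation would be repeated for each PFT parameter (FEV, PEFR, FEFR) against the severity scores.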
Procedia PDF Downloads 275

3135 A Comprehensive Analysis of the Rheological Properties of Polymer Hydrogels in Order to Explore Their Potential for Practical Utilization in Industries
Authors: Raana Babadi Fathipour
Abstract:
Hydrogels are three-dimensional structures formed by the interweaving of polymeric materials, possessing the remarkable ability to imbibe copious amounts of water. Numerous methodologies have been devised for examining and understanding the properties of these synthesized gels. Among them, spectroscopic techniques such as ultraviolet/visible (UV/Vis) and Fourier-transform infrared (FTIR) spectroscopy offer a glimpse into molecular and atomic aspects. Additionally, diffraction methods like X-ray diffraction (XRD) enable one to measure crystallinity within the gel's structure, while microscopy tools encompassing scanning electron microscopy (SEM) and transmission electron microscopy (TEM) provide insights into surface texture and morphology. Furthermore, rheology serves as an invaluable tool for unraveling the viscoelastic behavior inherent in hydrogels, a parameter crucial not only to numerous industries, including pharmaceuticals, cosmetics, food processing, agriculture, and water treatment, but also pivotal to related fields of research. Likewise, the ultimate configuration of the product is contingent upon its characterization at a microscopic scale, in order to comprehend the intricacies of the hydrogel network's structure and its interaction dynamics in response to external forces. In the present study, our attention has been devoted to unraveling the rheological tendencies exhibited by materials based on synthetic, natural, and semi-synthetic hydrogels. We also explore their practical utilization in various facets of everyday life from an industrial perspective.
Keywords: rheology, hydrogel characterization, viscoelastic behavior, application
Procedia PDF Downloads 52

3134 An Image Processing Scheme for Skin Fungal Disease Identification
Authors: A. A. M. A. S. S. Perera, L. A. Ranasinghe, T. K. H. Nimeshika, D. M. Dhanushka Dissanayake, Namalie Walgampaya
Abstract:
Nowadays, skin fungal diseases are mostly found in people of tropical countries like Sri Lanka. A skin fungal disease is a particular kind of illness caused by fungus. These diseases have various dangerous effects on the skin and keep spreading over time. It becomes important to identify these diseases at their initial stage to keep them from spreading. This paper presents an automated skin fungal disease identification system implemented to speed up the diagnosis process by identifying skin fungal infections in digital images. An image of the diseased skin lesion is acquired, and a comprehensive computer vision and image processing scheme is used to process the image for disease identification. This includes colour analysis using the RGB and HSV colour models; texture classification using the Grey Level Run Length Matrix, Grey Level Co-Occurrence Matrix, and Local Binary Pattern; object detection; shape identification; and more. This paper presents the approach and its outcome for the identification of four of the most common skin fungal infections, namely Tinea Corporis, Sporotrichosis, Malassezia, and Onychomycosis. The main intention of this research is to provide an automated skin fungal disease identification system that increases diagnostic quality, shortens the time-to-diagnosis, and improves the efficiency of detection and successful treatment of skin fungal diseases.
Keywords: circularity index, Grey Level Run Length Matrix, Grey Level Co-Occurrence Matrix, Local Binary Pattern, object detection, ring detection, shape identification
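The Grey Level Co-Occurrence Matrix (GLCM) texture step mentioned above can be sketched from first principles. This is a minimal illustration on a toy 4x4 patch with 4 grey levels, not the paper's implementation (which would quantise a real lesion image and combine many offsets and features):

```python
import numpy as np

def glcm(image, dx=1, dy=0, levels=4):
    """Grey Level Co-Occurrence Matrix for one pixel offset (dx, dy)."""
    m = np.zeros((levels, levels), dtype=float)
    h, w = image.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[image[y, x], image[y + dy, x + dx]] += 1
    return m / m.sum()  # normalise to joint probabilities

# Toy 4-level "lesion patch"; real systems quantise a grayscale image first.
patch = np.array([[0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [2, 2, 3, 3],
                  [2, 2, 3, 3]])

p = glcm(patch)
i, j = np.indices(p.shape)
contrast = np.sum(p * (i - j) ** 2)              # Haralick contrast
homogeneity = np.sum(p / (1.0 + np.abs(i - j)))  # inverse difference moment
print(round(contrast, 3), round(homogeneity, 3))
```

Scalar features such as contrast and homogeneity, extracted per offset and direction, are what feed the classifier alongside the colour and shape features.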
Procedia PDF Downloads 233

3133 The Relationship between Lithological and Geomechanical Properties of Carbonate Rocks. Case Study: Arab-D Reservoir Outcrop Carbonate, Central Saudi Arabia
Authors: Ammar Juma Abdlmutalib, Osman Abdullatif
Abstract:
The Upper Jurassic Arab-D Reservoir is considered the largest oil reservoir in Saudi Arabia. The equivalent outcrop is exposed near Riyadh. The study investigates the relationships between changes in lithofacies properties and the geomechanical properties of the Arab-D Reservoir at the outcrop scale. The methods used included integrated field observations and laboratory measurements. Schmidt Hammer Rebound Hardness and Point Load Index tests were carried out to estimate the strength of the samples, and an ultrasonic wave velocity test was also applied to measure P-wave and S-wave velocities and the dynamic Poisson's ratio. Thin sections were analyzed and described. The results show that there is a variation in geomechanical properties between the Arab-D member and the Upper Jubaila Formation at the outcrop scale; the change in texture or grain size has little or no effect on these properties. This is because of the clear effect of diagenesis, which changes the strength of the samples. The results also show a negative, or inverse, correlation between porosity and geomechanical properties. As for strength, the dolomitic mudstone and wackestone within the Upper Jubaila Formation have higher Schmidt hammer values, and the wavy-rippled sandy grainstone, which is rich in quartz, has the greater Point Load Index values, while the laminated mudstone and breccia facies have lower strength. This emphasizes the role of mineral content in the geomechanical properties of the Arab-D reservoir lithofacies.
Keywords: geomechanical properties, Arab-D reservoir, lithofacies changes, Poisson's ratio, diagenesis
Procedia PDF Downloads 400

3132 Reimaging Archetype of Mosque: A Case Study on Contemporary Mosque Architecture in Bangladesh
Authors: Sabrina Rahman
Abstract:
The mosque is Islam's most symbolic structure, as well as an expression of collective identity. From the explicit words of the Prophet, 'The earth has been created for me as a masjid and a place of purity, and whatever man from my Ummah finds himself in need of prayer, let him pray' (anywhere), it is obvious that a devout Muslim does not require a defined space or structure for divine worship, since the whole earth is his prayer house. Yet we see that from time immemorial, man throughout the Muslim world has painstakingly erected innumerable mosques. Mosque design spans time, crosses boundaries, and expresses cultures; it is a cultural manifestation as much as one based on a regional building tradition or a certain interpretation of religion. The trend of expressing physical signs of religion is not new, and these physical forms seem to convey symbolic messages. However, in recent times, such forms are noticeably disappearing from mosque architecture projects in Bangladesh. The dome and minaret, the most prominent symbols of the mosque, are being replaced by contextual and contemporary improvisation, rather than following the subcontinental mosque architecture practice of earlier fellows. Thus, the mosque projects of the last 15 years have established a contemporary architectural realm in their design. Contextually, spiritual lighting, serenity of space, tranquility of outdoor spaces, and the texture of materials are widely establishing a new genre of Muslim prayer space. Case-study-based research will lead to specifying the significant factors of this modernism. Based on the findings, the paper presents evidence from recent projects as well as a guideline for the future image of contemporary mosque architecture in Bangladesh.
Keywords: contemporary architecture, modernism, prayer space, symbolism
Procedia PDF Downloads 122

3131 Improve Student Performance Prediction Using Majority Vote Ensemble Model for Higher Education
Authors: Wade Ghribi, Abdelmoty M. Ahmed, Ahmed Said Badawy, Belgacem Bouallegue
Abstract:
In higher education institutions, the most pressing priority is to improve student performance and retention. Large volumes of student data are used in Educational Data Mining techniques to find new hidden information in students' learning behavior, particularly to uncover the early symptoms of at-risk students. On the other hand, data with noise, outliers, and irrelevant information may lead to incorrect conclusions. By identifying features of students' data that have the potential to improve performance prediction results, comparing and identifying the most appropriate ensemble learning technique after preprocessing the data, and optimizing the hyperparameters, this paper aims to develop a reliable student performance prediction model for higher education institutions. Data were gathered from two different systems: a student information system and an e-learning system for undergraduate students in the College of Computer Science of a Saudi Arabian state university. The cases of 4413 students were used in this article. The process includes data collection, data integration, data preprocessing (such as cleaning, normalization, and transformation), feature selection, pattern extraction, and, finally, model optimization and assessment. Random Forest, Bagging, Stacking, Majority Vote, and two types of Boosting techniques, AdaBoost and XGBoost, are the ensemble learning approaches, whereas Decision Tree, Support Vector Machine, and Artificial Neural Network are the supervised learning techniques. The hyperparameters of the ensemble learning systems were fine-tuned to provide enhanced performance and optimal output. The findings imply that combining features of students' behavior from the e-learning and student information systems using Majority Vote produced better outcomes than the other ensemble techniques.
Keywords: educational data mining, student performance prediction, e-learning, classification, ensemble learning, higher education
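The winning Majority Vote scheme is the simplest of the ensembles listed above: each base learner casts one vote per student and the most common class wins. A minimal hard-voting sketch (the base-learner predictions below are invented for illustration, not the paper's models):

```python
import numpy as np
from collections import Counter

def majority_vote(predictions):
    """Hard majority vote over per-model predictions (rows: models, cols: students)."""
    predictions = np.asarray(predictions)
    return np.array([Counter(col).most_common(1)[0][0] for col in predictions.T])

# Hypothetical predicted classes (0 = at risk, 1 = pass) for six students
# from three base learners (e.g., Decision Tree, SVM, ANN).
preds = [
    [1, 0, 1, 1, 0, 0],  # model A
    [1, 0, 0, 1, 0, 1],  # model B
    [1, 1, 1, 1, 0, 0],  # model C
]
print(majority_vote(preds).tolist())
```

With an odd number of voters there are no ties; libraries such as scikit-learn expose the same idea as a voting classifier with hard voting.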
Procedia PDF Downloads 109

3130 Improving Security by Using Secure Servers Communicating via Internet with Standalone Secure Software
Authors: Carlos Gonzalez
Abstract:
This paper describes the use of the Internet as a feature to enhance the security of software that is going to be distributed/sold to users potentially all over the world. By placing some of the features of the secure software in a secure server, we increase the security of that software. The communication between the protected software and the secure server is done by a double-lock algorithm. This paper also includes an analysis of intruders and describes possible responses to detect threats.
Keywords: internet, secure software, threats, cryptography process
Procedia PDF Downloads 334

3129 Modeling Visual Memorability Assessment with Autoencoders Reveals Characteristics of Memorable Images
Authors: Elham Bagheri, Yalda Mohsenzadeh
Abstract:
Image memorability refers to the phenomenon whereby certain images are more likely to be remembered by humans than others; it is a quantifiable and intrinsic attribute of an image. Understanding how visual perception and memory interact is important in both cognitive science and artificial intelligence: it reveals the complex processes that support human cognition and helps to improve machine learning algorithms by mimicking the brain's efficient data processing and storage mechanisms. To explore the computational underpinnings of image memorability, this study examines the relationship between an image's reconstruction error, its distinctiveness in latent space, and its memorability score. A trained autoencoder is used to replicate human-like memorability assessment, inspired by the visual memory game employed in memorability estimation. This study leverages a VGG-based autoencoder that is pre-trained on the vast ImageNet dataset, enabling it to recognize patterns and features common to a wide and diverse range of images. An empirical analysis is conducted using the MemCat dataset, which includes 10,000 images from five broad categories: animals, sports, food, landscapes, and vehicles, along with their corresponding memorability scores. The memorability score assigned to each image represents the probability of that image being remembered by participants after a single exposure. The autoencoder is fine-tuned for one epoch with a batch size of one, creating a scenario similar to human memorability experiments, where memorability is quantified by the likelihood of an image being remembered after being seen only once. The reconstruction error, quantified as the difference between the original and reconstructed images, serves as a measure of how well the autoencoder has learned to represent the data.
The reconstruction error of each image, the reduction in that error during fine-tuning, and the image's distinctiveness in latent space are calculated and correlated with the memorability score. Distinctiveness is measured as the Euclidean distance between each image's latent representation and its nearest neighbor within the autoencoder's latent space. Different structural and perceptual loss functions are considered to quantify the reconstruction error. The results indicate a strong correlation between the reconstruction error and distinctiveness of images and their memorability scores. This suggests that images with more unique, distinct features that challenge the autoencoder's compressive capacities are inherently more memorable. There is also a negative correlation between memorability and the reduction in reconstruction error relative to the autoencoder pre-trained on ImageNet, which suggests that highly memorable images are harder to reconstruct, probably because they have features that are more difficult for the autoencoder to learn. These insights suggest a new pathway for evaluating image memorability, which could potentially impact industries reliant on visual content and mark a step forward in merging the fields of artificial intelligence and cognitive science. The current research opens avenues for utilizing neural representations as instruments for understanding and predicting visual memory.
Keywords: autoencoder, computational vision, image memorability, image reconstruction, memory retention, reconstruction error, visual perception
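The two quantities at the heart of this analysis, per-image reconstruction error and nearest-neighbour distinctiveness in latent space, can be sketched with a linear autoencoder stand-in. The truncated-SVD encoder below is only a proxy for the paper's VGG-based autoencoder, and the random feature matrix stands in for image features; everything here is an assumption for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for image features: 200 "images" x 64 dimensions.
X = rng.normal(size=(200, 64))

# Linear autoencoder via truncated SVD: encode to k latent dimensions,
# decode back, and measure the per-image reconstruction error.
k = 16
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
latent = Xc @ Vt[:k].T                           # latent codes
recon = latent @ Vt[:k] + X.mean(axis=0)         # reconstructions
recon_error = np.linalg.norm(X - recon, axis=1)  # per-image reconstruction error

# Distinctiveness: Euclidean distance to each image's nearest neighbour
# in the latent space (diagonal masked out so an image is not its own neighbour).
d = np.linalg.norm(latent[:, None, :] - latent[None, :, :], axis=-1)
np.fill_diagonal(d, np.inf)
distinctiveness = d.min(axis=1)

print(recon_error.shape, distinctiveness.shape)
```

In the study's framing, both vectors would then be correlated against the per-image memorability scores.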
Procedia PDF Downloads 92

3128 Classification of ECG Signal Based on Mixture of Linear and Non-Linear Features
Authors: Mohammad Karimi Moridani, Mohammad Abdi Zadeh, Zahra Shahiazar Mazraeh
Abstract:
In recent years, the use of intelligent systems in biomedical engineering has increased dramatically, especially in the diagnosis of various diseases. Moreover, owing to the relatively simple recording of the electrocardiogram (ECG) signal, this signal is a good tool to show the function of the heart and the diseases associated with it. The aim of this paper is to design an intelligent system for automatically distinguishing a normal electrocardiogram signal from an abnormal one. Using this diagnostic system, it is possible to identify a person's heart condition in a very short time and with high accuracy. The data used in this article are from the PhysioNet database, made available in 2016 for researchers to develop the best methods for distinguishing normal signals from abnormal ones. The data come from both genders, and the recording time varies between several seconds and several minutes. All data are also labeled normal or abnormal. Because of the low positional accuracy and limited time span of the ECG signal, and the similarity of the signal in some diseases to the normal signal, the heart rate variability (HRV) signal was used. Measuring and analyzing heart rate variability over time to evaluate the activity of the heart, and differentiating different types of heart failure from one another, is of interest to experts. In the preprocessing stage, after noise cancellation by an adaptive Kalman filter and extraction of the R wave by the Pan-Tompkins algorithm, R-R intervals were extracted and the HRV signal was generated. In the processing stage of this paper, a new idea was presented: in addition to using the statistical characteristics of the signal, a return map was created and nonlinear characteristics of the HRV signal were extracted, owing to the nonlinear nature of the signal. Finally, artificial neural networks, widely used in the field of ECG signal processing, together with the distinctive features were used to classify the normal signals from the abnormal ones.
To evaluate the efficiency of the classifiers proposed in this paper, the area under the ROC curve (AUC) was used. The results of the simulation in the MATLAB environment showed that the AUCs of the MLP neural network and the SVM were 0.893 and 0.947, respectively. The results of the proposed algorithm also indicated that greater use of nonlinear characteristics in classifying normal and patient signals yielded better performance. Today, research is aimed at quantitatively analyzing the linear and non-linear, or deterministic and random, nature of the heart rate variability signal, because it has been shown that the amount of these properties can be used to indicate the health status of an individual's heart. The study of the nonlinear behavior and dynamics of the heart's neural control system in the short and long term provides new information on how the cardiovascular system functions and has driven the development of research in this field. Given that the ECG signal contains important information and is one of the common tools used by physicians to diagnose heart disease, but that some of the information in this signal is hidden from the viewpoint of physicians, the intelligent system proposed in this paper can help physicians diagnose normal and patient individuals with greater speed and accuracy, and can be used as a complementary system in treatment centers.
Keywords: heart rate variability, signal processing, linear and non-linear features, classification methods, ROC curve
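Two of the building blocks described above, time-domain HRV features from R-R intervals and the AUC evaluation, can be sketched as follows. The R-peak times and classifier scores are invented for illustration; the feature choices (SDNN, RMSSD) are common HRV statistics, not necessarily the exact ones used in the paper:

```python
import numpy as np

def hrv_features(r_peak_times):
    """Time-domain HRV features from R-peak times (in seconds)."""
    rr = np.diff(r_peak_times)                  # R-R intervals
    sdnn = rr.std()                             # overall variability
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))  # beat-to-beat variability
    return sdnn, rmssd

def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) formula."""
    scores, labels = np.asarray(scores, float), np.asarray(labels)
    pos, neg = scores[labels == 1], scores[labels == 0]
    wins = ((pos[:, None] > neg[None, :]).sum()
            + 0.5 * (pos[:, None] == neg[None, :]).sum())
    return wins / (len(pos) * len(neg))

# Hypothetical R-peak times from a Pan-Tompkins-style detector.
sdnn, rmssd = hrv_features(np.array([0.0, 0.8, 1.7, 2.5, 3.4]))

# Hypothetical classifier scores with normal(1)/abnormal(0) labels.
print(sdnn, rmssd, auc([0.9, 0.8, 0.7, 0.4, 0.3, 0.2], [1, 1, 1, 0, 0, 1]))
```

The feature vector (here just two values) would be extended with the return-map and other nonlinear measures before being fed to the MLP or SVM.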
Procedia PDF Downloads 264

3127 Lithuanian Sign Language Literature: Metaphors at the Phonological Level
Authors: Anželika Teresė
Abstract:
In order to address issues in sign language linguistics, help maintain the high quality of sign language (SL) translation, contribute to dispelling misconceptions about SL and deaf people, and raise awareness and understanding of the deaf community's heritage, this presentation discusses literature in Lithuanian Sign Language (LSL) and the metaphors in it that are created using the phonological parameters: handshape, location, movement, palm orientation, and nonmanual features. The study covered in this presentation is twofold, involving both micro-level analysis of metaphors in terms of phonological parameters as sub-lexical features and macro-level analysis of the poetic context. Cognitive theories underlie research on metaphors in sign language literature across a range of SLs, and this study follows that practice. The presentation covers a qualitative analysis of 34 pieces of LSL literature. The analysis employs the ELAN software widely used in SL research. The target is to examine how specific types of each phonological parameter are used to create metaphors in LSL literature and what metaphors are created. The results of the study show that LSL literature employs a range of metaphors created by using classifier signs and by modifying established signs. The study also reveals that LSL literature tends to create reference metaphors indicating status and power. As the study shows, LSL poets metaphorically encode status by encoding another meaning in the same sign, which results in double metaphors. A metaphor of identity has also been determined; notably, the poetic context has revealed that this metaphor can also be identified as a metaphor for life. The study goes on to note that deaf poets create metaphors related to the importance of various phenomena significant to the lyrical subject.
Notably, the study has detected locations, nonmanual features, and other parameter types never mentioned in previous SL research as being used for the creation of metaphors.
Keywords: Lithuanian sign language, sign language literature, sign language metaphor, metaphor at the phonological level, cognitive linguistics
Procedia PDF Downloads 137

3126 Dido: An Automatic Code Generation and Optimization Framework for Stencil Computations on Distributed Memory Architectures
Authors: Mariem Saied, Jens Gustedt, Gilles Muller
Abstract:
We present Dido, a source-to-source auto-generation and optimization framework for multi-dimensional stencil computations. It enables a large programmer community to easily and safely implement stencil codes on distributed-memory parallel architectures with Ordered Read-Write Locks (ORWL) as an execution and communication back-end. ORWL provides inter-task synchronization for data-oriented parallel and distributed computations. It has been proven to guarantee equity, liveness, and efficiency for a wide range of applications, particularly for iterative computations. Dido consists mainly of an implicitly parallel domain-specific language (DSL) implemented as a source-level transformer. It captures domain semantics at a high level of abstraction and generates parallel stencil code that leverages all ORWL features. The generated code is well-structured and lends itself to different possible optimizations. In this paper, we enhance Dido to handle both Jacobi and Gauss-Seidel grid traversals. We integrate temporal blocking into the Dido code generator in order to reduce the communication overhead and minimize data transfers. To increase data locality and improve intra-node data reuse, we couple the code generation technique with the polyhedral parallelizer Pluto. The accuracy and portability of the generated code are guaranteed thanks to a parametrized solution. The combination of ORWL features, the code generation pattern, and the suggested optimizations makes Dido a powerful code generation framework for stencil computations in general, and for distributed-memory architectures in particular. We present a wide range of experiments over a number of stencil benchmarks.
Keywords: stencil computations, ordered read-write locks, domain-specific language, polyhedral model, experiments
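The Jacobi vs. Gauss-Seidel distinction the abstract highlights can be shown on a serial 2D 5-point stencil. This is only a sequential sketch of the two traversal semantics, not Dido's generated ORWL code, and the grid and boundary values are invented for the example:

```python
import numpy as np

def jacobi_step(grid):
    """One Jacobi sweep of a 2D 5-point stencil: every update reads the old grid."""
    new = grid.copy()
    new[1:-1, 1:-1] = 0.25 * (grid[:-2, 1:-1] + grid[2:, 1:-1] +
                              grid[1:-1, :-2] + grid[1:-1, 2:])
    return new

def gauss_seidel_step(grid):
    """One Gauss-Seidel sweep: updates are applied in place, so later points
    in the traversal already see this sweep's new values."""
    g = grid.copy()
    for i in range(1, g.shape[0] - 1):
        for j in range(1, g.shape[1] - 1):
            g[i, j] = 0.25 * (g[i - 1, j] + g[i + 1, j] +
                              g[i, j - 1] + g[i, j + 1])
    return g

grid = np.zeros((8, 8))
grid[0, :] = 1.0  # fixed boundary condition on one edge
print(jacobi_step(grid)[1, 3], gauss_seidel_step(grid)[1, 3])
```

The in-place dependence of Gauss-Seidel is what makes its parallel traversal order (and hence code generation) harder than Jacobi's, which is the case a generator like Dido must handle separately.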
Procedia PDF Downloads 129
3125 The Impact of Enzymatic Treatments on the Pasting Behavior and Its Reflection on Staling and Quality of Bread
Authors: Sayed Mostafa, Mohamed Shebl
Abstract:
Bread staling remains one of the most troubling problems for those involved in manufacturing bakery products, as extending the freshness period of bread is one of the most important factors encouraging this industry, given its role in reducing expected losses. This study therefore aims to improve the quality of pan bread and extend its freshness period through enzymatic treatments, including maltogenic α-amylase (MAA), amyloglucosidase (AGS), glucose oxidase (GOX), and phospholipase (PhL). The rheological and pasting behavior of wheat flour was estimated, in addition to the physical, textural, and sensory parameters of the final product. The addition of MAA resulted in a decrease in peak viscosity, breakdown, setback, and pasting temperature, and also led to a reduction in falling number values. The enzymatic treatments (MAA and PhL) produced a higher alkaline water retention capacity of pan bread compared to untreated pan bread (control) throughout the different storage periods. Furthermore, the other enzymes displayed varying effects on bread quality; for instance, AGS enhanced the crust color, while a high concentration of GOX improved the specific volume of the bread. In conclusion, the research findings demonstrate that the enzymatic treatments can significantly improve quality attributes such as specific volume, increase the alkaline water retention capacity with a lower hardness value, which reflects bread freshness during storage, and improve sensory characteristics. Keywords: anti-staling agents, enzymatic treatments, maltogenic α-amylase, amyloglucosidase, glucose oxidase, phospholipase, pasting behavior, wheat flour
Procedia PDF Downloads 13
3124 A Theoretical Study on Pain Assessment through Human Facial Expression
Authors: Mrinal Kanti Bhowmik, Debanjana Debnath Jr., Debotosh Bhattacharjee
Abstract:
Facial expression is an undeniable part of human behavior. It is a significant channel for human communication and can be used to extract emotional features accurately. People in pain often show variations in facial expression that are readily observable to others, and a core set of facial actions is likely to occur, or to increase in intensity, when people are in pain. To describe these changes in facial appearance, the Facial Action Coding System (FACS) was pioneered by Ekman and Friesen for human observers. According to Prkachin and Solomon, a subset of such actions carries the bulk of the information about pain; on this basis, the Prkachin and Solomon pain intensity (PSPI) metric is defined. It is therefore important to note that facial expressions, as a behavioral source in communication media, provide an important opening into the issues of non-verbal communication of pain. People express their pain in many ways, and this pain behavior is the basis on which most inferences about pain are drawn in clinical and research settings. Hence, to understand the roles of different pain behaviors, it is essential to study their properties. For the past several years, studies have concentrated on the properties of one specific form of pain behavior, i.e., facial expression. This paper presents a comprehensive study of pain assessment approaches that can model and estimate the intensity of pain a patient is suffering. It also reviews the historical background of different pain assessment techniques in the context of painful expressions. Different approaches incorporate FACS from a psychological viewpoint and a pain intensity score using the PSPI metric in pain estimation. The paper provides an in-depth analysis of the different approaches used in pain estimation and presents the observations found with each technique. It also offers a brief study of the features that distinguish real from fake pain.
Therefore, the necessity of this study lies in the emerging field of painful face assessment in clinical settings. Keywords: facial action coding system (FACS), pain, pain behavior, Prkachin and Solomon pain intensity (PSPI)
Procedia PDF Downloads 348
3123 Sociolinguistic Aspects and Language Contact, Lexical Consequences in Francoprovençal Settings
Authors: Carmela Perta
Abstract:
In Italy, the coexistence of the standard language, its varieties, and different minority languages - historical and migration languages - has allowed language contact to be studied in different directions; most studies focus either on the relations among the languages of the social repertoire, or on contact phenomena occurring at a particular structural level. However, studies of contact facts in relation to the sociolinguistic situation of the speech community are still absent from the literature. As regards the language level to investigate from the perspective of contact, it is commonly claimed that the lexicon is the most volatile part of language and the most likely to undergo change due to superstrate influence: lexical features are borrowed first, and then, under long-term cultural pressure, structural features may also be borrowed. The aim of this paper is to analyse language contact in two historical minority communities where Francoprovençal is spoken, in relation to their sociolinguistic situation. In this perspective, the lexical borrowings present in speakers' speech production are first examined, looking for a possible correlation between this part of the lexicon and the informants' sociolinguistic variables; secondly, a possible correlation between each community's particular sociolinguistic situation and lexical borrowing is sought. Data were collected from 24 speakers in each of the two villages; the speaker group in each community consisted of 3 males and 3 females in each of four age groups, ranging in age from 9 to 85, further divided into five groups according to occupation. Speakers were asked to describe a sequence of pictures naming common objects and then to describe scenes in which they used these objects: these are common, frequently named objects belonging to semantic areas that are usually resistant and thought to survive.
A subset of this task, involving 19 items with an Italian source, is examined here: in order to determine the significance of the independent variables (social factors) on the dependent variable (lexical variation), the statistical package SPSS, particularly its linear regression, was used. Keywords: borrowing, Francoprovençal, language change, lexicon
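For readers unfamiliar with the SPSS procedure mentioned above, the underlying fit is ordinary least squares. The minimal sketch below shows the simple one-predictor case (one social factor against a lexical-variation score); the variable names are illustrative and do not reflect the study's actual coding of its factors.

```python
# Simple linear regression y = a + b*x by ordinary least squares.
def ols_fit(x, y):
    """Return (intercept, slope) minimizing the sum of squared residuals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n          # sample means
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return my - slope * mx, slope
```

SPSS's multiple linear regression generalizes this to several predictors at once, with significance tests on each coefficient.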
Procedia PDF Downloads 373
3122 Role of Symbolism in the Journey towards Spirituality: A Case Study of Mosque Architecture in Bahrain
Authors: Ayesha Agha Shah
Abstract:
The purpose of a mosque, or any place of worship, is to build a spiritual relation with God. If a sense of spirituality is not achieved, sacred architecture appears to lack depth. Form and space play a significant role in enhancing the architectural quality that imparts a divine feel to a place. To achieve this divine feeling, form and space, and the unity of opposites, whether abstract or symbolic, can be employed, though it is challenging to imbue the emptiness of a space with qualitative experience. Mosque architecture mostly entails traditional forms and design typologies. This approach to Muslim worship produces distinct landmarks in the urban neighborhoods of Muslim societies while creating a great sense of spirituality. The universal symbolic characters of mosque architecture have long followed prototypical geometrical forms. Modern mosques, however, have deviated from this approach to employ different built elements and symbolism, which are often hard to identify as related to mosques, or even as Islamic. This research aims to explore the sense of spirituality in modern mosques and questions whether the modification of geometrical features produces spirituality in the same manner. The research also investigates the role of geometry in modern mosque architecture. It employs an analytical study of some modern mosque examples in the Kingdom of Bahrain, reflecting on the geometry and symbolism adopted in new mosque architectural design, and supports the analysis with people's perceptions gathered through an opinion survey. The research expects to confirm the significance of geometrical architectural elements in mosque design. It seeks answers to questions such as: what is the role of the form of the mosque and its interior spaces, and what is the effect of modified symbolic features in modern mosque design?
How can the symbolic geometry, forms, and spaces of a mosque invite a believer to leave the worldly environment behind and move towards spirituality? Keywords: geometry, mosque architecture, spirituality, symbolism
Procedia PDF Downloads 115
3121 Sociolinguistic and Classroom Functions of Using Code-Switching in CLIL Context
Authors: Khatuna Buskivadze
Abstract:
The aim of the present study is to investigate the sociolinguistic and classroom functions, and the frequency, of teacher code-switching (CS) in Content and Language Integrated Learning (CLIL) lessons. Nowadays, as Georgian society strives to become part of the European world, the English language itself plays a role in forming new generations with European values. Based on our research conducted in 2019, out of all 114 private schools in Tbilisi, full CLIL programs are taught in 7 schools, while only some subjects using CLIL are offered in 3 schools. The goal of that earlier research was to define the features of CLIL methodology in the teaching of English, using the example of Georgian private high schools. Taking Georgian realities and cultural features into account, a modified version of the questionnaire based on the classification of CS use in the ESL classroom proposed by Ferguson (2009) was used. The qualitative research revealed the students' and teacher's attitudes towards teacher code-switching in the CLIL lesson. Both qualitative and quantitative methods were employed: observation of the teacher's lessons (recordings of the teacher's online lessons), an interview, and a questionnaire administered to 20 of the Maths teacher's high school students. We came to several conclusions, some of which are given here. The Maths teacher's CS behavior mostly serves (1) the conversational function of interjection and (2) the classroom functions of introducing unfamiliar materials and topics, explaining difficult concepts, and maintaining classroom discipline and the structure of the lesson. The teacher and 13 students have negative attitudes towards using only Georgian in teaching Maths. The higher the students' level of English, the more negative their attitude towards using Georgian in the classroom. Although all the students are Georgian, their competence in English is higher than in Georgian; therefore, they consider English an inseparable part of their identities.
The overall results of this case study of teaching Maths (educational discourse) in one of the private schools in Tbilisi will be presented at the conference. Keywords: attitudes, bilingualism, code-switching, CLIL, conversation analysis, interactional sociolinguistics
Procedia PDF Downloads 162
3120 Real-Time Generative Architecture for Mesh and Texture
Abstract:
In the evolving landscape of physics-based machine learning (PBML), particularly within fluid dynamics and its applications in electromechanical engineering, robot vision, and robot learning, achieving precision and alignment with researchers' specific needs presents a formidable challenge. In response, this work proposes a methodology that integrates neural transformation with a modified smoothed particle hydrodynamics model for generating transformed 3D fluid simulations. This approach is useful for nanoscale science, where the unique and complex behaviors of viscoelastic media demand accurate neurally-transformed simulations for materials understanding and manipulation. In electromechanical engineering, the method enhances the design and functionality of fluid-operated systems, particularly microfluidic devices, contributing to advancements in nanomaterial design, drug delivery systems, and more. The proposed approach also aligns with the principles of PBML, offering advantages such as multi-fluid stylization and consistent particle attribute transfer. This capability is valuable in various fields where the interaction of multiple fluid components is significant. Moreover, the application of neurally-transformed hydrodynamical models extends to manufacturing processes, such as the production of microelectromechanical systems, enhancing efficiency and cost-effectiveness. The system's ability to perform neural transfer on 3D fluid scenes using a deep learning algorithm alongside physical models further adds a layer of flexibility, allowing researchers to tailor simulations to specific needs across scientific and engineering disciplines. Keywords: physics-based machine learning, robot vision, robot learning, hydrodynamics
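At the core of any smoothed particle hydrodynamics model of the kind described above is a kernel-weighted density summation over neighboring particles. The sketch below uses a standard poly6-style kernel as an assumption for illustration; it is not the paper's modified SPH model, and the function names are ours.

```python
# SPH density estimation: rho_i = sum_j m_j * W(|x_i - x_j|, h).
import math

def poly6(r, h):
    """Poly6 smoothing kernel in 3D; zero outside the support radius h."""
    if r >= h:
        return 0.0
    return 315.0 / (64.0 * math.pi * h ** 9) * (h * h - r * r) ** 3

def density(positions, masses, h):
    """Kernel-weighted density estimate at each particle position."""
    out = []
    for xi in positions:
        rho = 0.0
        for xj, m in zip(positions, masses):
            rho += m * poly6(math.dist(xi, xj), h)  # includes self-contribution
        out.append(rho)
    return out
```

A neural transformation stage, as proposed in the abstract, would then operate on per-particle attributes (density, velocity, color) produced by summations like this one.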
Procedia PDF Downloads 66
3119 Predicting Open Chromatin Regions in Cell-Free DNA Whole Genome Sequencing Data by Correlation Clustering
Authors: Fahimeh Palizban, Farshad Noravesh, Amir Hossein Saeidian, Mahya Mehrmohamadi
Abstract:
In the past decade, the emergence of liquid biopsy has significantly improved cancer monitoring and detection. Dying cells, including those originating from tumors, shed their DNA into the blood and contribute to a pool of circulating fragments called cell-free DNA (cfDNA). Accordingly, identifying the tissue origin of these DNA fragments from plasma can enable more accurate and faster disease diagnosis and more precise treatment protocols. Open chromatin regions (OCRs) are important epigenetic features of DNA that reflect the cell types of origin. Profiling these features by DNase-seq, ATAC-seq, and histone ChIP-seq provides insights into tissue-specific and disease-specific regulatory mechanisms. Several studies in the area of cancer liquid biopsy integrate distinct genomic and epigenomic features for early cancer detection along with tissue-of-origin detection. However, multimodal analysis requires several types of experiments to cover the genomic and epigenomic aspects of a single sample, which leads to enormous cost and time. To overcome these limitations, the idea of predicting OCRs from whole genome sequencing (WGS) data is of particular importance. In this regard, we propose a computational approach to predict open chromatin regions, an important epigenetic feature, from cell-free DNA whole genome sequencing data. To fulfill this objective, local sequencing depth is fed to the proposed algorithm, which predicts the most probable open chromatin regions from the whole genome sequencing data. Our method integrates a signal processing approach with sequencing depth data and includes count normalization, Discrete Fourier Transform conversion, graph construction, graph-cut optimization by linear programming, and clustering.
To validate the proposed method, we compared the clustering output (open chromatin region+, open chromatin region-) with previously validated open chromatin regions from human blood samples in the ATAC-DB database. The overlap between the predicted open chromatin regions and the experimentally validated regions obtained by ATAC-seq in ATAC-DB is greater than 67%, which indicates meaningful prediction. As expected, OCRs are mostly located at the transcription start sites (TSS) of genes. In this regard, we compared the concordance between the predicted OCRs and the human gene TSS regions obtained from refTSS, finding agreement of around 52.04% with all genes and ~78% with the housekeeping genes. Accurately detecting open chromatin regions from plasma cell-free DNA-seq data is a very challenging computational problem due to several confounding factors, such as technical and biological variations. Although this approach is in its infancy, there has already been an attempt to apply it, leading to a tool named OCRDetector, which has some restrictions, such as the need for high-depth cfDNA WGS data, prior information about the OCR distribution, and reliance on multiple features. In contrast, we implemented graph signal clustering based on a single depth feature in an unsupervised manner, which results in faster performance and decent accuracy. Overall, we investigated the epigenomic pattern of a cell-free DNA sample from a new computational perspective that can be used along with other tools to investigate the genetic and epigenetic aspects of a single whole genome sequencing dataset for efficient liquid biopsy-related analysis. Keywords: open chromatin regions, cancer, cell-free DNA, epigenomics, graph signal processing, correlation clustering
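The normalization-then-Fourier stages of the pipeline described above can be sketched in a few dependency-free lines. This is a hedged illustration only: it uses a naive O(n²) DFT, replaces the paper's graph construction and linear-programming graph cut with simple mean-thresholding, and the sign convention linking smoothed depth to the OCR+/OCR- labels is our assumption, not the paper's.

```python
# Depth-signal sketch: normalize counts, low-pass via DFT, label two clusters.
import cmath

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(spec):
    n = len(spec)
    return [sum(spec[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]

def smooth_depth(depth, keep=2):
    """Count-normalize the depth track, then zero high DFT frequencies."""
    mean = sum(depth) / len(depth)
    spec = dft([d / mean for d in depth])
    n = len(spec)
    for k in range(n):
        if keep <= k <= n - keep:      # retain lowest frequencies + mirror
            spec[k] = 0
    return idft(spec)

def label_regions(depth, keep=2):
    """Binary labeling of bins by deviation of the smoothed, normalized depth
    (a stand-in for the paper's graph-cut clustering)."""
    smooth = smooth_depth(depth, keep)
    avg = sum(smooth) / len(smooth)
    return ['OCR+' if s >= avg else 'OCR-' for s in smooth]
```

In the actual method, the smoothed signal instead feeds a graph whose edge weights encode depth correlation, and the two clusters fall out of the optimized cut.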
Procedia PDF Downloads 152
3118 Development of Electrospun Porous Carbon Fibers from Cellulose/Polyacrylonitrile Blend
Authors: Zubair Khaliq, M. Bilal Qadir, Amir Shahzad, Zulfiqar Ali, Ahsan Nazir, Ali Afzal, Abdul Jabbar
Abstract:
Carbon fibers are among the most sought-after materials on earth due to their potential applications in energy, high-strength materials, and conductive materials. Nanostructured carbon fibers offer enhanced conductivity due to their larger surface area, and next-generation carbon nanofibers demand a porous structure, since porosity offers still more surface area. Multiple techniques are used to produce carbon fibers; however, electrospinning of polymeric materials followed by carbonization is an easy process to carry out on a laboratory scale, and it offers wide scope for varying parameters to achieve the desired carbon fiber properties. Polyacrylonitrile (PAN) is the most widely used precursor for carbon fibers due to its favorable processing parameters, while cellulose is one of the highest-yield precursors of carbon fibers. However, the electrospinning of cellulosic materials is difficult due to cellulose's rigid chain structure. A combination of PAN and cellulose can offer a suitable route to carbon fibers, as both materials are miscible in a mixed solvent of N,N-dimethylacetamide and lithium chloride. This study focuses on the production of porous carbon fibers as a function of the PAN/cellulose blend ratio, solution properties, and electrospinning parameters. The single polymers, and blends in different ratios, were electrospun to give fine fibers; higher cellulose contents made electrospinning more difficult. After carbonization, the carbon fibers were studied in terms of blend ratio, surface area, and texture. The cellulose content contributed a porous structure to the carbon fibers, as did the presence of LiCl. Keywords: cellulose, polyacrylonitrile, carbon nanofibers, electrospinning, blend
Procedia PDF Downloads 204
3117 Failure of Agriculture Soil following the Passage of Tractors
Authors: Anis Eloud, Sayed Chehaibi
Abstract:
Compaction of agricultural soils as a result of the passage of heavy machinery over fields is a problem that concerns many agronomists and farmers, since it results in a loss of yield for most crops. To remedy this, and to meet the wider challenge of food security, the process of soil degradation must be studied and understood. The present study is devoted to understanding the effect of repeated passes over agricultural land. The experiments were performed on a plot in the area of the ESIER, characterized by a clay texture, in order to quantify the soil compaction caused by tractor wheels during repeated passes over agricultural land. The test tractor was a CASE with a power of 110 hp and a total mass of 5470 kg, of which 3500 kg rested on the two rear axles and 1970 kg on the front axle. The state of soil compaction was characterized by measuring the soil's resistance to penetration by means of a penetrometer with direct manual reading, together with the density and permeability of the soil. Soil moisture was measured at the same time. The measurements were made in the initial state before the tractor passed and after each pass, varying from 1 to 7, on the wheel track, with the rear wheels inflated to 1.5 bar and water-ballasted to valve level and the front wheels inflated to 4 bar. The passes were spaced, on average, one week apart. The results show that the passage of wheels over tilled farm soil leads to compaction, and that compaction increases with the number of passes, especially for horizons above 15 cm depth. The first pass has the greatest effect; however, the effects of subsequent passes do not follow a definite law, owing to the complex behavior of granular media and the history of tillage and loading the soil has undergone since its formation. Keywords: wheel traffic, tractor, soil compaction, wheel
Procedia PDF Downloads 484
3116 Enhancing Health Information Management with Smart Rings
Authors: Bhavishya Ramchandani
Abstract:
A smart ring is a small electronic device worn on the finger. It incorporates mobile technology and features that make the device simple to use. These gadgets, which resemble conventional rings and are usually made to fit on the finger, are equipped with features including access management, gesture control, mobile payment processing, and activity tracking. Poor sleep patterns, irregular schedules, and bad eating habits are among the health problems many people face today. Diets lacking fruits, vegetables, legumes, nuts, and whole grains are common, and many individuals in India also experience metabolic issues. In the medical field, smart rings can help patients with problems relating to stomach illnesses and the inability to consume meals tailored to their bodies' needs. The smart ring tracks bodily indicators, including blood glucose levels, and presents the information instantly; based on these data, the ring generates insights and a workable plan suited to the body. In addition, we conducted focus groups and individual interviews as part of our core approach, discussing the difficulties participants have maintaining a proper diet and whether a smart ring would be beneficial to them. Everyone was enthusiastic about and supportive of the concept of using smart rings in healthcare, and participants believed such rings could help them maintain their health and follow a well-balanced diet plan. This response came from the primary data, and working on the Emerging Technology Canvas Analysis of smart rings in healthcare significantly improved our understanding of the technology's application in the medical field. It is believed that demand for smart health care will grow as people become more conscious of their health.
Most individuals are expected to adopt this ring within three to four years, by which time demand for it will have increased, and it will significantly affect their daily lives. Keywords: smart ring, healthcare, electronic wearable, emerging technology
Procedia PDF Downloads 64
3115 Theoretical Analysis of the Existing Sheet Thickness in the Calendering of Pseudoplastic Material
Authors: Muhammad Zahid
Abstract:
The mechanical process of smoothing and compressing a molten material by passing it through a number of pairs of heated rolls in order to produce a sheet of desired thickness is called calendering. The rolls used in combination are called calenders, a term derived from kylindros, the Greek word for cylinder. Calendering is the finishing process used on cloth, paper, textiles, leather cloth, plastic film, and so on. It is a mechanism used to strengthen surface properties, minimize sheet thickness, and produce special effects such as a glaze or polish. It has a wide variety of industrial applications in the manufacture of textile fabrics, coated fabrics, and plastic sheeting, providing the desired surface finish and texture. An analysis is presented for the calendering of pseudoplastic material. Lubrication approximation theory (LAT) is used to simplify the equations of motion. To investigate the nature of the steady solutions that exist, we use a combination of exact solutions and numerical methods. Expressions for the velocity profile, volumetric flow rate, and pressure gradient are found as exact solutions. Furthermore, quantities of interest from an engineering point of view, such as the pressure distribution, roll-separating force, and power transmitted to the fluid by the rolls, are also computed. Some results are shown graphically, while others are given in tabulated form. It is found that the non-Newtonian parameter and the Reynolds number serve as the controlling parameters for the calendering process. Keywords: calendering, exact solutions, lubrication approximation theory, numerical solutions, pseudoplastic material
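For orientation, the lubrication approximation reduces the momentum balance in the narrow gap between the rolls to a one-dimensional pressure-gradient equation. The standard power-law form below is a textbook sketch under assumed notation (half-gap h(x), consistency index m, power-law index n < 1 for a pseudoplastic fluid); it is not necessarily the paper's exact model or symbols.

```latex
% Lubrication approximation for a power-law (pseudoplastic) fluid between rolls:
\frac{dp}{dx}
  = \frac{\partial}{\partial y}
    \left( m \left| \frac{\partial u}{\partial y} \right|^{\,n-1}
           \frac{\partial u}{\partial y} \right),
\qquad n < 1,
\qquad
Q = \int_{-h(x)}^{\,h(x)} u(x,y)\, \mathrm{d}y = \text{const.}
```

Integrating this balance twice in y, with no-slip at the roll surfaces and the constant flow rate Q, yields the exact velocity-profile and pressure-gradient expressions the abstract refers to.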
Procedia PDF Downloads 149
3114 An Efficient Emitting Supramolecular Material Derived from Calixarene: Synthesis, Optical and Electrochemical Features
Authors: Serkan Sayin, Songul F. Varol
Abstract:
Organic light-emitting diodes (OLEDs) have attracted great attention since their efficiency in flat-panel displays and solid-state lighting was realized. Because of their highly efficient electroluminescence, their brightness, and their broad emission range, OLEDs have been preferred over materials based on liquid crystals. Calixarenes, obtained from the reaction of p-tert-butylphenol and formaldehyde in a suitable base, have found potential use in various research areas such as catalysis, enzyme immobilization and applications, ion carriers, sensors, and nanoscience. In addition, their versatile frameworks, as well as their easy functionalization, make them effective candidates in applied chemistry. Herein, a calix[4]arene derivative has been synthesized, and its structure has been fully characterized using Fourier transform infrared spectroscopy (FTIR), proton nuclear magnetic resonance (¹H-NMR), carbon-13 nuclear magnetic resonance (¹³C-NMR), liquid chromatography-mass spectrometry (LC-MS), and elemental analysis. The calixarene derivative was employed as the emitting layer in the fabrication of organic light-emitting diodes, and the optical and electrochemical features of the calixarene-containing OLED (Clx-OLED) were also examined. The results showed that the Clx-OLED exhibited blue emission and high external quantum efficiency. In conclusion, the results indicate that the synthesized calixarene derivative is a promising chromophore with an efficient fluorescence quantum yield, making it an attractive candidate for fabricating effective materials for fluorescent probes and labeling studies. This study was financially supported by the Scientific and Technological Research Council of Turkey (TUBITAK Grant no. 117Z402). Keywords: calixarene, OLED, supramolecular chemistry, synthesis
Procedia PDF Downloads 254
3113 System Identification of Building Structures with Continuous Modeling
Authors: Ruichong Zhang, Fadi Sawaged, Lotfi Gargab
Abstract:
This paper introduces a wave-based approach for system identification of high-rise building structures from a pair of seismic recordings, which can be used to evaluate structural integrity and detect damage in post-earthquake structural condition assessment. The approach is founded on wave features of the generalized impulse and frequency response functions (GIRF and GFRF), i.e., the wave responses at one structural location to an impulsive motion at another, reference location, in the time and frequency domains respectively. Given a pair of seismic recordings at the two locations, the GFRF is obtained as the Fourier spectral ratio of the two recordings, and the GIRF is then found by inverse Fourier transformation of the GFRF. With an appropriate continuous model for the structure, a closed-form solution for the GFRF, and subsequently the GIRF, can also be found in terms of wave transmission and reflection coefficients, which are related to the structural physical properties above the impulse location. Matching the two sets of GFRF and/or GIRF, from the recordings and from the model, helps identify structural parameters such as wave velocity or shear modulus. For illustration, this study examines the ten-story Millikan Library in Pasadena, California, with recordings of the Yorba Linda earthquake of September 3, 2002. The building is modelled as piecewise continuous layers, from which the GFRF is derived as a function of building parameters such as impedance, cross-sectional area, and damping. The GIRF can then be found in closed form in some special cases and numerically in general. Not only does this study reveal how the building parameters influence the wave features of the GIRF and GFRF, it also shows system-identification results that are consistent with other vibration- and wave-based results. Finally, the paper discusses the effectiveness of the proposed model for system identification. Keywords: wave-based approach, seismic responses of buildings, wave propagation in structures, construction
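The recording-side computation described above (GFRF as a spectral ratio, GIRF as its inverse transform) can be sketched directly. This is a hedged, dependency-free illustration: a naive O(n²) DFT stands in for the FFT, and the water-level regularization `eps` guarding against near-zero reference spectra is our assumption, not part of the paper's formulation.

```python
# GFRF = spectral ratio of two recordings; GIRF = inverse DFT of the GFRF.
import cmath

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(spec):
    n = len(spec)
    return [sum(spec[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)) / n for t in range(n)]

def gfrf(response, reference, eps=1e-6):
    """Generalized frequency response function: ratio of the two spectra."""
    R, S = dft(response), dft(reference)
    return [r / (s if abs(s) > eps else eps) for r, s in zip(R, S)]

def girf(response, reference, eps=1e-6):
    """Generalized impulse response function: inverse transform of the GFRF."""
    return [v.real for v in idft(gfrf(response, reference, eps))]
```

As a sanity check, if the response equals the reference the GFRF is unity at every frequency and the GIRF is a unit impulse at zero lag, consistent with the interpretation of GIRF as the response to an impulsive motion at the reference location.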
Procedia PDF Downloads 234
3112 Anton Bruckner’s Requiem in Dm: The Reinterpretation of a Liturgical Genre in the Viennese Romantic Context
Authors: Sara Ramos Contioso
Abstract:
The premiere of Anton Bruckner's Requiem in D minor, in September 1849, represents a turning point in the composer's creative evolution. This Mass for the Dead, dedicated to the memory of his esteemed friend and mentor Franz Sailer, marks the beginning of a new creative aesthetic in the composer's production and links its liturgical development, contextualized in the monastery of St. Florian, to a range of musical possibilities that Bruckner projects onto an orchestral texture with choir and organ. Set on a strict Tridentine ritual model, this requiem exemplifies the religious aesthetic of a composer committed to the Catholic faith, and it links to its structure the reinterpretation of a religious model that, despite being Romantic, shows a strong influence derived from the Baroque and the language of Viennese Classicism. Consequently, the study responds to the need to show the survival of the Requiem Mass within the Romantic context of Vienna. It therefore draws on a detailed analysis of the score and of the composer's creative context, with the intention of linking the work to the tradition of the genre and of specifying the stylistic particularities of its musical model within a variety of possibilities, such as the contrasting precedents of the requiems of Mozart, Haydn, Cherubini, and Berlioz. Tradition or modernity, liturgy or concert hall: these aesthetic references conditioned the development of the Requiem Mass in the middle of the nineteenth century. In this context, this paper seeks to recover Bruckner's Requiem in D minor as a musical model of the Romantic ritual for the deceased and as a stylistic reference that would condition the development of later liturgical works, such as those of Liszt or de Lange (1868). Keywords: liturgy, religious symbolism, requiem, romanticism
Procedia PDF Downloads 339
3111 The Representation of Migrants in the UK and Saudi Arabia Press: A Cross-Linguistic Discourse Analysis Study
Authors: Eman Alatawi
Abstract:
The world is currently experiencing an upsurge in the number of international migrants, which has reached 281 million worldwide; in particular, both the UK and Saudi Arabia have recently been faced with an unprecedented number of immigrants. As a result, the media in these two countries is constantly posting news about the issue, and newspapers, in particular, play a vital role in shaping the public’s view of immigration issues. Because the media is an influential tool in society, it has the ability to construct a specific image of migrants and influence public opinion concerning immigrant groups. However, most of the existing studies have addressed the plight of migrants in the UK, Europe, and the US, and few have considered the Middle East; specifically, there is a pressing need for studies that focus on the press in Saudi Arabia, which is one of the main countries that is experiencing immigration at a tremendous rate. This paper employs critical discourse analysis (CDA) to examine the depiction of migrants in the British and Saudi Arabian media in order to explore the involvement of three linguistic features in the media’s representation of migrant-related topics. These linguistic features are the names, metaphors, and collocations that the press in the UK and in Saudi Arabia uses to describe migrants; the impact of these depictions is also considered. This comparative study could create a better understanding of how the Saudi Arabian press presents the topic of migrants and immigration, which will assist in extending the understanding of migration discourses beyond an Anglo-centric viewpoint. 
The main finding of this study was that both British and Saudi Arabian newspapers tended to represent migrants’ issues by painting migrants in a negative light through the use of negative references or names, metaphors, and collocations; furthermore, the media’s negative stereotyping of migrants was found to be consistent, which could have an influence on the public’s opinion of these minority groups. Such observations show that the issue is not as simple as individuals, press systems, or political affiliations. Keywords: representation, migrants, the UK press, Saudi Arabia press, cross-linguistic, discourse analysis
Procedia PDF Downloads 81