Search results for: sensory processing patterns
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6740

2180 Information Theoretic Approach for Beamforming in Wireless Communications

Authors: Syed Khurram Mahmud, Athar Naveed, Shoaib Arif

Abstract:

Beamforming is a signal processing technique used extensively in wireless communications and radar to intensify a desired signal and minimize interference through spatial selectivity. In this paper, we present a method for calculating optimal weight vectors for a smart antenna array, to achieve a directive pattern during transmission and selective reception in an interference-prone environment. In the proposed scheme, Mutual Information (MI) extrema are evaluated through an energy-constrained objective function based on a-priori information about the interference source and the desired array factor. Signal to Interference plus Noise Ratio (SINR) performance is evaluated for both transmission and reception. MI is presented as an index for identifying the trade-off between information gain, SINR, illumination time, and spatial selectivity in an energy-constrained optimization problem. The method yields lower computational complexity, as shown through comparative analysis with conventional methods. MI-based beamforming enhances signal integrity in degraded environments while reducing computational intricacy and correlating key performance indicators.
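
As a rough illustration of interference-suppressing weight computation for a smart antenna array, the sketch below builds MVDR-style weights for a uniform linear array. All values (array size, directions, noise power) are hypothetical, and this is a standard textbook formulation, not the authors' MI-based objective:

```python
import numpy as np

def steering(theta_deg, n=8, d=0.5):
    # Steering vector of an n-element uniform linear array, spacing d (wavelengths)
    k = np.arange(n)
    return np.exp(2j * np.pi * d * k * np.sin(np.deg2rad(theta_deg)))

n = 8
a_s = steering(20.0, n)    # desired-signal direction (illustrative)
a_i = steering(-40.0, n)   # interference direction (illustrative)
sigma_n2 = 0.1             # noise power (illustrative)

# Interference-plus-noise covariance; MVDR-style weight vector
R_in = np.outer(a_i, a_i.conj()) + sigma_n2 * np.eye(n)
w = np.linalg.solve(R_in, a_s)
w = w / (a_s.conj() @ w)   # distortionless (unit) response toward the signal

# Output SINR for a unit-power desired signal
sinr = np.abs(w.conj() @ a_s) ** 2 / np.real(w.conj() @ R_in @ w)
```

The weights keep unit gain toward the desired direction while placing a deep null on the interferer, which is the spatial-selectivity trade-off the abstract quantifies via MI.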

Keywords: beamforming, interference, mutual information, wireless communications

Procedia PDF Downloads 273
2179 Predicting Open Chromatin Regions in Cell-Free DNA Whole Genome Sequencing Data by Correlation Clustering  

Authors: Fahimeh Palizban, Farshad Noravesh, Amir Hossein Saeidian, Mahya Mehrmohamadi

Abstract:

In the recent decade, the emergence of liquid biopsy has significantly improved cancer monitoring and detection. Dying cells, including those originating from tumors, shed their DNA into the blood and contribute to a pool of circulating fragments called cell-free DNA (cfDNA). Accordingly, identifying the tissue of origin of these DNA fragments from plasma can enable faster, more accurate disease diagnosis and more precise treatment protocols. Open chromatin regions (OCRs) are important epigenetic features of DNA that reflect the cell type of origin. Profiling these features by DNase-seq, ATAC-seq, and histone ChIP-seq provides insights into tissue-specific and disease-specific regulatory mechanisms. Several studies in cancer liquid biopsy have integrated distinct genomic and epigenomic features for early cancer detection along with tissue-of-origin detection. However, multimodal analysis requires several types of experiments to cover the genomic and epigenomic aspects of a single sample, which incurs substantial cost and time. To overcome these limitations, predicting OCRs directly from whole genome sequencing (WGS) data is of particular importance. In this regard, we propose a computational approach to predict open chromatin regions, an important epigenetic feature, from cell-free DNA whole genome sequencing data. To this end, local sequencing depth is fed to our proposed algorithm, and the most probable open chromatin regions are predicted from the whole genome sequencing data. Our method integrates signal processing with sequencing depth data and includes count normalization, Discrete Fourier Transform conversion, graph construction, graph-cut optimization by linear programming, and clustering.
To validate the proposed method, we compared the output of the clustering (open chromatin region+, open chromatin region-) with previously validated open chromatin regions from human blood samples in the ATAC-DB database. The overlap between predicted open chromatin regions and the experimentally validated regions obtained by ATAC-seq in ATAC-DB is greater than 67%, which indicates a meaningful prediction. OCRs are mostly located at the transcription start sites (TSS) of genes. Accordingly, we compared the concordance between the predicted OCRs and human TSS regions obtained from refTSS, which showed agreement of around 52.04% with all genes and ~78% with housekeeping genes. Accurately detecting open chromatin regions from plasma cell-free DNA-seq data is a very challenging computational problem due to several confounding factors, such as technical and biological variations. Although this approach is in its infancy, there has already been an attempt to apply it, leading to a tool named OCRDetector, which has some restrictions, such as the need for high-depth cfDNA WGS data, prior information about the OCR distribution, and reliance on multiple features. In contrast, we implemented graph signal clustering based on a single depth feature in an unsupervised manner, which resulted in faster performance and decent accuracy. Overall, we investigated the epigenomic pattern of a cell-free DNA sample from a new computational perspective that can be used alongside other tools to examine the genetic and epigenetic aspects of a single whole genome sequencing dataset for efficient liquid biopsy-related analyses.
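
A heavily simplified sketch of the depth-based pipeline (count normalization, DFT-based low-pass conversion, two-way region labeling) is shown below on synthetic coverage data. The planted dips, window sizes, and thresholding stand in for the paper's graph-cut and clustering steps, which are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy depth profile: Poisson coverage with two planted low-coverage "open" windows
depth = rng.poisson(30, 1000).astype(float)
depth[200:260] -= 12
depth[600:680] -= 12

# 1) Count normalization (z-scores of local sequencing depth)
z = (depth - depth.mean()) / depth.std()

# 2) Discrete Fourier Transform conversion: keep only low-frequency components
spec = np.fft.rfft(z)
spec[25:] = 0                      # illustrative cutoff
smooth = np.fft.irfft(spec, n=z.size)

# 3) Two-way labeling of the smoothed signal (stand-in for graph-cut clustering):
#    1 = predicted open chromatin region (OCR+), 0 = OCR-
labels = (smooth < smooth.mean() - smooth.std()).astype(int)
```

On this toy signal, the low-pass step suppresses per-position noise so the planted low-coverage windows separate cleanly from the background, which is the intuition behind using a single depth feature.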

Keywords: open chromatin regions, cancer, cell-free DNA, epigenomics, graph signal processing, correlation clustering

Procedia PDF Downloads 143
2178 Integrated Risk Assessment of Storm Surge and Climate Change for the Coastal Infrastructure

Authors: Sergey V. Vinogradov

Abstract:

Coastal communities are presently facing increased vulnerabilities due to rising sea levels and shifts in global climate patterns, a trend expected to escalate in the long run. To address the needs of government entities, the public sector, and private enterprises, there is an urgent need to thoroughly investigate, assess, and manage the present and projected risks associated with coastal flooding, including storm surges, sea level rise, and nuisance flooding. In response to these challenges, a practical approach to evaluating storm surge inundation risks has been developed. This methodology offers an integrated assessment of potential flood risk in targeted coastal areas. The physical modeling framework involves simulating synthetic storms and utilizing hydrodynamic models that align with projected future climate and ocean conditions. Both publicly available and site-specific data form the basis for a risk assessment methodology designed to translate inundation model outputs into statistically significant projections of expected financial and operational consequences. This integrated approach produces measurable indicators of impacts stemming from floods, encompassing economic and other dimensions. By establishing connections between the frequency of modeled flood events and their consequences across a spectrum of potential future climate conditions, our methodology generates probabilistic risk assessments. These assessments not only account for future uncertainty but also yield comparable metrics, such as expected annual losses for each inundation event. These metrics furnish stakeholders with a dependable dataset to guide strategic planning and inform investments in mitigation. Importantly, the model's adaptability ensures its relevance across diverse coastal environments, even in instances where site-specific data for analysis may be limited.
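
The expected-annual-loss metric described above can be sketched as a frequency-weighted sum over modeled inundation events. The event set, annualized frequencies, and loss figures below are entirely hypothetical placeholders for outputs of the hydrodynamic and consequence models:

```python
# Hypothetical synthetic-storm catalog: annualized frequency (events/yr)
# and modeled financial consequence per event (illustrative values only)
events = [
    {"name": "10-yr surge",  "freq": 0.10, "loss": 1.2e6},
    {"name": "50-yr surge",  "freq": 0.02, "loss": 8.5e6},
    {"name": "100-yr surge", "freq": 0.01, "loss": 2.0e7},
]

# Expected annual loss: sum of frequency-weighted consequences
eal = sum(e["freq"] * e["loss"] for e in events)
print(f"Expected annual loss: ${eal:,.0f}")
```

Repeating this calculation under each projected climate scenario yields the comparable probabilistic metrics the abstract describes.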

Keywords: climate, coastal, surge, risk

Procedia PDF Downloads 50
2177 Enhancement of Long Term Peak Demand Forecast in Peninsular Malaysia Using Hourly Load Profile

Authors: Nazaitul Idya Hamzah, Muhammad Syafiq Mazli, Maszatul Akmar Mustafa

Abstract:

The peak demand forecast is crucial for identifying the future generation capacity needed in long-term capacity planning analysis for Peninsular Malaysia, as well as for transmission and distribution network planning activities. Currently, the peak demand forecast (in megawatts) is derived from the generation forecast by using a load factor assumption. However, a forecast using this method has underperformed due to structural changes in the economy, emerging trends, and weather uncertainty. The dynamic changes in these drivers will result in many possible outcomes of peak demand for Peninsular Malaysia. This paper looks into an independent model of peak demand forecasting. The model begins with the selection of driver variables to capture long-term growth. This selection and construction of variables, which include econometric, emerging-trend, and energy variables, will have an impact on the peak forecast. The actual framework begins with the development of system energy and load shape forecasts using the system's hourly data. The shape forecast represents the system shape, assuming all embedded technology and use patterns continue into the future. This is necessary to identify movements in the peak hour or changes in the system load factor. The next step is developing the peak forecast, which involves an iterative process to explore model structures and variables. The final step is combining the system energy, shape, and peak forecasts into an hourly system forecast and then modifying it with forecast adjustments, which include sales forecasts for electric vehicles, solar, and other adjustments. The framework results in an hourly forecast that captures growth, peak usage, and new technologies. The advantage of this approach over the current methodology is that the peaks capture the impacts of new technologies that change the load shape.
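
The combination step (energy + shape + peak, then adjustments) can be sketched as below. The load shape, energy and peak figures, and flat EV adjustment are all illustrative placeholders, and real frameworks iterate between the energy and peak calibrations rather than applying them once:

```python
import numpy as np

hours = 8760
# Hypothetical normalized hourly load-shape forecast (a real one comes from
# historical hourly system data)
shape = 0.7 + 0.3 * np.sin(np.linspace(0, 2 * np.pi * 365, hours)) ** 2

energy_forecast_gwh = 130_000.0  # illustrative annual system energy forecast
peak_forecast_mw = 19_500.0      # illustrative independent peak forecast

# 1) Scale the shape so hourly values sum to the energy forecast (MWh ~ MW per hour)
hourly = shape / shape.sum() * energy_forecast_gwh * 1000.0

# 2) Calibrate so the maximum hour matches the independent peak forecast
hourly *= peak_forecast_mw / hourly.max()

# 3) Layer on forecast adjustments (e.g., electric vehicle charging load)
ev_adjustment_mw = np.full(hours, 50.0)  # flat 50 MW, illustrative only
hourly_final = hourly + ev_adjustment_mw
```

The adjustment layers are what let the resulting peaks reflect new-technology impacts that change the load shape.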

Keywords: hourly load profile, load forecasting, long term peak demand forecasting, peak demand

Procedia PDF Downloads 159
2176 The Use of Space Syntax in Urban Transportation Planning and Evaluation: Limits and Potentials

Authors: Chuan Yang, Jing Bie, Yueh-Lung Lin, Zhong Wang

Abstract:

Transportation planning is an integrative academic discipline combining research and practice, with the aim of improving mobility and accessibility at both the strategic policy-making level and the operational dimensions of practical planning. Transportation planning can build the linkage between traffic and social development goals, for instance, economic benefits and environmental sustainability. Transportation planning analysis and evaluation tend to apply empirical quantitative approaches guided by fundamental principles such as efficiency, equity, safety, and sustainability. Space syntax theory has been applied to the spatial distribution of pedestrian movement and vehicle flow analysis; however, little has been written about its application in transportation planning. Correlations between space syntax variables and real-world observations have shown that urban configuration has a significant effect on urban dynamics, for instance, land value, building density, traffic, and crime. This research aims to explore the potential of applying space syntax methodology to evaluate urban transportation planning by studying the effects of urban configuration on cities' transportation performance. Through a literature review, this paper discusses the effects that urban configurations with different degrees of integration and accessibility have on three elementary components of transportation planning - transportation efficiency, transportation safety, and economic agglomeration development - via intensifying and stabilising the natural movement generated by the street network. The potential and limits of space syntax theory for studying the performance of urban transportation and transportation planning are then discussed.
In practical terms, this research will help future research explore the effects of urban design on transportation performance and identify which patterns of urban street networks allow for the most efficient and safe transportation performance with higher economic benefits.

Keywords: transportation planning, space syntax, economic agglomeration, transportation efficiency, transportation safety

Procedia PDF Downloads 188
2175 Non-Contact Measurement of Soil Deformation in a Cyclic Triaxial Test

Authors: Erica Elice Uy, Toshihiro Noda, Kentaro Nakai, Jonathan Dungca

Abstract:

Deformation in a conventional cyclic triaxial test is normally measured using a point-wise measuring device. In this study, a non-contact measurement technique was applied to monitor and measure the occurrence of non-homogeneous behavior of the soil under cyclic loading. Non-contact measurement is executed through image processing. Two-dimensional measurements were performed using the Lucas-Kanade optical flow algorithm, implemented in LabVIEW. In this technique, the non-homogeneous deformation was monitored using a mirrorless camera, chosen because it is economical and can take pictures at a fast rate. The camera was first calibrated to remove the distortion introduced by the lens and by the testing environment. Calibration was divided into two phases. The first phase was the calibration of the camera parameters and the distortion caused by the lens. The second phase was for eliminating the distortion introduced by the triaxial plexiglass; a correction factor was established from this phase. A series of consolidated undrained cyclic triaxial tests was performed on a coarse soil. The results from the non-contact measurement technique were compared to the deformation measured by the linear variable displacement transducer. It was observed that deformation was higher in the area where failure occurs.
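
The core of the Lucas-Kanade step is a small least-squares fit of per-pixel gradients to the frame-to-frame intensity change. The minimal single-window sketch below (pure NumPy, synthetic frames, window size and test point chosen arbitrarily) illustrates the idea, not the authors' LabVIEW implementation:

```python
import numpy as np

def lucas_kanade(img1, img2, x, y, win=7):
    """Estimate (dx, dy) displacement at pixel (x, y) between two grayscale
    frames via the classic Lucas-Kanade least-squares solution over a window."""
    h = win // 2
    Ix = np.gradient(img1, axis=1)   # spatial gradient, x direction
    Iy = np.gradient(img1, axis=0)   # spatial gradient, y direction
    It = img2 - img1                 # temporal gradient
    sl = (slice(y - h, y + h + 1), slice(x - h, x + h + 1))
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
    b = -It[sl].ravel()
    (dx, dy), *_ = np.linalg.lstsq(A, b, rcond=None)
    return dx, dy

# Toy frames: a smooth bright spot shifted 1 px to the right
yy, xx = np.mgrid[0:64, 0:64].astype(float)
frame1 = np.exp(-((xx - 32) ** 2 + (yy - 32) ** 2) / 50)
frame2 = np.exp(-((xx - 33) ** 2 + (yy - 32) ** 2) / 50)
dx, dy = lucas_kanade(frame1, frame2, 30, 32)
```

Tracking many such windows on the specimen surface yields the two-dimensional deformation field that reveals non-homogeneous behavior.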

Keywords: cyclic loading, non-contact measurement, non-homogeneous, optical flow

Procedia PDF Downloads 297
2174 Stabilizing Additively Manufactured Superalloys at High Temperatures

Authors: Keivan Davami, Michael Munther, Lloyd Hackel

Abstract:

The control of properties and material behavior by implementing thermal-mechanical processes is based on mechanical deformation and annealing according to a precise schedule that will produce a unique and stable combination of grain structure, dislocation substructure, texture, and dispersion of precipitated phases. The authors recently developed a thermal-mechanical technique to stabilize the microstructure of additively manufactured nickel-based superalloys even after exposure to high temperatures. However, the mechanism(s) that controls this stability is still under investigation. Laser peening (LP), also called laser shock peening (LSP), is a shock based (50 ns duration) post-processing technique used for extending performance levels and improving service life of critical components by developing deep levels of plastic deformation, thereby generating high density of dislocations and inducing compressive residual stresses in the surface and deep subsurface of components. These compressive residual stresses are usually accompanied with an increase in hardness and enhance the material’s resistance to surface-related failures such as creep, fatigue, contact damage, and stress corrosion cracking. While the LP process enhances the life span and durability of the material, the induced compressive residual stresses relax at high temperatures (>0.5Tm, where Tm is the absolute melting temperature), limiting the applicability of the technology. At temperatures above 0.5Tm, the compressive residual stresses relax, and yield strength begins to drop dramatically. The principal reason is the increasing rate of solid-state diffusion, which affects both the dislocations and the microstructural barriers. Dislocation configurations commonly recover by mechanisms such as climbing and recombining rapidly at high temperatures. 
Furthermore, precipitates coarsen and grains grow; virtually all of the available microstructural barriers become ineffective. Our results indicate that by using "cyclic" treatments with sequential LP and annealing steps, the compressive stresses survive, and the microstructure is stable after exposure to temperatures exceeding 0.5Tm for a long period of time. When the laser peening process is combined with annealing, dislocations formed as a result of LP and precipitates formed during annealing have a complex interaction that provides further stability at high temperatures. From a scientific point of view, this research lays the groundwork for studying a variety of physical, materials science, and mechanical engineering concepts. It could lead to metals operating at higher sustained temperatures, enabling improved system efficiencies. The strengthening of metals by a variety of means (alloying, work hardening, and other processes) has been of interest for a wide range of applications. However, the mechanistic understanding of the often complex interactions between dislocations, solute atoms, and precipitates during plastic deformation has largely remained scattered in the literature. In this research, the actual mechanisms involved in the novel cyclic LP/annealing processes are elucidated through parallel studies of dislocation theory and the implementation of advanced experimental tools. The results help validate a novel laser processing technique for high-temperature applications, greatly expanding the applications of laser peening technology, originally devised only for temperatures below half of the melting temperature.

Keywords: laser shock peening, mechanical properties, indentation, high temperature stability

Procedia PDF Downloads 143
2173 The Evolution of the Human Brain from the Hind Brain to the Fore Brain: Dialectics from the African Perspective in Understanding Stunted Development in Science and Technology

Authors: Philemon Wokoma Iyagba, Obey Onenee Christie

Abstract:

From the hindbrain, which is responsible for motor activities, to the forebrain, responsible for processing information related to complex cognitive activities, the human brain has continued to evolve over the years. This evolution has been progressive, leading to advancements in science and technology. However, the development of science and technology in Africa, where ancient civilization arguably began, has been retrogressive. Dialectics was applied by dissecting different opinions on the reasons behind the stunted development of science and technology in Africa. The researchers propose that the inability to sustain the technological advancements made by early Africans is due to the poor replicability of the African knowledge-based system; little or poor documentation of adopted procedures; and an approval-seeking mentality that cheaply paved the way for westernization, which in turn adulterated the African way of life and education without incorporating Africa's identity, properly aligning her rich cultural heritage in education, or acknowledging her enormous achievements before and during the Middle Ages. This article discusses conceptual issues, with its positions based on established facts; the discussion draws on relevant literature, and recommendations are made accordingly.

Keywords: forebrain, hindbrain, dialectics from African perspective, development in science and technology

Procedia PDF Downloads 74
2172 Non-Targeted Adversarial Image Classification Attack-Region Modification Methods

Authors: Bandar Alahmadi, Lethia Jackson

Abstract:

Machine learning models are used today in many real-life applications. The safety and security of such models are important so that their results remain as accurate as possible. One challenge in machine learning model security is the adversarial examples attack. Adversarial examples are designed by an attacker to cause a machine learning model to misclassify its input. We propose a method for generating adversarial examples that attack image classifiers. We modify successfully classified images so that a classifier misclassifies them after the modification. In our method, we do not update the whole image; instead, we detect the important region, modify it, place it back into the original image, and then run the result through a classifier. The algorithm modifies the detected region using two methods. First, it adds an abstract image matrix behind the detected image matrix. Then, it performs a rotation attack that rotates the detected region around its axes and embeds the trace of the image in the image background. Finally, the attacked region is placed back in its original position, and a smoothing filter is applied to blend the background with the foreground. We tested our method on a cascade classifier, and the algorithm proved efficient: the classifier's confidence dropped to almost zero. We also tried it on a CNN (convolutional neural network) with higher settings, and the algorithm worked successfully.
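
A minimal sketch of the region-modification idea (rotate a detected region in place, then smooth the seam) is shown below. The fixed bounding box, 90-degree rotation, and crude border smoothing are simplifying stand-ins; the paper's detection step, abstract-matrix overlay, and classifier evaluation are not reproduced:

```python
import numpy as np

def region_rotation_attack(img, box, k=1):
    """Rotate a square region of img in place (k * 90 degrees) and lightly
    smooth the seam - a simplified stand-in for the region-modification attack."""
    x0, y0, x1, y1 = box
    out = img.copy()
    region = out[y0:y1, x0:x1].copy()   # copy to avoid aliasing during rotation
    out[y0:y1, x0:x1] = np.rot90(region, k)
    # Crude seam smoothing: 3x3 box filter along the region's left/right edges
    for y in range(max(y0 - 1, 1), min(y1 + 1, img.shape[0] - 1)):
        for x in (x0, x1 - 1):
            out[y, x] = out[y - 1:y + 2, x - 1:x + 2].mean()
    return out

img = np.arange(64 * 64, dtype=float).reshape(64, 64)
adv = region_rotation_attack(img, (16, 16, 32, 32))
```

Because only the detected region changes, the perturbation stays localized while the rest of the image is untouched.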

Keywords: adversarial examples, attack, computer vision, image processing

Procedia PDF Downloads 333
2171 Denoising of Motor Unit Action Potential Based on Tunable Band-Pass Filter

Authors: Khalida S. Rijab, Mohammed E. Safi, Ayad A. Ibrahim

Abstract:

When electrical electrodes are mounted on the skin surface over a muscle, a signal is detected when the skeletal muscle undergoes contraction; this signal is known as the surface electromyographic (EMG) signal. It has a noise-like interference pattern resulting from the temporal and spatial summation of the action potentials (AP) of all active motor units (MU) near the detection electrode. By appropriate processing (decomposition), the surface EMG signal may be used to give an estimate of the motor unit action potential (MUAP). In this work, a denoising technique is applied to the MUAP signals extracted from the spatial filter (IB2). A set of signals from a non-invasive two-dimensional grid of 16 electrodes was recorded from subjects of different types, muscles, and sexes. These signals acquire noise during recording and detection. A digital fourth-order band-pass Butterworth filter is used for denoising, and a suitable choice of cutoff frequencies for the tunable band-pass filter is investigated. Results show that an improvement of 1-3 dB in the signal to noise ratio (SNR) was achieved, relative to the raw spatial filter output signals, for all cases under investigation. Furthermore, the research's goals also included estimating and reconstructing the mean shape of the MUAP.
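
A fourth-order Butterworth band-pass of the kind described can be sketched with SciPy on a synthetic MUAP-like burst. The sampling rate, cutoff pair, and noise level below are assumed values for illustration, not those tuned in the study:

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 4096.0                         # assumed sampling rate (Hz)
t = np.arange(0, 0.25, 1 / fs)
rng = np.random.default_rng(0)

# Synthetic MUAP-like burst (~150 Hz content) plus broadband noise
clean = np.exp(-((t - 0.125) ** 2) / (2 * 0.005 ** 2)) * np.sin(2 * np.pi * 150 * t)
noisy = clean + 0.3 * rng.standard_normal(t.size)

# Fourth-order Butterworth band-pass; the cutoff pair is the tunable choice
low, high = 20.0, 500.0
b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="bandpass")
denoised = filtfilt(b, a, noisy)    # zero-phase filtering preserves MUAP shape

def snr_db(ref, sig):
    return 10 * np.log10(np.sum(ref ** 2) / np.sum((sig - ref) ** 2))

gain = snr_db(clean, denoised) - snr_db(clean, noisy)
```

Sweeping the (low, high) pair and measuring the SNR gain is one way to search for the suitable band-pass frequencies the abstract refers to.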

Keywords: EMG, motor unit, digital filter, denoising

Procedia PDF Downloads 397
2169 Evaluation of the Incorporation of Modified Starch in Puff Pastry Dough by Mixolab Rheological Analysis

Authors: Alejandra Castillo-Arias, Carlos A. Fuenmayor, Carlos M. Zuluaga-Domínguez

Abstract:

The connection between health and nutrition has driven the food industry to explore healthier and more sustainable alternatives. Key strategies to enhance nutritional quality and extend shelf life include reducing saturated fats and incorporating natural ingredients. One area of focus is the use of modified starch in baked goods, which has attracted significant interest in food science and industry due to its functional benefits. Modified starches are commonly used for their gelling, thickening, and water-retention properties. Derived from sources like waxy corn, potatoes, tapioca, or rice, these polysaccharides improve the thermal stability and resistance of dough. The use of modified starch enhances the texture and structure of baked goods, which is crucial for consumer acceptance. In this study, the effects of modified starch inclusion on dough used for puff pastry elaboration were evaluated by Mixolab analysis. This technique assesses flour quality by examining its behavior under varying conditions, providing a comprehensive profile of its baking properties. The analysis included measurements of water absorption capacity, dough development time, dough stability, softening, final consistency, and starch gelatinization. Each of these parameters offers insights into how the flour will perform during baking and into the quality of the final product. The performance of wheat flour with varying levels of modified starch inclusion (10%, 20%, 30%, and 40%) was evaluated through Mixolab analysis, with a control sample consisting of 100% wheat flour. Water absorption, gluten content, and retrogradation indices were analyzed to understand how modified starch affects dough properties. The results showed that the inclusion of modified starch increased the absorption index, especially at levels above 30%, indicating a dough with better handling qualities and potentially improved texture in the final baked product.
However, the reduction in wheat flour resulted in a lower kneading index, affecting dough strength. Conversely, incorporating more than 20% modified starch reduced the retrogradation index, indicating improved stability and resistance to crystallization after cooling. Additionally, the modified starch improved the gluten index, contributing to better dough elasticity and stability, providing good structural support and resistance to deformation during mixing and baking. As expected, the control sample exhibited a higher amylase index, due to the presence of enzymes in wheat flour. However, this is of low concern in puff pastry dough, as amylase activity is more relevant in fermented doughs, which is not the case here. Overall, the use of modified starch in puff pastry enhanced product quality by improving texture, structure, and shelf life, particularly when used at levels between 30% and 40%. This research underscores the potential of modified starches to address health concerns associated with traditional starches and to contribute to the development of higher-quality, consumer-friendly baked products. Furthermore, the findings suggest that modified starches could play a pivotal role in future innovations within the baking industry, particularly in products aiming to balance healthfulness with sensory appeal. By incorporating modified starch into their formulations, bakeries can meet the growing demand for healthier, more sustainable products while maintaining the indulgent qualities that consumers expect from baked goods.

Keywords: baking quality, dough properties, modified starch, puff pastry

Procedia PDF Downloads 11
2169 Inflammatory Alleviation on Microglia Cells by an Apoptotic Mimicry

Authors: Yi-Feng Kao, Huey-Jine Chai, Chin-I Chang, Yi-Chen Chen, June-Ru Chen

Abstract:

Microglia are macrophages that reside in the brain, and overactive microglia may result in brain neuron damage or inflammation. In this study, phospholipids were extracted from squid skin and manufactured into a liposome (SQ liposome) to mimic an apoptotic body. We then evaluated the anti-inflammatory effects of SQ liposome on a mouse microglial cell line (BV-2) under lipopolysaccharide (LPS) induction. By HPLC-UV analysis, the major phospholipid constituents in the squid skin extract were 46.2% phosphatidylcholine, 18.4% phosphatidylethanolamine, 7.7% phosphatidylserine, 3.5% phosphatidylinositol, 4.9% lysophosphatidylcholine, and 19.3% other phospholipids. The contents of eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA) in the squid skin extract were 11.8% and 28.7%, respectively. Microscopic images showed that microglia cells can engulf apoptotic cells or SQ liposome. In cell-based studies, there was no cytotoxicity to BV-2 at SQ liposome concentrations below 2.5 mg/mL. The LPS-induced pro-inflammatory cytokines, including tumor necrosis factor-alpha (TNF-α) and interleukin-6 (IL-6), were significantly suppressed (P < 0.05) by pretreatment with 0.03-2.5 mg/mL SQ liposome. Conversely, secretion of the anti-inflammatory cytokines transforming growth factor-beta (TGF-β) and interleukin-10 (IL-10) was enhanced (P < 0.05). The results suggest that SQ liposome possesses anti-inflammatory properties on BV-2 and may be a good strategy against neuroinflammatory disease.

Keywords: apoptotic mimicry, neuroinflammation, microglia, squid processing by-products

Procedia PDF Downloads 473
2168 Anthropometric Data Variation within Gari-Frying Population

Authors: T. M. Samuel, O. O. Aremu, I. O. Ismaila, L. I. Onu, B. O. Adetifa, S. E. Adegbite, O. O. Olokoshe

Abstract:

The imperative of anthropometry in designing to fit cannot be overemphasized. Of essence is the variability of measurements among the population for which data are collected. In this paper, anthropometric data were collected for the design of a gari-frying facility, such that the work system would be designed to fit the gari-frying population in the Southwestern states of Nigeria, comprising Lagos, Ogun, Oyo, Osun, Ondo, and Ekiti. Twenty-seven body dimensions were measured among 120 gari-frying processors. Statistical analysis was performed using the SPSS package to determine the mean, standard deviation, minimum value, maximum value, and percentiles (2nd, 5th, 25th, 50th, 75th, 95th, and 98th) of the different anthropometric parameters. A one-sample t-test was conducted to determine the variation within the population. The 50th percentiles of some of the anthropometric parameters were compared with those from other populations in the literature. The correlation between the workers' age and body anthropometry was also investigated. The mean weight, height, shoulder height (sitting), eye height (standing), and eye height (sitting) are 63.37 kg, 1.57 m, 0.55 m, 1.45 m, and 0.67 m, respectively. Results also show a high correlation with other populations and a statistically significant difference in variability of data within the population in all the body dimensions measured. With a mean age of 42.36 years, results show that age would be a poor indicator for estimating the anthropometry of this population.
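
The percentile summary used for design-to-fit decisions can be sketched as below. The simulated stature sample (normal, mean 1.57 m) is a hypothetical stand-in for one of the 27 measured dimensions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(42)
# Illustrative stature sample (m) for 120 subjects; the real study measured
# 27 dimensions on 120 gari-frying processors
height = rng.normal(1.57, 0.06, 120)

pcts = [2, 5, 25, 50, 75, 95, 98]
summary = {
    "mean": height.mean(),
    "sd": height.std(ddof=1),
    "min": height.min(),
    "max": height.max(),
    **{f"p{p}": np.percentile(height, p) for p in pcts},
}
# Typical use: size a clearance for the 95th percentile user and a reach
# dimension for the 5th percentile user
```

Comparing the 50th-percentile values of such summaries across populations is the cross-population check the abstract describes.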

Keywords: anthropometry, cassava processing, design to fit, gari-frying, workstation design

Procedia PDF Downloads 249
2167 Marble Powder’s Effect on Permeability and Mechanical Properties of Concrete

Authors: Shams Ul Khaliq, Khan Shahzada, Bashir Alam, Fawad Bilal, Mushtaq Zeb, Faizan Akbar

Abstract:

The marble industry contributes its fair share to environmental deterioration, producing voluminous amounts of mud and other excess residues from marble and granite processing, polluting soil, water, and air. Reusing these residues in other products will not only prevent environmental pollution but also help the economy. In this research, an attempt has been made to study the expediency of waste marble powder (MP) in concrete production. Various laboratory tests were performed to investigate permeability and physical and mechanical properties, such as slump, compressive strength, and split tensile strength. Concrete test samples were fabricated with varying MP content (replacing 5-30% of cement), furnished from two different sources. Replacing 5% of cement with marble dust caused 6% and 12% decreases in compressive and tensile strength, respectively. These parameters gradually decreased with increasing MP content up to 30%. The most optimal results were obtained with 10% replacement. Improvements in consistency and permeability were noticed: permeability improved with increasing MP proportion up to 10% without a substantial decrease in compressive strength. The results revealed that MP as a partial alternative to cement in concrete production is a viable option, considering its economic and environmentally friendly implications.

Keywords: marble powder, strength, permeability, consistency, environment

Procedia PDF Downloads 324
2166 Genesis of Talc Bodies in Relation to the Mafic-Ultramafic Rocks around Wonu, Ibadan-Apomu Area, Southwestern Nigeria

Authors: Morenike Abimbola Adeleye, Anthony Temidayo Bolarinwa

Abstract:

The genesis of the talc bodies around Wonu, Ibadan-Apomu area, southwestern Nigeria, has been speculative due to inadequate compositional data on the talc and its mafic-ultramafic protoliths. Petrography, morphology (using the scanning electron microscope), mineral chemistry, X-ray diffraction, and major, trace, and rare-earth element analyses of the talc and the mafic-ultramafic rocks in the area were undertaken with a view to determining the genesis of the talc bodies. Fine-grained amphibolite and lherzolite are the major mafic-ultramafic rocks in the study area. The amphibolite is composed of amphiboles, pyroxenes, plagioclase, K-feldspar, ilmenite, magnetite, and garnet, while the lherzolite and talc are composed of olivines, pyroxenes, amphiboles, and plagioclase. Alteration minerals include serpentine, amesite, talc, Cr-bearing clinochlore, and ferritchromite; Cr-spinel, pyrite, and magnetite are the accessory minerals present. Alteration of olivines, pyroxenes, and amphiboles to talc and clinochlore, and of spinel to ferritchromite, by hydrothermal (H₂O-CO₂-Cl-HF) fluids provided by the granitic intrusions in the area, indicates retrograde metasomatism of the amphibolites to greenschist facies at 500-550ºC. This led to the formation of talc, amesite, anthophyllite, actinolite, and tremolite. The Al₂O₃-Fe₂O₃+TiO₂-MgO discrimination diagram suggests a tholeiitic protolith for the amphibolite and a komatiitic protolith for the lherzolite. The lherzolite has flat rare-earth element patterns typical of komatiites and dunites. The Al₂O₃/TiO₂ ratios and the Ce/Nb vs. Th/Nb, Cr-TiO₂, TiO₂ vs. Al₂O₃, and Nd vs. Nb discrimination diagrams indicate that the talc bodies derive from two parent sources: altered metacarbonates, and tholeiitic (amphibolite) to komatiitic (lherzolite) basalts.

Keywords: amphibolites, lherzolites, talc, komatiite

Procedia PDF Downloads 210
2165 Mobile Devices and E-Learning Systems as a Cost-Effective Alternative for Digitizing Paper Quizzes and Questionnaires in Social Work

Authors: K. Myška, L. Pilařová

Abstract:

The article deals with the possibilities of using cheap mobile devices in combination with free or open-source software tools as an alternative to professional hardware and software equipment. Especially in social work, it is important to find a cheap yet functional solution that can compete with complex but expensive solutions for digitizing paper materials. Our research focused on the analysis of cheap and affordable solutions for digitizing the paper materials most frequently used by field workers in social work, using comparative analysis as the research method. Social workers need to process data from paper forms quite often; in many cases it is still more affordable, time-efficient, and cost-effective to use paper forms to collect feedback. Collecting data from paper quizzes and questionnaires can be done with professional scanners and software, which are very powerful and have advanced options for digitizing and processing the digitized data, but are also very expensive. According to the results of our study, the combination of open-source software with a mobile phone or a cheap scanner can be considered a cost-effective alternative to professional equipment.
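For the form-digitizing step itself, the core of a low-cost optical-mark-recognition pipeline is simply counting dark pixels inside known answer regions. The following sketch, with a toy binarized scan and hypothetical box coordinates (not tied to any particular tool mentioned in the article), illustrates the idea:

```python
import numpy as np

def mark_filled(scan, box, threshold=0.3):
    """Return True if a checkbox region on the form is filled.

    scan: 2-D array of 0/1 pixels (1 = dark ink), e.g. a binarized photo
    box:  (row, col, height, width) of the checkbox on the form
    threshold: fraction of dark pixels that counts as 'checked'
    """
    r, c, h, w = box
    region = scan[r:r + h, c:c + w]
    return region.mean() >= threshold

# Synthetic 100x100 'scan': one filled box, one empty box
scan = np.zeros((100, 100), dtype=int)
scan[10:20, 10:20] = 1                      # respondent filled this box

print(mark_filled(scan, (10, 10, 10, 10)))  # filled
print(mark_filled(scan, (10, 40, 10, 10)))  # empty
```

On a phone photo rather than a flatbed scan, a perspective correction and adaptive thresholding step would precede this, but the per-box decision stays the same.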

Keywords: digitalization, e-learning, mobile devices, questionnaire

Procedia PDF Downloads 148
2164 Exploring White-Matter Hyperintensities in Patients with Psychiatric Disorders and Their Clinical Relevance

Authors: Ubaid Ullah Kamgar, Ajaz Ahmed Suhaff, Mohammad Maqbool Dar

Abstract:

Objective: The aim is to study the association of T₂/FLAIR white-matter hyperintensities on MRI among patients with psychiatric disorders. Background and Rationale: MRI findings in psychiatric disorders can vary widely depending on the specific disorder and individual differences. However, some general patterns have been observed: in Depression, reduced volume in areas such as the prefrontal cortex and hippocampus; in Schizophrenia, enlarged ventricles and abnormalities in the frontal and temporal lobes, as well as the hippocampus and thalamus; in Bipolar Disorder, reduced volume in the prefrontal cortex and hippocampus and abnormalities in the amygdala; in OCD, abnormalities in the orbitofrontal cortex, anterior cingulate cortex, and striatum. However, many patients show white-matter hyperintensities, which are usually considered non-specific in psychiatry. These hyperintensities appear as areas of increased T₂/FLAIR signal in the deep white matter. The pathogenic mechanisms of white-matter hyperintensities are not well understood and have been attributed to cerebral small vessel disease. The aim of the study is to examine the association of these MRI findings with psychiatric disorders after ruling out neurological disorders (if any are found). Methodology: Patients admitted to psychiatric hospitals or presenting to OPDs with underlying psychiatric disorders, who had undergone brain MRI as part of their investigations and had T₂/FLAIR white-matter hyperintensities, were recruited to study the association of these MRI findings with different psychiatric disorders. Results: Of the 22 patients with T₂/FLAIR white-matter hyperintensities on MRI, the underlying psychiatric disorders were: Major Depressive Disorder in 7 patients; Obsessive Compulsive Disorder in 5 patients; Bipolar Disorder in 5 patients; and Dementia (vascular type) in 5 patients.
Discussion and conclusion: In our study, white-matter hyperintensities were found mostly in MDD (31.8%), followed by OCD (22.7%), Bipolar Disorder (22.7%), and Dementia (22.7%). In conclusion, the presence of white-matter hyperintensities in psychiatric disorders underscores the complex interplay between vascular, neurobiological, and psychosocial factors. Further research with a larger sample size is needed to fully elucidate their clinical significance.

Keywords: white-matter hyperintensities, OCD, MDD, dementia, bipolar disorder

Procedia PDF Downloads 53
2163 Investigation of the Growth Kinetics of Phases in Ni–Sn System

Authors: Varun A Baheti, Sanjay Kashyap, Kamanio Chattopadhyay, Praveen Kumar, Aloke Paul

Abstract:

The Ni–Sn system finds applications in the microelectronics industry, especially with respect to flip-chip or direct chip attach technology. Here the region of interest is the under bump metallization (UBM)/solder bump (Sn) interface, due to the formation of brittle intermetallic phases there. Understanding the growth of these phases at the UBM/Sn interface is important, as in many cases it controls the electro-mechanical properties of the product. Cu and Ni are the commonly used UBM materials: Cu is used for good bonding because of its fast reaction with solder, while Ni often acts as a diffusion barrier layer due to its inherently slower reaction kinetics with Sn-based solders. An investigation of the growth kinetics of phases in the Ni–Sn system is reported in this study. For simplicity, Sn, the major solder constituent, is chosen. Ni–Sn electroplated diffusion couples were prepared by electroplating pure Sn on a Ni substrate. Bulk diffusion couples prepared by the conventional method were also studied along with the Ni–Sn electroplated couples. The diffusion couples were annealed for 25–1000 h at 50–215°C to study the phase evolution and growth kinetics of the various phases. The interdiffusion zone was imaged using a field emission gun equipped scanning electron microscope (FE–SEM). Indexing of selected area diffraction (SAD) patterns obtained from a transmission electron microscope (TEM) and composition measurements made in an electron probe micro−analyser (FE–EPMA) confirm the presence of the various product phases grown across the interdiffusion zone. Time-dependent experiments indicate diffusion-controlled growth of the product phase. The activation energy estimated in the temperature range 125–215°C for the parabolic growth constants (and hence the integrated interdiffusion coefficients) of the Ni₃Sn₄ phase sheds light on the growth mechanism of the phase, namely whether it is grain-boundary-controlled or lattice-controlled diffusion.
The location of the Kirkendall marker plane indicates that the Ni₃Sn₄ phase grows mainly by diffusion of Sn in the binary Ni–Sn system.
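The activation-energy estimate mentioned above follows from the Arrhenius dependence of the parabolic growth constant, k = k₀ exp(−Q/RT). A minimal sketch of such a fit, with illustrative rather than measured values of k, might look like:

```python
import numpy as np

R = 8.314  # gas constant, J/(mol*K)

# Illustrative parabolic growth constants k (m^2/s) at several annealing
# temperatures (deg C); these values are made up, not the study's data.
T_c = np.array([125.0, 150.0, 175.0, 200.0, 215.0])
k = np.array([2.0e-18, 8.0e-18, 2.8e-17, 9.0e-17, 1.7e-16])

# Arrhenius form: ln k = ln k0 - Q/(R*T)  ->  linear in 1/T
T_k = T_c + 273.15
slope, intercept = np.polyfit(1.0 / T_k, np.log(k), 1)

Q = -slope * R          # activation energy, J/mol
k0 = np.exp(intercept)  # pre-exponential factor, m^2/s

print(f"Q = {Q / 1000:.1f} kJ/mol, k0 = {k0:.3e} m^2/s")
```

Comparing the fitted Q against typical lattice- and grain-boundary-diffusion activation energies is what lets one discriminate between the two growth mechanisms the abstract mentions.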

Keywords: diffusion, equilibrium phase, metastable phase, the Ni-Sn system

Procedia PDF Downloads 300
2162 Spatial Element Importance and Its Relation to Characters’ Emotions and Self-Awareness in Michela Murgia’s Collection of Short Stories Tre Ciotole. Rituali per Un Anno Di Crisi

Authors: Nikica Mihaljević

Abstract:

Published in 2023, "Tre ciotole. Rituali per un anno di crisi" is a collection of short stories that are disconnected from one another in topic and in the representation of characters, yet complete and, in a particular way, continue each other. The book happens to be Murgia's last, as the author died a few months after its publication, and it reads as a kind of summary of all her previous literary works. In her previous publications, Murgia had already stressed certain particularities of characters, such as solitude and alienation from others, which are at the center of attention in this work too. What all the stories in "Tre ciotole" have in common is that they deal with characters' identity and self-awareness through the challenges the characters confront and the way they live their emotions in relation to the surrounding space. Although the challenges seem similar, the spatial element around the characters differs, but it confirms each time that characters' emotions, and consequently their self-awareness, can be formed and built only through their connection and relation to the surrounding space. In this way, the reader creates an imaginary network of complex relations among the characters of all the short stories, which gives him/her the opportunity to search for a way to break out of the usual patterns that tend to repeat while characters focus on building self-awareness. The aim of the paper is to determine and analyze the role of spatial elements in the creation of characters' emotions and in the process of self-awareness. As the spatial element changes, gets transformed, or is substituted, we likewise notice the rise of an unconscious desire for self-harm in the characters, which damages their self-awareness. Namely, the characters face a crisis they cannot control by inventing other types of crises that can be controlled.
This is their way of acting in order to find a way out of the identity crisis. Consequently, we expect the results of the analysis to point out the similarities in the depiction of characters across the short stories, as well as to show the extent to which the characters' identities depend on the surrounding space in each story. In this way, the results will highlight the importance of spatial elements in characters' identity formation in Michela Murgia's short stories and also summarize the importance of Murgia's whole literary opus.

Keywords: Italian literature, short stories, environment, spatial element, emotions, characters

Procedia PDF Downloads 48
2161 Integration of Educational Data Mining Models to a Web-Based Support System for Predicting High School Student Performance

Authors: Sokkhey Phauk, Takeo Okazaki

Abstract:

The challenging task in educational institutions is to maximize student performance and minimize the failure rate of poor-performing students. An effective way to approach this task is to understand student learning patterns and their highly influential factors, and to obtain an early prediction of student learning outcomes in time to set up policies for improvement. Educational data mining (EDM) is an emerging disciplinary field of data mining, statistics, and machine learning concerned with extracting useful knowledge and information for the sake of improvement and development in the education environment. The aim of this work is to propose techniques in EDM and integrate them into a web-based system for predicting poor-performing students. A comparative study of prediction models is conducted, and the best-performing models are then developed further to obtain higher performance. The hybrid random forest (Hybrid RF) produces the most successful classification. For the purpose of intervention and improving learning outcomes, a feature selection method, MICHI, which combines the mutual information (MI) and chi-square (CHI) algorithms based on ranked feature scores, is introduced to select a dominant feature set that improves prediction performance; the obtained dominant set is then used as information for intervention. Using the proposed EDM techniques, an academic performance prediction system (APPS) is developed for educational stakeholders to obtain an early prediction of student learning outcomes for timely intervention. Experimental outcomes and evaluation surveys report the effectiveness and usefulness of the developed system. The system is used to help educational stakeholders and related individuals intervene and improve student performance.
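A feature selector in the spirit of MICHI can be sketched by scoring each discrete feature with both mutual information and chi-square and then combining the resulting ranks. The scoring and combination scheme below (simple rank summation over a contingency table) is an assumption for illustration, not necessarily the authors' exact formulation:

```python
import numpy as np

def contingency(x, y):
    """Joint count table for two discrete integer-coded arrays."""
    table = np.zeros((x.max() + 1, y.max() + 1))
    for xi, yi in zip(x, y):
        table[xi, yi] += 1
    return table

def mutual_info(x, y):
    """MI(X;Y) in nats from the empirical joint distribution."""
    p = contingency(x, y) / len(x)
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

def chi_square(x, y):
    """Pearson chi-square statistic of the feature/label table."""
    o = contingency(x, y)
    e = o.sum(1, keepdims=True) @ o.sum(0, keepdims=True) / o.sum()
    return float(((o - e) ** 2 / e).sum())

def michi_select(X, y, k):
    """Rank features by MI and chi-square; keep the k best combined ranks."""
    mi = np.array([mutual_info(X[:, j], y) for j in range(X.shape[1])])
    ch = np.array([chi_square(X[:, j], y) for j in range(X.shape[1])])
    # higher score -> better rank (0 = best); sum the two rank vectors
    rank = np.argsort(np.argsort(-mi)) + np.argsort(np.argsort(-ch))
    return np.argsort(rank)[:k]

# Tiny synthetic example: feature 0 copies the label, feature 1 is noise
rng = np.random.default_rng(1)
y = rng.integers(0, 2, 200)
X = np.column_stack([y, rng.integers(0, 2, 200)])
print(michi_select(X, y, k=1))  # feature 0 should dominate
```

Combining two heterogeneous scores via ranks rather than raw values avoids having to put MI (in nats) and chi-square (unbounded) on a common scale.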

Keywords: academic performance prediction system, educational data mining, dominant factors, feature selection method, prediction model, student performance

Procedia PDF Downloads 103
2160 Cardiokey: A Binary and Multi-Class Machine Learning Approach to Identify Individuals Using Electrocardiographic Signals on Wearable Devices

Authors: S. Chami, J. Chauvin, T. Demarest, Stan Ng, M. Straus, W. Jahner

Abstract:

Biometric tools such as fingerprint and iris recognition are widely used in industry to protect critical assets. However, their vulnerability and lack of robustness raise several concerns about the protection of highly critical assets. Biometrics based on electrocardiographic (ECG) signals is a robust identification tool. However, most state-of-the-art techniques have worked on clinical signals, which are of high quality and less noisy, rather than on signals extracted from wearable devices such as a smartwatch. In this paper, we present a complete machine learning pipeline that identifies people using ECG extracted from an off-person device, i.e., a wearable device that is not used in a medical context, such as a smartwatch. In addition, one of the main challenges of ECG biometrics is the variability of the ECG across different persons and different situations. To address this issue, we propose two different approaches: a per-person classifier and a one-for-all classifier. The first approach builds a binary classifier to distinguish one person from all others. The second approach builds a multi-class classifier that distinguishes a selected set of individuals from non-selected individuals (others). In preliminary results, the binary classifier achieved 90% accuracy on balanced data, and the multi-class approach reported a log loss of 0.05.
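The per-person (one-vs-rest) formulation can be illustrated with a deliberately simple stand-in classifier; the actual pipeline in the paper is more elaborate, and the feature vectors below are synthetic placeholders for ECG-derived features:

```python
import numpy as np

class PerPersonClassifier:
    """Binary 'is this the enrolled person?' decision by distance to the
    enrolled person's mean feature vector -- a minimal stand-in for the
    paper's per-person binary classifier."""

    def fit(self, person_feats, other_feats):
        self.centroid = person_feats.mean(axis=0)
        # threshold: midpoint between typical own/other distances
        d_own = np.linalg.norm(person_feats - self.centroid, axis=1).mean()
        d_other = np.linalg.norm(other_feats - self.centroid, axis=1).mean()
        self.threshold = (d_own + d_other) / 2
        return self

    def predict(self, feats):
        d = np.linalg.norm(feats - self.centroid, axis=1)
        return d < self.threshold  # True -> accepted as the person

# Synthetic ECG-like feature vectors (e.g. beat-interval statistics)
rng = np.random.default_rng(2)
person = rng.normal(0.0, 0.3, (50, 8))
others = rng.normal(1.5, 0.3, (50, 8))

clf = PerPersonClassifier().fit(person, others)
acc = (clf.predict(person).mean() + (~clf.predict(others)).mean()) / 2
print(f"balanced accuracy: {acc:.2f}")
```

The one-for-all variant would replace the single accept/reject decision with a multi-class model over the enrolled set plus an "others" class.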

Keywords: biometrics, electrocardiographic, machine learning, signals processing

Procedia PDF Downloads 137
2159 Indoor Real-Time Positioning and Mapping Based on Manhattan Hypothesis Optimization

Authors: Linhang Zhu, Hongyu Zhu, Jiahe Liu

Abstract:

This paper investigates a method of indoor real-time positioning and mapping based on the Manhattan world assumption. In indoor environments, relying solely on feature matching techniques or other geometric algorithms for sensor pose estimation inevitably results in cumulative errors, posing a significant challenge to indoor positioning. To address this issue, we adopt the Manhattan world hypothesis to optimize a feature-matching-based camera pose algorithm, which improves the accuracy of camera pose estimation. A special processing step is applied to image frames that conform to the Manhattan world assumption; when similar frames appear later, they can be used to eliminate drift in sensor pose estimation, thereby reducing cumulative estimation errors and improving mapping and positioning. Experimental verification shows that our method achieves high-precision real-time positioning in indoor environments and successfully generates maps of indoor environments. This provides effective technical support for applications such as indoor navigation and robot control.
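One concrete way to exploit the Manhattan assumption is to snap a drifting rotation estimate back onto the dominant-axis frame. The sketch below is a simplified stand-in for the paper's pose-correction step: it projects a rotation matrix onto the nearest signed permutation matrix, i.e. a rotation aligned with the Manhattan axes.

```python
import numpy as np

def snap_to_manhattan(R):
    """Replace a (noisy) 3x3 rotation estimate with the nearest signed
    permutation matrix, i.e. a rotation aligned with the Manhattan
    (dominant-axis) frame."""
    S = np.zeros_like(R)
    used = set()
    # greedily assign each column to its strongest remaining axis
    for j in np.argsort(-np.abs(R).max(axis=0)):
        axis = max((i for i in range(3) if i not in used),
                   key=lambda i: abs(R[i, j]))
        S[axis, j] = np.sign(R[axis, j])
        used.add(axis)
    return S

# A small rotation about z (~5 degrees) drifted away from identity
a = np.deg2rad(5)
R = np.array([[np.cos(a), -np.sin(a), 0],
              [np.sin(a),  np.cos(a), 0],
              [0,          0,         1]])
print(snap_to_manhattan(R))  # snaps back to the identity
```

In a full SLAM system this correction would only be applied to frames that genuinely conform to the Manhattan assumption, since snapping a frame with an oblique wall would itself introduce error.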

Keywords: Manhattan world hypothesis, real-time positioning and mapping, feature matching, loopback detection

Procedia PDF Downloads 56
2158 A Hybrid Expert System for Generating Stock Trading Signals

Authors: Hosein Hamisheh Bahar, Mohammad Hossein Fazel Zarandi, Akbar Esfahanipour

Abstract:

In this paper, a hybrid expert system is developed using fuzzy genetic network programming with reinforcement learning (GNP-RL). In this system, the frame-based structure uses trading rules extracted by GNP. These rules are extracted using technical indices of the stock prices in the training period. In developing this system, we applied fuzzy node transition and decision making in both the processing and judgment nodes of GNP-RL. Consequently, this method not only increased the accuracy of node transition and decision making in GNP's nodes, but also extended GNP's binary signals to ternary trading signals. In other words, in our proposed fuzzy GNP-RL model, a No Trade signal is added to the conventional Buy and Sell signals. Finally, the obtained rules are used in a frame-based system implemented in the Kappa-PC software. This trading system has been used to generate trading signals for ten companies listed on the Tehran Stock Exchange (TSE). The simulation results over the testing period show that the developed system performs more favorably than the buy-and-hold strategy.
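The extension from binary to ternary signals amounts to thresholding a fuzzy decision score into three bands. The thresholds and the toy momentum score below are illustrative choices, not the rules extracted by GNP-RL:

```python
import numpy as np

def ternary_signal(score, buy_th=0.6, sell_th=0.4):
    """Map a fuzzy decision score in [0, 1] to Buy / No Trade / Sell."""
    if score >= buy_th:
        return "Buy"
    if score <= sell_th:
        return "Sell"
    return "No Trade"

def fuzzy_score(prices, window=5):
    """Toy momentum score: fraction of recent up-moves, in [0, 1]."""
    moves = np.diff(prices[-(window + 1):])
    return float((moves > 0).mean())

prices = np.array([100, 101, 103, 102, 104, 106, 107], dtype=float)
s = fuzzy_score(prices)
print(s, ternary_signal(s))  # 0.8 Buy
```

The middle band is what keeps the system out of the market when the evidence is ambiguous, which is precisely the role of the added No Trade signal.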

Keywords: fuzzy genetic network programming, hybrid expert system, technical trading signal, Tehran stock exchange

Procedia PDF Downloads 328
2157 Self-Attention Mechanism for Target Hiding Based on Satellite Images

Authors: Hao Yuan, Yongjian Shen, Xiangjun He, Yuheng Li, Zhouzhou Zhang, Pengyu Zhang, Minkang Cai

Abstract:

Remote sensing data can support decision-making in disaster assessment and disaster relief. Traditional methods for processing sensitive targets in remote sensing mapping are mainly based on manual retrieval and image editing tools, which are inefficient. Methods based on deep learning for sensitive target hiding are faster and more flexible, but they have disadvantages in training time and computational cost. This paper proposes a target-hiding model, Self-Attention (SA) Deepfill, which uses self-attention modules to replace part of the gated convolution layers in image inpainting. With this change, the computational load of the model becomes smaller and its performance improves. We also add free-form masks to the model's training to enhance its generality. An experiment on an open remote sensing dataset demonstrated the efficiency of our method. Moreover, experimental comparison shows that the proposed method can train for longer without over-fitting. Finally, compared with existing methods, the proposed model has lower computational cost and better performance.
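The self-attention building block substituted for the gated convolutions follows the standard scaled dot-product form. A bare numpy sketch over flattened patch features (single head, with random weights standing in for learned projections) is:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of feature
    vectors X of shape (n, d), e.g. flattened image patches in inpainting."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    weights = softmax(Q @ K.T / np.sqrt(d_k))  # (n, n) attention map
    return weights @ V, weights

# 16 'patches' with 8-dim features; random weights stand in for learned ones
rng = np.random.default_rng(3)
X = rng.normal(size=(16, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))

out, attn = self_attention(X, Wq, Wk, Wv)
print(out.shape, attn.shape)  # (16, 8) (16, 16)
```

In inpainting, the attention map lets every masked patch borrow texture from any visible patch in the image, which is the global context a local gated convolution cannot provide.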

Keywords: remote sensing mapping, image inpainting, self-attention mechanism, target hiding

Procedia PDF Downloads 124
2156 Experimental Characterization of Composite Material with Non Contacting Methods

Authors: Nikolaos Papadakis, Constantinos Condaxakis, Konstantinos Savvakis

Abstract:

The aim of this paper is to determine the elastic properties (elastic modulus and Poisson ratio) of a composite material using non-contacting imaging methods. The significantly reduced cost of digital cameras has opened the way to reliable, low-cost strain measurement. The open-source platform Ncorr, which implements digital image correlation (DIC), is used in this paper. Measuring strain with digital image correlation involves random speckle preparation on the surface of the gauge area, image acquisition, and post-processing by image correlation to obtain the displacement and strain fields on the surface under study. Technical issues relating to the quality of the obtained results are discussed. [0]₈ fabric glass/epoxy composite specimens were prepared and tested at different orientations: 0°, 30°, 45°, 60°, and 90°. Each test was recorded with the camera at a constant frame rate and under constant lighting conditions. The recorded images were processed with the image correlation software, and the parameters of each test are reported. The strain map obtained using Ncorr is validated by a) comparing the derived elastic properties with values expected from classical laminate theory and b) finite element analysis.
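At its core, DIC locates each speckle subset of the reference image within the deformed image by maximizing a correlation score. The sketch below does an integer-pixel search with normalized cross-correlation on a synthetic speckle pattern, far simpler than Ncorr's subpixel algorithm but the same underlying idea:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

def match_subset(ref, deformed, top, left, size, search):
    """Find the integer-pixel displacement of a reference subset in the
    deformed image by exhaustive NCC search within +/- search pixels."""
    subset = ref[top:top + size, left:left + size]
    best, best_uv = -2.0, (0, 0)
    for dv in range(-search, search + 1):
        for du in range(-search, search + 1):
            r, c = top + dv, left + du
            if r < 0 or c < 0 or r + size > deformed.shape[0] \
                    or c + size > deformed.shape[1]:
                continue
            score = ncc(subset, deformed[r:r + size, c:c + size])
            if score > best:
                best, best_uv = score, (dv, du)
    return best_uv

# Synthetic speckle pattern; deformed image = reference shifted by (2, 3) px
rng = np.random.default_rng(4)
ref = rng.random((64, 64))
deformed = np.roll(np.roll(ref, 2, axis=0), 3, axis=1)
print(match_subset(ref, deformed, top=20, left=20, size=15, search=5))  # (2, 3)
```

Repeating this for a grid of subsets yields the displacement field, and strains follow by numerically differentiating that field, which is why good speckle contrast matters so much for result quality.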

Keywords: composites, Ncorr, strain map, videoextensometry

Procedia PDF Downloads 139
2155 Extraction of Essential Oil and Pectin from Lime and Waste Technology Development

Authors: Wilaisri Limphapayom

Abstract:

Lime is one of the economically important crops produced in Thailand. The objective of this research is to increase its utilization in food and cosmetics. The extraction of essential oil and pectin from lime (Citrus aurantifolia (Christm & Panz) Swing) has been studied. Essential oil was extracted by hydro-distillation, with yields ranging from 1.72 to 2.20%. The essential oil was composed of alpha-pinene, beta-pinene, D-limonene, camphene, a-phellandrene, g-terpinene, a-ocimene, O-cymene, 2-carene, linalool, trans-ocimenol, geraniol, citral, isogeraniol, verbinol, and others, as analyzed by the GC-MS method. Pectin was extracted from the lime waste remaining after essential oil extraction, and yields of 40.11-65.81 g/100 g of lime peel were obtained; the highest yields came from ethanol extraction. The satisfactory results of this study can improve the lime processing system for value addition. The present study also focused on lime powder production as a source of vitamin C (ascorbic acid), and on the potential of lime waste as a source of essential oil and pectin. Lime powder was produced with a spray dryer: lime juice with maltodextrin (DE 10) at two levels, 30% and 50% w/w, was sprayed at an inlet air temperature of 150°C and an outlet temperature of 90°C. Lime powder with 50% maltodextrin gave the most desirable product quality, with a vitamin C content of 25 mg/100 g (w/w).

Keywords: extraction, pectin, essential oil, lime

Procedia PDF Downloads 292
2154 Network Word Discovery Framework Based on Sentence Semantic Vector Similarity

Authors: Ganfeng Yu, Yuefeng Ma, Shanliang Yang

Abstract:

Word discovery is a key problem in text information retrieval. Existing new-word discovery methods generally operate at the word level, obtaining new-word results by analyzing words themselves. With the popularity of social networks, individual netizens and online self-media have generated a variety of network texts in the course of online life, including network words that are far from standard Chinese expression. How to detect network words is one of the important goals in the field of text information retrieval today. In this paper, we integrate a word embedding model and clustering methods to propose a network word discovery framework based on sentence semantic similarity (S³-NWD) that detects network words effectively from a corpus. The framework constructs sentence semantic vectors through a distributed representation model, uses the similarity of sentence semantic vectors to determine the semantic relationship between sentences, and finally realizes network word discovery by means of semantic replacement between sentences. Experiments verify that the framework not only discovers network words quickly but also recovers the standard-word meanings of the discovered network words, which reflects the effectiveness of our work.
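The semantic-replacement test at the heart of the framework reduces to comparing sentence vectors before and after substituting a candidate word. The sketch below uses toy 3-dimensional embeddings in place of a trained distributed representation model, and simple word-vector averaging as the sentence representation:

```python
import numpy as np

# Toy word embeddings standing in for a trained distributed model
emb = {
    "the":   np.array([0.1, 0.0, 0.1]),
    "movie": np.array([0.9, 0.2, 0.0]),
    "film":  np.array([0.85, 0.25, 0.05]),
    "is":    np.array([0.0, 0.1, 0.1]),
    "great": np.array([0.1, 0.9, 0.3]),
}

def sentence_vector(tokens):
    """Average of word vectors: a simple distributed sentence representation."""
    return np.mean([emb[t] for t in tokens], axis=0)

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

s1 = sentence_vector(["the", "movie", "is", "great"])
s2 = sentence_vector(["the", "film", "is", "great"])  # candidate substitution
s3 = sentence_vector(["the", "movie", "is"])

print(round(cosine(s1, s2), 3))  # near 1: 'film' can stand in for 'movie'
print(cosine(s1, s2) > cosine(s1, s3))
```

In the network-word setting, a high post-substitution similarity is evidence that the candidate token carries the meaning of the standard word it replaced.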

Keywords: text information retrieval, natural language processing, new word discovery, information extraction

Procedia PDF Downloads 88
2153 Lexical Knowledge of Verb Particle Constructions with the Particle on by Mexican English Learners

Authors: Sarai Alvarado Pineda, Ricardo Maldonado Soto

Abstract:

The acquisition of verb particle constructions is a challenge for Spanish speakers learning English, and it is particularly difficult for speakers of languages with no verb particle constructions. The purpose of the current study is to define the procedural steps in the acquisition of constructions with the particle on. There are three outstanding meanings of the particle on: Surface (The movie is based on a true story), Activation (John turned on the light), and Continuity (The band played on all night). The central aim of this study is to measure how Mexican Spanish participants respond both to the three meanings mentioned above and to the degree of meaning transparency/opacity of on verb particle constructions. Forty Mexican Spanish learners of English (20 basic and 20 advanced) were compared against a control group of 20 American native English speakers through a reaction time test (PsychoPy2 2015). The participants were asked to discriminate 90 items based on their knowledge of these constructions; there were 30 items per meaning, divided into transparent and opaque groups. The results revealed three major findings. First, advanced students have a reaction time close to that of native speakers (advanced 4.5 s versus native 3.7 s), while students with a lower level of English proficiency show a longer reaction time (7 s). Second, reaction times are shorter for constructions with lower opacity in all three groups of participants, with differences between levels (basic 6.7 s, advanced 4.3 s, and native 3.4 s). Third, reaction time differs according to the meaning of the construction: the activation category (5.27 s) is slower than continuity (5.04 s), which in turn is slower than surface (4.94 s).
The study shows that the sensitivity of English learners increases significantly towards native-speaker patterns, as determined by the level of transparency of meaning of each construction as well as the degree of entrenchment of each constructional meaning.

Keywords: meaning of the particle, opacity, reaction time, verb particle constructions

Procedia PDF Downloads 261
2152 Reduplication In Urdu-Hindi Nonsensical Words: An OT Analysis

Authors: Riaz Ahmed Mangrio

Abstract:

Reduplication in Urdu-Hindi affects all major word categories, particles, and even nonsensical words. It conveys a variety of meanings, including distributive, emphatic, iterative, adjectival, and adverbial senses. This study primarily discusses reduplicative structures of nonsensical words in Urdu-Hindi and then briefly looks at examples from other Indo-Aryan languages to open the debate regarding the same structures in them. The goal of this study is to present counter-evidence against Keane (2005: 241), who claims that "the base in the cases of lexical and phrasal echo reduplication is always independently meaningful". However, Urdu-Hindi reduplication derives meaningful compounds from nonsensical words, e.g. gũ mgũ (A) ‘silent and confused’ and d̪əb d̪əb-a (N) ‘one’s fear over others’. This calls for a comprehensive examination of whether and how the various structures form patterns of a base-reduplicant relationship, or whether they are merely sublexical items joining together to form a word of some grammatical category among content words. Another interesting theoretical question arises within the Optimality framework: in an OT analysis, is it necessary to identify one of the two constituents as the base and the other as the reduplicant, or is it best to treat this as a pattern, and if so, how does such a pattern fit into an OT analysis? This may be an even more interesting theoretical question, and answering such questions can make an important contribution. In the case at hand, each of the two constituents is an independent nonsensical word, yet their echo reduplication is nonetheless meaningful. This casts significant doubt upon Keane's (2005: 241) observation, drawn from examples of Hindi and Tamil reduplication, that "the base in cases of lexical and phrasal echo reduplication is always independently meaningful". The debate becomes even more interesting when triplication of nonsensical words in Urdu-Hindi, e.g.
aẽ baẽ ʃaẽ (N) ‘useless talk’, is also considered. This example challenges Harrison's (1973) claim that only monosyllabic verbs in their progressive forms reduplicate twice to yield triplication, which is not the case in the example presented. The study consists of a thorough descriptive analysis of the data for the purpose of documentation, followed by an OT analysis.

Keywords: reduplication, urdu-hindi, nonsensical, optimality theory

Procedia PDF Downloads 71
2151 Spatiotemporal Variability in Rainfall Trends over Sinai Peninsula Using Nonparametric Methods and Discrete Wavelet Transforms

Authors: Mosaad Khadr

Abstract:

Knowledge of the temporal and spatial variability of rainfall trends is of great concern for efficient water resource planning and management. In this study, annual, seasonal, and monthly rainfall trends over the Sinai Peninsula were analyzed using absolute homogeneity tests, the nonparametric Mann–Kendall (MK) test, and Sen’s slope estimator. The homogeneity of the rainfall time series was examined using four absolute homogeneity tests, namely the Pettitt test, the standard normal homogeneity test, the Buishand range test, and the von Neumann ratio test. Further, the sequential change in the trend of annual and seasonal rainfall was examined using the sequential MK (SQMK) method, and trend analysis based on the discrete wavelet transform (DWT) technique in conjunction with the SQMK method was performed. The spatial patterns of the detected rainfall trends were investigated using geostatistical and deterministic spatial interpolation techniques. The results of applying the Mann–Kendall test to the data series (at the 5% significance level) highlighted that rainfall was generally decreasing in January, February, March, November, December, the wet season, and annually. A significant decreasing trend in winter and annual rainfall was inferred from the Mann–Kendall rank statistics and linear trend. Further, the DWT analysis reveals that, in general, intra- and inter-annual events (up to 4 years) are more influential in shaping the observed trends; the nature of the trend captured by both methods is similar in all cases. On the basis of the spatial trend analysis, significant rainfall decreases were also noted at the investigated stations. Overall, significant downward trends in winter and annual rainfall over the Sinai Peninsula were observed during the study period.
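The MK test and Sen’s slope estimator used above are straightforward to implement. The sketch below applies the normal-approximation form of the test (omitting the tie correction, for brevity) to a synthetic annual rainfall series with an imposed decline:

```python
import numpy as np
from math import erf, sqrt

def mann_kendall(x, alpha=0.05):
    """Mann-Kendall trend test (normal approximation, no tie correction)
    together with Sen's slope estimator."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S statistic: sum of signs of all forward pairwise differences
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / sqrt(var_s) if s != 0 else 0.0
    p = 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(z) / sqrt(2.0))))  # two-sided
    # Sen's slope: median of all pairwise slopes
    slopes = [(x[j] - x[i]) / (j - i) for i in range(n - 1)
              for j in range(i + 1, n)]
    return {"S": int(s), "z": float(z), "p": float(p),
            "significant": p < alpha, "sen_slope": float(np.median(slopes))}

# Synthetic annual rainfall (mm) with an imposed decline of 2 mm/yr
rng = np.random.default_rng(5)
years = np.arange(40)
rain = 120.0 - 2.0 * years + rng.normal(0, 5, 40)
res = mann_kendall(rain)
print(res["significant"], round(res["sen_slope"], 2))
```

Being rank-based, the test makes no normality assumption about the rainfall series, which is why it is the standard choice for hydrological trend detection; production analyses should add the tie correction and, for serially correlated series, a pre-whitening step.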

Keywords: trend analysis, rainfall, Mann–Kendall test, discrete wavelet transform, Sinai Peninsula

Procedia PDF Downloads 165