Search results for: radiation processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4913

1313 Electrodeposition and Selenization of CuIn Alloys for the Synthesis of Photoactive CuIn1-xGaxSe2 (CIGS) Thin Films

Authors: Mohamed Benaicha, Mahdi Allam

Abstract:

A new two-stage electrochemical process is studied as a safe, large-area, low-cost technique for the production of semiconducting CuInSe2 (CIS) thin films. CuIn precursors were first potentiostatically electrodeposited onto molybdenum substrates from an acidic thiocyanate electrolyte. In a second stage, the prepared metallic CuIn layers were used as substrates for electrochemical selenium deposition and then subjected to thermal treatment in vacuum, which eliminates binary phases through reaction of the Cu2-xSe and InxSey selenides and leads to the formation of the CuInSe2 thin film. Electrochemical selenization from an aqueous electrolyte is introduced as an alternative to the toxic and hazardous H2Se or Se vapor-phase selenization used in physical techniques. The influence of deposition parameters such as bath composition, temperature, and potential on film properties was studied. The electrochemical, morphological, structural, and compositional properties of the electrodeposited thin films were characterized using cyclic and stripping-cyclic voltammetry (CV, SCV), scanning electron microscopy (SEM), and energy-dispersive X-ray microanalysis (EDX), which revealed good reproducibility and homogeneity of the film composition. Optimal technological parameters for the electrochemical production of CuIn and Se precursors for CuInSe2 thin layers are thereby determined.

Keywords: photovoltaic, CIGS, copper alloys, electrodeposition, thin films

Procedia PDF Downloads 446
1312 Using Autoencoder as Feature Extractor for Malware Detection

Authors: Umm-E-Hani, Faiza Babar, Hanif Durad

Abstract:

Malware-detection approaches suffer from many limitations, due to which no anti-malware solution has proved reliable enough for detecting zero-day malware. Signature-based solutions depend on signatures that can be generated only after a malware sample has surfaced at least once in the cyber world. Anomaly-based approaches, which detect deviations caused in the environment, can easily be defeated by diligently and intelligently written malware. Behavior-based solutions trained to observe file behavior fail against malware capable of detecting a sandboxed or protected environment. Machine learning and deep learning approaches greatly suffer when their models are trained on an imbalanced dataset or an inadequate number of samples. AI-based anti-malware solutions that have been trained with enough samples have targeted a selected feature vector, ignoring the contribution of the remaining features to maliciousness simply to cope with limited hardware processing power. Our research focuses on an anti-malware solution for detecting malicious PE files that circumvents the shortcomings mentioned above. Our proposed framework, based on automated feature engineering through autoencoders, trains the model over a fairly large dataset and focuses on the visual patterns of malware samples to automatically extract the meaningful part of the visual pattern. Our experiments achieved a state-of-the-art accuracy of 99.54% on test data.
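The "visual patterns" mentioned above are commonly obtained by mapping a binary's raw bytes onto a grayscale grid before feeding it to a network. The abstract does not give its exact conversion, so the sketch below is a minimal, assumed version of that step in pure Python; the function name and fixed width are illustrative choices.

```python
import math

def bytes_to_image(data: bytes, width: int = 16) -> list[list[int]]:
    """Map a binary's raw bytes to a 2-D grayscale grid (0-255 per cell),
    zero-padding the final row so the grid is rectangular."""
    rows = math.ceil(len(data) / width)
    padded = data + bytes(rows * width - len(data))
    return [list(padded[r * width:(r + 1) * width]) for r in range(rows)]

sample = bytes(range(40))          # stand-in for a PE file's contents
img = bytes_to_image(sample, width=16)
```

A real pipeline would resize these grids to a fixed input shape before autoencoder training; here the point is only the byte-to-pixel mapping.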

Keywords: malware, autoencoders, automated feature engineering, classification

Procedia PDF Downloads 58
1311 Ferulic Acid-Grafted Chitosan: Thermal Stability and Feasibility as an Antioxidant for Active Biodegradable Packaging Film

Authors: Sarekha Woranuch, Rangrong Yoksan

Abstract:

Active packaging has been developed by incorporating certain additives, in particular antimicrobial and antioxidant agents, into packaging systems to maintain or extend product quality and shelf life. Ferulic acid is one of the most effective natural phenolic antioxidants and has been used in food, pharmaceutical, and active packaging film applications. However, most phenolic compounds are sensitive to oxygen, light, and heat, so their activity is lost during product formulation and processing. Grafting ferulic acid onto a polymer is an alternative that reduces this loss during thermal processing. The objectives of the present research were therefore to study the thermal stability of ferulic acid after grafting onto chitosan, and to investigate the feasibility of using ferulic acid-grafted chitosan (FA-g-CTS) as an antioxidant for active biodegradable packaging film. FA-g-CTS was incorporated into biodegradable film via a two-step process, i.e., compounding extrusion at temperatures up to 150 °C followed by blown film extrusion at temperatures up to 175 °C. Although incorporating FA-g-CTS at 0.02–0.16% (w/w) decreased the water vapor barrier property and extensibility, the films showed improved oxygen barrier property and antioxidant activity. Radical scavenging activity and reducing power of the film containing 0.04% (w/w) FA-g-CTS were higher than those of the control film by about 254% and 94%, respectively. Tensile strength and rigidity of the films were not significantly affected by adding FA-g-CTS at 0.02–0.08% (w/w). The results indicate that FA-g-CTS could potentially be used as an antioxidant for active packaging film.

Keywords: active packaging film, antioxidant activity, chitosan, ferulic acid

Procedia PDF Downloads 490
1310 Wolof Voice Response Recognition System: A Deep Learning Model for Wolof Audio Classification

Authors: Krishna Mohan Bathula, Fatou Bintou Loucoubar, FNU Kaleemunnisa, Christelle Scharff, Mark Anthony De Castro

Abstract:

Voice recognition algorithms such as automatic speech recognition and text-to-speech systems for African languages can play an important role in bridging the digital divide of Artificial Intelligence in Africa, contributing to the establishment of a fully inclusive information society. This paper proposes a deep learning model that classifies user responses as inputs for an interactive voice response system. A dataset of audio recordings of the Wolof words for 'yes' and 'no' was collected. A two-stage data augmentation approach is adopted to enlarge the dataset to the size required by the deep neural network. Data preprocessing and feature engineering with Mel-frequency cepstral coefficients are implemented. Convolutional neural networks (CNNs) have proven very powerful in image classification and are promising for audio processing when sounds are transformed into spectra. For voice response classification, the recordings are transformed into sound-frequency feature spectra, to which an image classification methodology is then applied using a deep CNN model. The inference model of this trained and reusable Wolof voice response recognition system can be integrated with many applications on both web and mobile platforms.
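The transformation of recordings into "sound-frequency feature spectra" can be illustrated with a minimal framed magnitude spectrogram in pure Python. This is only the first step of the MFCC pipeline the abstract names (the mel filterbank and DCT stages are omitted), and the frame length, hop size, and sample rate below are assumptions for illustration.

```python
import cmath, math

def spectrogram(signal, frame_len=64, hop=32):
    """Split a waveform into overlapping Hann-windowed frames and take the
    DFT magnitude of each; stacking frames gives the 2-D 'image' a CNN sees."""
    frames = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = [signal[start + n] * (0.5 - 0.5 * math.cos(2 * math.pi * n / frame_len))
                 for n in range(frame_len)]
        mags = [abs(sum(frame[n] * cmath.exp(-2j * math.pi * k * n / frame_len)
                        for n in range(frame_len)))
                for k in range(frame_len // 2 + 1)]   # keep non-negative bins only
        frames.append(mags)
    return frames

# 440 Hz tone at an assumed 8000 samples/s; the peak lands near bin 440/8000*64
wave = [math.sin(2 * math.pi * 440 * t / 8000) for t in range(256)]
spec = spectrogram(wave)
```

Production code would use an FFT library rather than this O(N²) DFT; the structure of the output (time frames x frequency bins) is what the CNN consumes.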

Keywords: automatic speech recognition, interactive voice response, voice response recognition, Wolof word classification

Procedia PDF Downloads 97
1309 Spatial Information and Urbanizing Futures

Authors: Mohammad Talei, Neda Ranjbar Nosheri, Reza Kazemi Gorzadini

Abstract:

Municipalities today are searching for new tools to increase public participation at different levels of urban planning. This approach involves the community in the planning process through participatory methods instead of the long-standing traditional top-down planning methods. Such tools can be used to obtain the particular problems of urban furniture from the residents' point of view. One tool designed with this goal is public participation GIS (PPGIS), which enables citizens to record and follow up their feelings and spatial knowledge regarding the main problems of the city, specifically urban furniture, in the form of maps. However, despite the good intentions of PPGIS, its practical implementation in developing countries faces many problems, including the lack of basic supporting infrastructure and services and the unavailability of sophisticated public participatory models. In this research, we develop a PPGIS using Web 2.0 to collect volunteered geodata and to perform spatial analysis based on spatial online analytical processing (SOLAP) and spatial data mining (SDM). These tools provide urban planners with proper information regarding the type, spatial distribution, and clusters of reported problems. The system is implemented in a case study area in Tehran, Iran, and the challenges to making it applicable, as well as its potential for real urban planning, have been evaluated. It helps decision makers better understand, plan, and allocate scarce resources for providing the most requested urban furniture.

Keywords: PPGIS, spatial information, urbanizing futures, urban planning

Procedia PDF Downloads 709
1308 Reliability-Centered Maintenance Application for the Development of Maintenance Strategy for a Cement Plant

Authors: Nabil Hameed Al-Farsi

Abstract:

This study’s main goal is to develop a model and a maintenance strategy for a cement factory, the Rabigh Plant of the Arabian Cement Company. The proposed work relies on the reliability-centered maintenance (RCM) approach to develop a strategy and maintenance schedule that increases the reliability of the production system components, thus ensuring continuous productivity. Cost-effective preservation of the plant's dependability performance is the key goal of reliability-based maintenance. The cement plant consists of 7 major process steps, so the RCM-based maintenance plan is correspondingly made up of 10 steps, from selecting units and data through performing and updating the model. The processing unit chosen for this analysis is the calciner unit; for validation, the model uses the maintenance data history acquired from the maintenance department of Travancore Titanium Products Ltd (TTP). After applying the proposed model, the results of the maintenance simulation justified reconsidering the plant's existing scheduled maintenance policy. The results indicate the need for preventive maintenance for all Class A criticality equipment instead of planned maintenance, with breakdown maintenance assigned to all other equipment depending on its criticality and an FMEA report. Consequently, the additional cost of preventive maintenance would be offset by the cost savings from breakdown maintenance for the remaining equipment.

Keywords: engineering, reliability, strategy, maintenance, failure modes, effects and criticality analysis (FMEA)

Procedia PDF Downloads 151
1307 A BERT-Based Model for Financial Social Media Sentiment Analysis

Authors: Josiel Delgadillo, Johnson Kinyua, Charles Mutigwe

Abstract:

The purpose of sentiment analysis is to determine the sentiment strength (e.g., positive, negative, neutral) of a textual source to support good decision-making. Natural language processing in domains such as financial markets requires knowledge of domain ontology, and pre-trained language models such as BERT have made significant breakthroughs in various NLP tasks by training on large-scale unlabeled generic corpora such as Wikipedia. However, sentiment analysis is a strongly domain-dependent task. The rapid growth of social media has given users a platform to share their experiences and views about products, services, and processes, including financial markets. StockTwits and Twitter are social networks that allow the public to express their sentiments in real time. Hence, leveraging the success of unsupervised pre-training and the large amount of financial text available on social media platforms could potentially benefit a wide range of financial applications. This work focuses on sentiment analysis of social media text from platforms such as StockTwits and Twitter. To meet this need, SkyBERT, a domain-specific language model pre-trained and fine-tuned on financial corpora, has been developed. Extensive experimental results show that SkyBERT outperforms current state-of-the-art models in financial sentiment analysis and demonstrate its effectiveness and robustness.

Keywords: BERT, financial markets, Twitter, sentiment analysis

Procedia PDF Downloads 135
1306 Optical Image Analysis Through Semiconductor Defect Detection Simulation and Suggestion on How to Improve the Fine Particle Detection Capability of Semiconductor Equipment

Authors: Hyoseop Shin

Abstract:

As design rules shrink, defects that previously did not matter in semiconductor processes have become a new problem because they now affect yields. Fine inspection technology with high precision and high efficiency is therefore required to manage defects, and the detection capability of semiconductor inspection equipment has been improved by studying defect detection through a comprehensive understanding of inspection equipment, semiconductor processing, and defects. Optimal test parameters, obtained from conditional comparisons aimed at detecting 30 nm particles on low-density semiconductors, were applied to actual equipment, thereby improving the detection capability of particle inspection equipment. The improvement of 30 nm particle detection was studied based on image analysis and evaluation through defect simulation. Analysis of equipment parameters such as wavelength, polarization, and incident angle, together with acquisition of scattering signals on actual equipment, identified the optimal conditions for detection capability and contributed to defect detection. As a result, detection capability was confirmed to differ significantly under a 266 nm wavelength with P-polarized incidence, and 30 nm particles were detected, contributing to yield improvement.

Keywords: electronic simulation system, semiconductor defect, Reynolds' equation, semiconductor optical measuring equipment, facility engineering

Procedia PDF Downloads 7
1305 Graphene Metamaterials Supported Tunable Terahertz Fano Resonance

Authors: Xiaoyong He

Abstract:

The manipulation of THz waves remains a challenging task due to the lack of natural materials that interact with them strongly. Designed by tailoring the characteristics of unit cells (meta-molecules), metamaterials (MMs) may solve this problem. However, because of Ohmic and radiation losses, the performance of MM devices suffers from dissipation and low quality factor (Q-factor). This dilemma may be circumvented by Fano resonance, which arises from the destructive interference between a bright continuum mode and a dark discrete mode (a narrow resonance). Unlike the symmetric Lorentz spectral curve, Fano resonance exhibits a distinctly asymmetric line shape, ultrahigh quality factor, and steep variations in the spectral curve. Fano resonance is usually realized through symmetry breaking. However, if concentric double rings (DR) are placed close to each other, the near-field coupling between them gives rise to two hybridized modes (a bright mode and a narrowband dark mode) because of the local asymmetry, resulting in the characteristic Fano line shape. Furthermore, from a practical viewpoint, it is highly desirable to modulate the Fano spectral curve conveniently, which is an important and interesting research topic. In current Fano systems, tunable spectral curves can be realized by adjusting the geometrical structural parameters or the magnetic field biasing a ferrite-based structure. But owing to the limited dispersion properties of active materials, it remains difficult to tailor Fano resonance conveniently with fixed structural parameters. With its favorable properties of extreme confinement and high tunability, graphene is a strong candidate for achieving this goal. The DR structure supports the excitation of so-called "trapped modes," with the merits of a simple structure and high-quality resonances in thin structures.
By depositing graphene circular DR on a SiO2/Si/polymer substrate, tunable Fano resonance has been theoretically investigated in the terahertz regime, including the effects of graphene Fermi level, structural parameters, and operating frequency. The results show that the Fano peak can be efficiently modulated because of the strong coupling between incident waves and graphene ribbons. As the Fermi level increases, the peak amplitude of the Fano curve increases, and the resonant peak shifts to higher frequency. The amplitude modulation depth of the Fano curves is about 30% as the Fermi level varies over the range 0.1-1.0 eV. The optimum gap distance between the DR is about 8-12 μm, where the figure of merit peaks. As the graphene ribbon width increases, the Fano spectral curves broaden, and the resonant peak shows a blue shift. These results are very helpful for developing novel graphene plasmonic devices, e.g., sensors and modulators.
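The asymmetric line shape described above is conventionally written in the standard Fano form (a textbook expression, not a formula taken from this paper):

```latex
% Standard Fano lineshape (textbook form, not from this paper):
\sigma(\epsilon) \propto \frac{(q + \epsilon)^2}{1 + \epsilon^2},
\qquad
\epsilon = \frac{2(\omega - \omega_0)}{\Gamma}
```

Here q is the asymmetry parameter, ω0 the dark-mode resonance frequency, and Γ its linewidth; in the limit q → ∞ the symmetric Lorentzian profile is recovered, while finite q gives the steep, asymmetric spectral variation the abstract describes.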

Keywords: graphene, metamaterials, terahertz, tunable

Procedia PDF Downloads 333
1304 Deep Reinforcement Learning for Advanced Pressure Management in Water Distribution Networks

Authors: Ahmed Negm, George Aggidis, Xiandong Ma

Abstract:

Given the diverse nature of urban cities, customer demand patterns, landscape topologies, and seasonal weather trends, managing our water distribution networks (WDNs) has proved a complex task. These unpredictable circumstances manifest as pipe failures, intermittent supply, and burst events, adding to water loss, energy waste, and increased carbon emissions. Whilst such events are unavoidable, advanced pressure management has proved an effective tool to control and mitigate them. Water utilities, however, have struggled to develop a real-time control method that remains resilient when confronting the challenges of water distribution. In this paper, we use deep reinforcement learning (DRL) algorithms as a novel pressure control strategy to minimise pressure violations and leakage under both burst and background leakage conditions. Agents based on advantage actor-critic (A2C) and recurrent proximal policy optimisation (Recurrent PPO) were trained and compared to benchmark optimisation algorithms (differential evolution and particle swarm optimisation). A2C minimised leakage by 32.48% under burst conditions and 67.17% under background conditions, the highest performance among the DRL algorithms. A2C and Recurrent PPO also performed well against the benchmarks, with higher processing speed and lower computational effort.
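The control loop the abstract describes (observe pressure, adjust a valve, penalise violations) can be sketched with a toy problem. This is tabular Q-value iteration, a deliberately simple stand-in for the paper's deep RL agents (A2C, Recurrent PPO), and the pressure dynamics, state grid, and reward below are invented for illustration only.

```python
# Toy pressure-control problem: states are discretised pressure heads 0..10,
# actions close / hold / open a pressure-reducing valve by one step.
N_STATES, ACTIONS, TARGET, GAMMA = 11, (-1, 0, +1), 5, 0.9

def step(s, a):
    """Hypothetical deterministic valve dynamics and a reward that
    penalises deviation from the target head (a proxy for violations)."""
    s2 = min(max(s + a, 0), N_STATES - 1)
    return s2, -abs(s2 - TARGET)

Q = [[0.0] * len(ACTIONS) for _ in range(N_STATES)]
for _ in range(100):                       # sweep until Q-values converge
    for s in range(N_STATES):
        for i, a in enumerate(ACTIONS):
            s2, r = step(s, a)
            Q[s][i] = r + GAMMA * max(Q[s2])

# Greedy policy: index 0 = close valve, 1 = hold, 2 = open
policy = [max(range(len(ACTIONS)), key=lambda i: Q[s][i]) for s in range(N_STATES)]
```

A deep agent replaces the table with a neural network and learns from sampled transitions rather than full sweeps, but the objective structure (discounted penalty on pressure deviation) is the same.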

Keywords: deep reinforcement learning, pressure management, water distribution networks, leakage management

Procedia PDF Downloads 64
1303 The Intersection/Union Region Computation for Drosophila Brain Images Using Encoding Schemes Based on Multi-Core CPUs

Authors: Ming-Yang Guo, Cheng-Xian Wu, Wei-Xiang Chen, Chun-Yuan Lin, Yen-Jen Lin, Ann-Shyn Chiang

Abstract:

With more and more Drosophila driver and neuron images available, finding the similarity relationships among them is important for functional inference. A general problem is how to find a Drosophila driver image that covers a given set of Drosophila driver/neuron images. To solve this problem, the intersection/union region of a set of images is computed first, and a comparison step then calculates the similarities between that region and other images. In this paper, three encoding schemes, namely integer, Boolean, and decimal, are proposed to encode each image as a one-dimensional structure. The intersection/union region of these images can then be computed using compare operations, Boolean operators, and a lookup-table method. Finally, the comparison step is performed as in the union region computation, and the similarity score is calculated by the definition of the Tanimoto coefficient. The region computation methods are also implemented in a multi-core CPU environment with OpenMP. The experimental results show that in the encoding phase, the Boolean scheme performs best; in the region computation phase, the decimal scheme performs best when the number of images is large. A speedup ratio of 12 is achieved with 16 CPU cores. This work was supported by the Ministry of Science and Technology under grant MOST 106-2221-E-182-070.
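The encode-then-compare idea above can be sketched in a few lines: pack each binary image into one integer so that intersection and union become single bitwise operations, then score with the Tanimoto coefficient |A∩B| / |A∪B|. This is a minimal single-threaded illustration of the Boolean-style encoding, not the paper's OpenMP implementation; the tiny 2x3 images are made up.

```python
def encode(image):
    """Pack a binary (0/1) image into one Python integer: bit i of the
    integer is voxel i, so set operations become single bitwise ops."""
    code = 0
    for i, v in enumerate(x for row in image for x in row):
        code |= v << i
    return code

def tanimoto(a, b):
    """Tanimoto coefficient |A intersect B| / |A union B| on bit-encoded regions."""
    inter, union = a & b, a | b
    return bin(inter).count("1") / bin(union).count("1")

img1 = [[1, 1, 0], [0, 1, 0]]
img2 = [[1, 0, 0], [0, 1, 1]]
a, b = encode(img1), encode(img2)
```

Because the whole region lives in one integer, the intersection/union of many images reduces to a chain of `&`/`|` operations, which is the step the paper parallelises across CPU cores.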

Keywords: Drosophila driver image, Drosophila neuron images, intersection/union computation, parallel processing, OpenMP

Procedia PDF Downloads 220
1302 Cooperative Learning: A Case Study on Teamwork through Community Service Project

Authors: Priyadharshini Ahrumugam

Abstract:

Much research has recognized that cooperative groups produce remarkable achievements compared with solitary or individualistic efforts. Based on Johnson and Johnson's model of cooperative learning, the five key components of cooperation are positive interdependence, face-to-face promotive interaction, individual accountability, social skills, and group processing. In 2011, the Malaysian Ministry of Higher Education (MOHE) introduced the Holistic Student Development policy with the aim of developing morally sound individuals equipped with lifelong learning skills, and the community service project was included in this improvement initiative. The purpose of this study is to assess how team-based learning facilitates, in particular, students' positive interdependence and face-to-face promotive interaction. The research methods involve in-depth interviews with team leaders and selected team members, and a content analysis of the undergraduate students' reflective journals. A significant positive relationship was found between students' progressive outlook towards teamwork and the two highlighted components. The key findings show that students gained in their individual learning and work results through teamwork and interaction with other students. The inclusion of community service as an MOHE subject resonates with cooperative learning methods that enhance supportive relationships and develop students' social skills together with their professional skills.

Keywords: community service, cooperative learning, positive interdependence, teamwork

Procedia PDF Downloads 296
1301 Emotional Awareness and Working Memory as Predictive Factors for the Habitual Use of Cognitive Reappraisal among Adolescents

Authors: Yuri Kitahara

Abstract:

Background: Cognitive reappraisal refers to an emotion regulation strategy in which one changes the interpretation of emotion-eliciting events. Numerous studies show that cognitive reappraisal is associated with mental health and better social functioning. However, the examination of the predictive factors of adaptive emotion regulation remains an open issue. The present study examined the factors contributing to the habitual use of cognitive reappraisal, with a focus on emotional awareness and working memory. Methods: Data was collected from 30 junior high school students, using a Japanese version of the Emotion Regulation Questionnaire (ERQ), the Levels of Emotional Awareness Scale for Children (LEAS-C), and an N-back task. Results: A positive correlation between emotional awareness and cognitive reappraisal was observed in the high-working-memory group (r = .54, p < .05), whereas no significant relationship was found in the low-working-memory group. In addition, an analysis of variance (ANOVA) showed a significant interaction between emotional awareness and working memory capacity (F(1, 26) = 7.74, p < .05). Subsequent analysis of simple main effects confirmed that high working memory capacity significantly increases the use of cognitive reappraisal for high-emotional-awareness subjects, and significantly decreases it for low-emotional-awareness subjects. Discussion: These results indicate that when one has adequate capacity for simultaneous processing of information, explicit understanding of emotion contributes to adaptive cognitive emotion regulation. The findings are discussed in relation to neuroscientific claims.

Keywords: cognitive reappraisal, emotional awareness, emotion regulation, working memory

Procedia PDF Downloads 212
1300 Full-Field Estimation of Cyclic Threshold Shear Strain

Authors: E. E. S. Uy, T. Noda, K. Nakai, J. R. Dungca

Abstract:

Cyclic threshold shear strain is the cyclic shear strain amplitude that serves as the indicator of the development of pore water pressure. The parameter can be obtained by performing a cyclic triaxial test, shaking table test, cyclic simple shear test, or resonant column test. In a cyclic triaxial test, other researchers install measuring devices in close proximity to the soil to measure the parameter. In this study, an attempt was made to estimate the cyclic threshold shear strain using a full-field measurement technique, which uses a camera to monitor and measure the movement of the soil. Here, the technique was incorporated in a strain-controlled consolidated undrained cyclic triaxial test. The camera was first calibrated to ensure that it can properly measure deformation under cyclic loading, and its capacity to measure deformation was also investigated using a cylindrical rubber dummy. Two-dimensional image processing was implemented, and the Lucas-Kanade optical flow algorithm was applied to track the movement of the soil particles. Results from the full-field measurement technique were compared with the results from a linear variable displacement transducer. A range of values was determined from the estimation, owing to the nonhomogeneous deformation of the soil observed during cyclic loading. The minimum values were on the order of 10⁻²% in some areas of the specimen.
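The Lucas-Kanade step named above boils down to a small least-squares solve: within a window, each pixel contributes one linearised brightness-constancy equation, and the displacement is the solution of the resulting 2x2 normal equations. The sketch below is a single-window pure-Python version; the synthetic frames are an assumption for illustration, since the study tracked real soil imagery.

```python
def lk_flow(I1, I2, pts):
    """Recover one displacement (u, v) between two frames by solving the
    Lucas-Kanade 2x2 normal equations over the window of pixels in pts."""
    A11 = A12 = A22 = b1 = b2 = 0.0
    for (x, y) in pts:
        # spatial gradients averaged over both frames (central differences)
        ix = ((I1[y][x+1] - I1[y][x-1]) + (I2[y][x+1] - I2[y][x-1])) / 4.0
        iy = ((I1[y+1][x] - I1[y-1][x]) + (I2[y+1][x] - I2[y-1][x])) / 4.0
        it = I2[y][x] - I1[y][x]               # temporal gradient
        A11 += ix*ix; A12 += ix*iy; A22 += iy*iy
        b1 -= ix*it;  b2 -= iy*it
    det = A11*A22 - A12*A12                    # structure tensor must be invertible
    return ((A22*b1 - A12*b2) / det, (A11*b2 - A12*b1) / det)

# Quadratic intensity bowl translated one pixel to the right between frames
I1 = [[(x-3)**2 + (y-4)**2 for x in range(9)] for y in range(9)]
I2 = [[(x-4)**2 + (y-4)**2 for x in range(9)] for y in range(9)]
u, v = lk_flow(I1, I2, [(x, y) for x in range(2, 7) for y in range(2, 7)])
```

Practical implementations (e.g. in OpenCV) add pyramids and per-feature windows, but the core estimate per window is exactly this solve.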

Keywords: cyclic loading, cyclic threshold shear strain, full-field measurement, optical flow

Procedia PDF Downloads 220
1299 Extracellular Phytase from Lactobacillus fermentum spp KA1: Optimization of Enzyme Production and Its Application for Improving the Nutritional Quality of Rice Bran

Authors: Neha Sharma, Kanthi K. Kondepudi, Naveen Gupta

Abstract:

Phytases are phytate-specific phosphatases catalyzing the step-wise dephosphorylation of phytate, which acts as an anti-nutritional factor in food due to its strong capacity to bind minerals. In recent years, microbial phytases have been explored for improving the nutritional quality of food, but a major limitation is the acceptability of the source microorganisms. Efforts are therefore being made to isolate organisms that are generally regarded as safe for human consumption, such as lactic acid bacteria (LAB). Phytases from these organisms will have an edge over other phytase sources due to their probiotic attributes. Only a few LAB have been reported to show phytase activity, and that activity is generally intracellular. LAB producing extracellular phytase will be more useful, as the enzyme can degrade phytate more effectively; moreover, an enzyme from such an isolate will also have applications in food processing. Only a few species of Lactobacillus producing extracellular phytase have been reported so far. This study reports the isolation of a probiotic strain of Lactobacillus fermentum spp KA1 which produces extracellular phytase. Conditions for optimal phytase production were optimized, resulting in an approximately 13-fold increase in yield. The phytate degradation potential of the extracellular phytase was explored in rice bran, and conditions for optimal degradation were optimized. Under optimal conditions, there was a 43.26% release of inorganic phosphate and a 6.45% decrease in phytate content.

Keywords: Lactobacillus, phytase, phytate reduction, rice bran

Procedia PDF Downloads 177
1298 Information Retrieval from Internet Using Hand Gestures

Authors: Aniket S. Joshi, Aditya R. Mane, Arjun Tukaram

Abstract:

In the 21st century, the era of the e-world, people continuously receive daily information such as weather conditions, news, stock market updates, new projects, cricket updates, and other sports. When busy, they want this information with minimal use of the keyboard and minimal time. Today, to get such information, users must repeat the same mouse and keyboard actions, which costs time and causes inconvenience. In India, owing to rural backgrounds, many people are not very familiar with the use of computers and the internet. Likewise, in small clinics, small offices, hotels, and airports, there should be a system that retrieves daily information with minimal keyboard and mouse actions. We plan to design an application-based project that can easily retrieve information with minimal keyboard and mouse actions and make the task more convenient and easier. This is possible with an image processing application that captures real-time hand gestures, matches them in the system, and retrieves the corresponding information. In this project, we use real-time hand gesture movements to select the required option, which is presented on the screen in the form of RSS feeds; once a function is selected with a hand gesture, the system displays the corresponding information to the user. Real-time hand gestures make the application handier and easier to use.
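The HSV color model listed in the keywords is typically used to separate hand pixels from the background before blob detection. A minimal sketch of that thresholding step is shown below using Python's standard `colorsys` module; the threshold values are illustrative assumptions, not tuned skin-tone parameters.

```python
import colorsys

def skin_mask(rgb_pixels, h_max=0.11, s_min=0.2, v_min=0.3):
    """Flag pixels whose HSV values fall in a rough skin-tone band.
    The thresholds here are illustrative assumptions, not tuned values."""
    mask = []
    for (r, g, b) in rgb_pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        mask.append(h <= h_max and s >= s_min and v >= v_min)
    return mask

pixels = [(210, 150, 120),   # skin-like tone
          (30, 90, 200)]     # background blue
```

In a full pipeline, the resulting binary mask would be passed to blob detection and tracking to recognise the gesture; HSV is preferred over RGB here because hue is largely insensitive to lighting intensity.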

Keywords: hand detection, hand tracking, hand gesture recognition, HSV color model, blob detection

Procedia PDF Downloads 270
1297 Development of Mineral Carbonation Process from Ultramafic Tailings, Enhancing the Reactivity of Feedstocks

Authors: Sara Gardideh, Mansoor Barati

Abstract:

The mineral carbonation approach to reducing global warming has garnered worldwide interest. Owing to the benefits of permanent storage and abundant mineral resources, mineral carbonation (MC) is one of the most effective strategies for sequestering CO₂. Combining mineral processing for primary metal recovery with mineral carbonation for carbon sequestration is an emerging field of study with the potential to minimize capital costs. A detailed study of low-pressure gas-solid carbonation of ultramafic tailings in a dry environment has been accomplished. Thermogravimetry was utilized to track the changing structure of serpentine minerals and their reactivity as a function of temperature (300-900 °C), CO₂ partial pressure (25-90 mol%), and thermal preconditioning. The mismatch between the CO₂ van der Waals molecular diameter and the octahedral-tetrahedral lattice constants of serpentine was used to explain the mild carbonation reactivity. Serpentine requires additional thermal treatment to remove hydroxyl groups, resulting in the chemical transformation to pseudo-forsterite, a mineral composed of isolated SiO₄ tetrahedra linked by octahedrally coordinated magnesium ions. Heat treatment above 850 °C is adequate to remove chemically bound water from the lattice. Particles with a diameter < 34 μm are desirable, and serpentine thermally treated at 850 °C for 2.30 hours reached 65% CO₂ storage capacity. Decreasing the particle size, increasing the temperature, and applying magnetic separation can dramatically enhance carbonation.

Keywords: particle size, thermogravimetry, thermal-treatment, serpentine

Procedia PDF Downloads 70
1296 Glaucoma Detection in Retinal Tomography Using the Vision Transformer

Authors: Sushish Baral, Pratibha Joshi, Yaman Maharjan

Abstract:

Glaucoma is a chronic eye condition that causes irreversible vision loss. Because it can be asymptomatic, early detection and treatment are critical to preventing vision loss. Multiple deep learning algorithms are used for the identification of glaucoma. Transformer-based architectures, which use the self-attention mechanism to encode long-range dependencies and acquire highly expressive representations, have recently become popular. Convolutional architectures, on the other hand, lack knowledge of long-range dependencies in the image due to their intrinsic inductive biases. These observations inspire this thesis to look at transformer-based solutions and investigate the viability of adopting transformer-based network designs for glaucoma detection. Developing a viable algorithm that assesses the severity of glaucoma from retinal fundus images of the optic nerve head necessitates a large number of well-curated images. Initially, data is generated by augmenting the ocular images. After that, the ocular images are pre-processed to make them ready for further processing. The system is trained on the pre-processed images, and it classifies the input images as normal or glaucoma based on the features retrieved during training. The Vision Transformer (ViT) architecture is well suited to this task, as its self-attention mechanism supports structural modeling. Extensive experiments are run on a common dataset, and the results are thoroughly validated and visualized.
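To make the "self-attention over long-range dependencies" point concrete, here is a minimal single-head self-attention layer over image patch embeddings in plain NumPy; the patch count, embedding size, and random weights are illustrative stand-ins, not the ViT configuration used in the thesis.

```python
import numpy as np

# Minimal single-head self-attention over patch tokens, the core operation
# the abstract attributes to the ViT; shapes are illustrative only.
rng = np.random.default_rng(0)
n_patches, d = 16, 32                       # e.g. a 4x4 grid of patch tokens
x = rng.standard_normal((n_patches, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) * d**-0.5 for _ in range(3))

def self_attention(x, Wq, Wk, Wv):
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(q.shape[-1])            # pairwise patch affinities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over all patches
    return weights @ v                                 # long-range mixing of patches

out = self_attention(x, Wq, Wk, Wv)
```

Because every patch attends to every other patch, even distant regions of the fundus image can interact in a single layer, which is the contrast with convolutional locality drawn above.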

Keywords: glaucoma, vision transformer, convolutional architectures, retinal fundus images, self-attention, deep learning

Procedia PDF Downloads 175
1295 Application of GPRS in Water Quality Monitoring System

Authors: V. Ayishwarya Bharathi, S. M. Hasker, J. Indhu, M. Mohamed Azarudeen, G. Gowthami, R. Vinoth Rajan, N. Vijayarangan

Abstract:

Identification of water quality conditions in a river system based on limited observations is an essential task for meeting the goals of environmental management. The traditional method of water quality testing is to collect samples manually and send them to a laboratory for analysis; however, this can no longer meet the demands of water quality monitoring today. A system for automatic measurement and reporting of water quality has therefore been developed. In this project, water quality parameters collected by a multi-parameter water quality probe are transmitted to a data processing and monitoring center through the GPRS wireless communication network. The multi-parameter sensor is placed directly above the water level. The monitoring center consists of a GPRS modem and a micro-controller that monitor the data, and the collected data can be viewed at any instant. At the pollution control board, the water quality sensor data are monitored on a computer using Visual Basic software. The system collects, transmits, and processes water quality parameters automatically, so production efficiency and economic benefit are greatly improved. GPRS technology performs well in complex environments where water quality would otherwise go unmonitored, and it is particularly applicable to automatic data transmission and monitoring of water-analysis equipment in the field at the collection point.
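A minimal sketch of the monitoring-centre side, assuming the probe pushes simple comma-separated records over GPRS; the field layout (station id, pH, turbidity in NTU, dissolved oxygen in mg/L) and the pH alarm band are hypothetical, not taken from the project.

```python
# Hypothetical record format for a multi-parameter probe pushing data over
# GPRS, and how the monitoring centre could parse and screen it.
def parse_record(line):
    station, ph, turbidity, oxygen = line.strip().split(",")
    reading = {"station": station, "pH": float(ph),
               "turbidity_ntu": float(turbidity), "do_mg_l": float(oxygen)}
    # Flag readings outside an example acceptable pH band (assumed threshold).
    reading["alarm"] = not (6.5 <= reading["pH"] <= 8.5)
    return reading

r = parse_record("ST01,7.2,3.8,6.4")
```

In a deployment, records like these would arrive continuously and be logged and displayed by the Visual Basic front end described above.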

Keywords: multiparameter sensor, GPRS, visual basic software, RS232

Procedia PDF Downloads 387
1294 Predicting Response to Cognitive Behavioral Therapy for Psychosis Using Machine Learning and Functional Magnetic Resonance Imaging

Authors: Eva Tolmeijer, Emmanuelle Peters, Veena Kumari, Liam Mason

Abstract:

Cognitive behavioral therapy for psychosis (CBTp) is effective in many but not all patients, making it important to better understand the factors that determine treatment outcomes. To date, no studies have examined whether neuroimaging can make clinically useful predictions about who will respond to CBTp. To this end, we used machine learning methods that make predictions about symptom improvement at the individual patient level. Prior to receiving CBTp, 22 patients with a diagnosis of schizophrenia completed a social-affective processing task during functional MRI. Multivariate pattern analysis assessed whether treatment response could be predicted by brain activation responses to facial affect that was either socially threatening or prosocial. The resulting models did significantly predict symptom improvement, with distinct multivariate signatures predicting psychotic (r=0.54, p=0.01) and affective (r=0.32, p=0.05) symptoms. Psychotic symptom improvement was accurately predicted from relatively focal threat-related activation across hippocampal, occipital, and temporal regions; affective symptom improvement was predicted by a more dispersed profile of responses to prosocial affect. These findings enrich our understanding of the neurobiological underpinning of treatment response. This study provides a foundation that will hopefully lead to greater precision and tailoring of the interventions offered to patients.
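The individual-level prediction scheme can be sketched as leave-one-out cross-validation of a linear multivariate model, with the out-of-sample correlation r as the reported statistic; the data below are synthetic and the simple linear model is a stand-in for the multivariate pattern analysis actually used.

```python
import numpy as np

# Sketch of leave-one-out prediction of symptom improvement from brain
# activation features; 22 "patients" as in the study, but synthetic data.
rng = np.random.default_rng(1)
n, p = 22, 5                                    # patients, activation features
X = rng.standard_normal((n, p))
w_true = rng.standard_normal(p)
y = X @ w_true + 0.5 * rng.standard_normal(n)   # observed improvement scores

preds = np.empty(n)
for i in range(n):                              # hold out one patient at a time
    mask = np.arange(n) != i
    w, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
    preds[i] = X[i] @ w

r = np.corrcoef(preds, y)[0, 1]                 # out-of-sample correlation
```

The key point is that each patient's prediction comes from a model that never saw that patient, which is what licenses statements like "predicted psychotic symptom improvement (r=0.54)".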

Keywords: cognitive behavioral therapy, machine learning, psychosis, schizophrenia

Procedia PDF Downloads 264
1293 Electrophoretic Light Scattering Based on Total Internal Reflection as a Promising Diagnostic Method

Authors: Ekaterina A. Savchenko, Elena N. Velichko, Evgenii T. Aksenov

Abstract:

The development of pathological processes, such as cardiovascular and oncological diseases, is accompanied by changes in molecular parameters in cells, tissues, and serum. The study of the behavior of protein molecules in solutions is of primary importance for the diagnosis of such diseases. Various physical and chemical methods are used to study molecular systems. With the advent of the laser and advances in electronics, optical methods, such as scanning electron microscopy, sedimentation analysis, nephelometry, and static and dynamic light scattering, have become the most universal, informative and accurate tools for estimating the parameters of nanoscale objects. Electrophoretic light scattering is the most effective technique. It has a high potential in the study of biological solutions and their properties. This technique allows one to investigate the processes of aggregation and dissociation of different macromolecules and obtain information on their shapes, sizes and molecular weights. Electrophoretic light scattering is an analytical method for registration of the motion of microscopic particles under the influence of an electric field by means of quasi-elastic light scattering in a homogeneous solution, with subsequent registration of the spectral or correlation characteristics of the light scattered from the moving object. We modified the technique by using the regime of total internal reflection with the aim of increasing its sensitivity and reducing the volume of the sample to be investigated, which opens the prospect of automating simultaneous multiparameter measurements. In addition, the total internal reflection regime allows one to study biological fluids at the level of single molecules, which also increases the sensitivity and the informativeness of the results, because the data obtained from an individual molecule are not averaged over an ensemble — important in the study of biomolecular fluids.
To the best of our knowledge, the study of electrophoretic light scattering in the regime of total internal reflection is proposed here for the first time. Latex microspheres 1 μm in size were used as test objects. In this study, the total internal reflection regime was realized on a quartz prism on which the free electrophoresis regime was set. A semiconductor laser with a wavelength of 655 nm was used as the radiation source, and the light scattering signal was registered by a PIN photodiode. The signal from the photodetector was then transmitted to a digital oscilloscope and a computer. The autocorrelation functions and the fast Fourier transform were calculated in the regime of Brownian motion and under the action of the field to obtain the parameters of the investigated object. The main result of the study was the dependence of the autocorrelation function on the concentration of microspheres and the magnitude of the applied field. The effect of heating became more pronounced with increasing sample concentration and electric field. The results obtained in our study demonstrate the applicability of the method to the examination of liquid solutions, including biological fluids.
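The signal chain described above (photodetector trace → autocorrelation function and FFT) can be sketched as follows; the Doppler beat frequency, sample rate, and noise level are invented for illustration.

```python
import numpy as np

# Illustrative analysis of a simulated scattered-light signal: a sinusoidal
# Doppler beat (particle drift in the applied field) plus detector noise.
fs, f_doppler, n = 10_000.0, 200.0, 4096        # sample rate, beat freq (assumed)
t = np.arange(n) / fs
sig = np.sin(2*np.pi*f_doppler*t) + 0.3*np.random.default_rng(2).standard_normal(n)
sig = sig - sig.mean()

acf = np.correlate(sig, sig, mode="full")[n-1:]  # one-sided autocorrelation
acf = acf / acf[0]                               # normalise so acf(0) = 1

spectrum = np.abs(np.fft.rfft(sig))              # FFT peak sits near the beat
freqs = np.fft.rfftfreq(n, 1/fs)
peak = freqs[spectrum.argmax()]                  # recovered drift frequency, Hz
```

The recovered peak frequency is proportional to the electrophoretic drift velocity, which is how the measured spectrum maps back to particle mobility.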

Keywords: light scattering, electrophoretic light scattering, electrophoresis, total internal reflection

Procedia PDF Downloads 198
1292 Perceiving Casual Speech: A Gating Experiment with French Listeners of L2 English

Authors: Naouel Zoghlami

Abstract:

Spoken-word recognition involves the simultaneous activation of potential word candidates, which compete with each other for final correct recognition. In continuous speech, the activation-competition process becomes more complicated due to the speech reductions that occur at word boundaries. Lexical processing is more difficult in L2 than in L1 because L2 listeners often lack phonetic, lexico-semantic, syntactic, and prosodic knowledge in the target language. In this study, we investigate the on-line lexical segmentation hypotheses that French listeners of L2 English form and then revise as subsequent perceptual evidence is revealed. Our purpose is to shed further light on the processes of L2 spoken-word recognition in context and to better understand L2 listening difficulties through a comparison of skilled and unskilled listeners' reactions at the point where their working hypothesis is rejected. We use a variant of the gating experiment in which subjects transcribe an English sentence presented in increments of progressively greater duration. The spoken sentence was “And this amazing athlete has just broken another world record”, chosen mainly because it includes reductions and phonetic features common in English, such as elision and assimilation. Our preliminary results show an important difference in the manner in which proficient and less-proficient L2 listeners handle connected speech: less-proficient listeners delay recognition of words as they wait for lexical and syntactic evidence to appear in the gates. Further statistical analyses are currently underway.

Keywords: gating paradigm, spoken word recognition, online lexical segmentation, L2 listening

Procedia PDF Downloads 451
1291 Denoising Transient Electromagnetic Data

Authors: Lingerew Nebere Kassie, Ping-Yu Chang, Hsin-Hua Huang, Chaw-Son Chen

Abstract:

Transient electromagnetic (TEM) data plays a crucial role in hydrogeological and environmental applications, providing valuable insights into geological structures and resistivity variations. However, the presence of noise often hinders the interpretation and reliability of these data. Our study addresses this issue by utilizing a FASTSNAP system for the TEM survey, which operates in different modes (low, medium, and high) with continuous adjustments to discretization, gain, and current. We employ a denoising approach that processes the raw data obtained from each acquisition mode to improve signal quality and enhance data reliability. We use a signal-averaging technique for each mode, increasing the signal-to-noise ratio. Additionally, we utilize the wavelet transform to suppress noise further while preserving the integrity of the underlying signals. This approach significantly improves the data quality, notably suppressing severe noise at late times. The resulting denoised data exhibits a substantially improved signal-to-noise ratio, leading to increased accuracy in parameter estimation. By effectively denoising TEM data, our study contributes to a more reliable interpretation and analysis of underground structures. Moreover, the proposed denoising approach can be seamlessly integrated into existing ground-based TEM data processing workflows, facilitating the extraction of meaningful information from noisy measurements and enhancing the overall quality and reliability of the acquired data.
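The signal-averaging step can be illustrated in a few lines: averaging N repeated transients suppresses uncorrelated noise by roughly √N. The transient shape and noise level below are synthetic, not FASTSNAP data.

```python
import numpy as np

# Stacking (signal averaging) of repeated transients: uncorrelated noise
# falls as ~1/sqrt(N) while the coherent decay is preserved.
rng = np.random.default_rng(3)
t = np.linspace(1e-5, 1e-2, 500)
decay = np.exp(-t / 2e-3)                                 # idealised TEM transient
stack = decay + 0.2 * rng.standard_normal((64, t.size))   # 64 noisy repeats

averaged = stack.mean(axis=0)
noise_single = np.std(stack[0] - decay)                   # one-shot noise level
noise_stacked = np.std(averaged - decay)                  # ~ noise_single / 8
```

In the actual workflow this averaged trace would then be passed to the wavelet-thresholding stage for further late-time noise suppression.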

Keywords: data quality, signal averaging, transient electromagnetic, wavelet transform

Procedia PDF Downloads 73
1290 Diagnostic Yield of CTPA and Value of Pre-Test Assessments in Predicting the Probability of Pulmonary Embolism

Authors: Shanza Akram, Sameen Toor, Heba Harb Abu Alkass, Zainab Abdulsalam Altaha, Sara Taha Abdulla, Saleem Imran

Abstract:

Acute pulmonary embolism (PE) is a common disease and can be fatal. The clinical presentation is variable and nonspecific, making accurate diagnosis difficult. Testing of patients with suspected acute PE has increased dramatically. However, the overuse of some tests, particularly CT and D-dimer measurement, may not improve care while potentially leading to patient harm and unnecessary expense. CTPA is the investigation of choice for PE. Its easy availability, accuracy and ability to provide an alternative diagnosis have lowered the threshold for performing it, resulting in its overuse. Guidelines have recommended the use of clinical pretest probability tools such as the Wells score to assess the risk of suspected PE. Unfortunately, implementation of guidelines in clinical practice is inconsistent. This has led to low-risk patients being subjected to unnecessary imaging, exposure to radiation and possible contrast-related complications. Aim: To study the diagnostic yield of CTPA and the clinical pretest probability of patients according to the Wells score, and to determine whether there was overuse of CTPA in our service. Methods: CT scans performed on patients with suspected PE in our hospital from 1st January 2014 to 31st December 2014 were retrospectively reviewed. Medical records were reviewed to study demographics, clinical presentation and final diagnosis, and to establish whether the Wells score and D-dimer were used correctly in predicting the probability of PE and the need for subsequent CTPA. Results: 100 patients (51 male) underwent CTPA in the time period. Mean age was 57 years (range 24-91 years). The majority of patients presented with shortness of breath (52%). Other presenting symptoms included chest pain (34%), palpitations (6%), collapse (5%) and haemoptysis (5%). A D-dimer test was done in 69%. Overall, the Wells score was low (<2) in 28%, moderate (2-6) in 47% and high (>6) in 15% of patients. The Wells score was documented in the medical notes of only 20% of patients.
PE was confirmed in 12% of patients (8 male); 4 had bilateral PEs. In the high-risk group (Wells >6, n=15), there were 5 diagnosed PEs; in the moderate-risk group (Wells 2-6, n=47), there were 6; and in the low-risk group (Wells <2, n=28), one case of PE was confirmed. CT scans negative for PE showed pleural effusion in 30 patients, consolidation in 20, atelectasis in 15 and a pulmonary nodule in 4; 31 scans were completely normal. Conclusion: The yield of CT for pulmonary embolism was low in our cohort at 12%. A significant number of our patients who underwent CTPA had a low Wells score, suggesting that CTPA is overutilized in our institution. The Wells score was poorly documented in the medical notes. CTPA was able to detect alternative pulmonary abnormalities explaining the patients' clinical presentation. CTPA requires concomitant pretest clinical probability assessment to be an effective diagnostic tool for confirming or excluding PE. Clinicians should use validated clinical prediction rules to estimate pretest probability in patients in whom acute PE is being considered. Combining Wells scores with clinical and laboratory assessment may reduce the need for CTPA.
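For reference, the pretest assessment reads in code roughly as follows, using the standard Wells point values and the risk bands applied in this audit (<2 low, 2-6 moderate, >6 high); how boundary scores of exactly 2 or 6 are binned is an assumption here.

```python
# Wells pretest-probability sketch for suspected PE. Point values follow the
# standard Wells criteria; the three risk bands match those used in the audit.
WELLS_ITEMS = {
    "clinical_signs_dvt": 3.0,
    "pe_most_likely_diagnosis": 3.0,
    "heart_rate_over_100": 1.5,
    "immobilisation_or_recent_surgery": 1.5,
    "previous_dvt_or_pe": 1.5,
    "haemoptysis": 1.0,
    "malignancy": 1.0,
}

def wells_score(findings):
    score = sum(WELLS_ITEMS[item] for item in findings)
    if score < 2:
        risk = "low"
    elif score <= 6:          # boundary handling is an assumption
        risk = "moderate"
    else:
        risk = "high"
    return score, risk

score, risk = wells_score(["heart_rate_over_100", "haemoptysis"])
```

Documenting a result like this in the notes before requesting imaging is precisely the step the audit found missing in 80% of cases.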

Keywords: CTPA, D-dimer, pulmonary embolism, Wells score

Procedia PDF Downloads 213
1289 The Internet of Things: A Survey of Authentication Mechanisms and Protocols for the Shifting Paradigm of Communicating Entities

Authors: Nazli Hardy

Abstract:

A multidisciplinary application of computer science and interactive database-driven web applications, the Internet of Things (IoT) represents a digital ecosystem that has pervasive technological, social, and economic impact on the human population. It is a long-term technology, and its development is built around the connection of everyday objects to the Internet. It is estimated that by 2020, with billions of people connected to the Internet, the number of connected devices will exceed 50 billion; IoT thus represents a paradigm shift in our current interconnected ecosystem, a communication shift that will unavoidably affect people, businesses, consumers, clients, and employees. By nature, in order to provide a cohesive and integrated service, connected devices need to collect, aggregate, store, mine, and process personal and personalized data on individuals and corporations in a variety of contexts and environments. A significant factor in this paradigm shift is the necessity for secure and appropriate transmission, processing and storage of the data. Thus, while the benefits of the applications appear to be boundless, these same opportunities are bounded by concerns such as trust, privacy, security, loss of control, and related issues. This poster and presentation examine multi-factor authentication (MFA) mechanisms that need to evolve from the login-password tuple to an Identity and Access Management (IAM) model, and on to the more cohesive Identity Relationship Management (IRM) standard. It also compares and contrasts messaging protocols that are appropriate for the IoT ecosystem.

Keywords: Internet of Things (IoT), authentication, protocols, survey

Procedia PDF Downloads 284
1288 The Time-Frequency Domain Reflection Method for Aircraft Cable Defects Localization

Authors: Reza Rezaeipour Honarmandzad

Abstract:

This paper introduces an aircraft cable fault detection and location method based on time-frequency domain reflectometry (TFDR), with the goal of recognizing intermittent faults effectively and addressing the serial and after-connector faults that are hard to distinguish with time domain reflectometry. In this method, the cross-correlation of the reflected and reference signals is used to detect and locate the fault according to the characteristics of both signals in the time-frequency domain, so the hit rate for detecting and locating intermittent faults can be improved considerably. In practice, the reflected signal is corrupted by noise and false alarms occur frequently, so threshold de-noising based on wavelet decomposition is used to reduce the noise interference and lower the false alarm rate. The time-frequency cross-correlation function of the reference and reflected signals, based on the Wigner-Ville distribution, is then computed to locate the fault position. Finally, LabVIEW is used to implement the operation and control interface, whose primary function is to connect to and control MATLAB and LabSQL. Thanks to the powerful computing capability and extensive function library of MATLAB, the signal processing is easily realized; in addition, LabVIEW helps make the framework more dependable and easier to upgrade.
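A toy version of the correlation-based localisation step: the lag of the cross-correlation peak between the reference pulse and the reflected signal gives the round-trip delay, and hence the distance to the fault. The sample rate, propagation velocity, and pulse shape are assumed values, and plain time-domain cross-correlation stands in here for the Wigner-Ville based time-frequency correlation used in the paper.

```python
import numpy as np

# Toy cross-correlation fault localisation on a simulated cable reflection.
fs = 1e9                                   # 1 GSa/s sampling (assumed)
v = 2e8                                    # propagation speed in cable, m/s (assumed)
n = 2000
ref = np.zeros(n); ref[:50] = np.hanning(50)        # reference probe pulse
delay_samples = 600                                  # simulated round-trip delay
refl = np.roll(ref, delay_samples) * 0.4             # attenuated reflection
refl += 0.02 * np.random.default_rng(4).standard_normal(n)

xcorr = np.correlate(refl, ref, mode="full")
lag = xcorr.argmax() - (n - 1)                       # round-trip delay in samples
distance = lag / fs * v / 2                          # one-way distance to fault, m
```

Dividing the round-trip delay by two converts it to the one-way distance; with these assumed parameters a 600-sample lag maps to a fault roughly 60 m down the cable.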

Keywords: aircraft cable, fault location, TFDR, LabVIEW

Procedia PDF Downloads 464
1287 Efficacy of Carvacrol as an Antimicrobial Wash Treatment for Reducing Both Campylobacter jejuni and Aerobic Bacterial Counts on Chicken Skin

Authors: Sandip Shrestha, Ann M. Donoghue, Komala Arsi, Basanta R. Wagle, Abhinav Upadhyay, Dan J. Donoghue

Abstract:

Campylobacter, one of the major causes of foodborne illness worldwide, is commonly present in the intestinal tract of poultry. Many strategies are currently being investigated to reduce Campylobacter counts on commercial poultry during processing, with limited success. This study investigated the efficacy of the generally recognized as safe (GRAS) compound carvacrol (CR), a component of wild oregano oil, as a wash treatment for reducing C. jejuni and aerobic bacteria on chicken skin. A total of two trials were conducted; in each trial, 75 skin samples (4 cm × 4 cm each) were randomly allocated into 5 treatment groups (0%, 0.25%, 0.5%, 1% and 2% CR). Skin samples were inoculated with a cocktail of four wild strains of C. jejuni (~8 log₁₀ CFU/skin). After 30 min of attachment, inoculated skin samples were dipped in the respective treatment solution for 1 min, allowed to drip dry for 2 min, and processed at 0, 8, and 24 h post treatment for enumeration of C. jejuni and aerobic bacterial counts (n=5/treatment/time point). The data were analyzed by ANOVA using the PROC GLM procedure of SAS 9.3. All tested doses of CR suspension consistently reduced C. jejuni counts across all time points. The 2% CR wash was the most effective treatment and reduced C. jejuni counts by ~4 log₁₀ CFU/sample (P < 0.05). Aerobic counts were reduced by the 0.5% CR dose at 0 and 24 h in Trial 1 and at 0, 8 and 24 h in Trial 2. The 1% and 2% CR doses consistently reduced aerobic counts in both trials, by up to 2 log₁₀ CFU/skin.
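The "~4 log₁₀ CFU/sample" reduction is simple log arithmetic, sketched below with made-up counts (an 8 log₁₀ inoculum reduced to 4 log₁₀); the numbers are illustrative, not data from the trials.

```python
import math

# Log reduction = log10(control count) - log10(treated count).
def log_reduction(cfu_control, cfu_treated):
    return math.log10(cfu_control) - math.log10(cfu_treated)

r = log_reduction(1e8, 1e4)   # 8 log10 inoculum down to 4 log10 survivors
```

A 4-log reduction corresponds to killing 99.99% of the inoculated cells, which conveys the practical weight of the result above.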

Keywords: Campylobacter jejuni, carvacrol, chicken skin, postharvest

Procedia PDF Downloads 164
1286 Localized Analysis of Cellulosic Fibrous Insulation Materials

Authors: Chady El Hachem, Pan Ye, Kamilia Abahri, Rachid Bennacer

Abstract:

Considered as a building construction material, and in view of its environmental benefits, wood fiber insulation is the material of interest in this work. Defining an adequate representative elementary volume that guarantees a reliable understanding of the macroscopic hygrothermal phenomena is critical. At the microscopic scale, fibers undergo local dimensional variations when subjected to hygric solicitations. It is therefore necessary to master this behavior, which affects the global response of the material. This study consists of an experimental procedure using a non-destructive method, X-ray tomography, followed by morphological post-processing analysis using the ImageJ software. A refined investigation took place in order to identify the representative elementary volume and the resolution sufficient for accurate structural analysis. The second part of this work evaluated the microscopic hygric behavior of the studied material. Many parameters were taken into consideration, such as the evolution of the fiber diameters and their distribution along the sorption cycle, the porosity, and the evolution of the water content. In addition, heat transfer simulations based on resolution of the energy equation were carried out on the real structure. The problematic of the representative elementary volume was thus elaborated for such a heterogeneous material. Moreover, the material's porosity and its fiber thicknesses correlate strongly with the water content. These results provide the literature with a good understanding of the behavior of wood fiber insulation.
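In morphological post-processing of tomographic data, the porosity estimate boils down to the void-voxel fraction of the binarised volume; the sketch below uses a synthetic grey-value volume and an arbitrary threshold in place of real reconstructed slices.

```python
import numpy as np

# Porosity from a binarised tomographic volume: fraction of void voxels.
rng = np.random.default_rng(5)
volume = rng.random((64, 64, 64))      # stand-in for reconstructed grey values
solid = volume > 0.7                   # threshold separating fibre from void (assumed)

porosity = 1.0 - solid.mean()          # void fraction of the sampled volume
```

Repeating this estimate over sub-volumes of increasing size, until the value stabilises, is one common way to identify a representative elementary volume.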

Keywords: hygric behavior, morphological characterization, wood fiber insulation material, x-ray tomography

Procedia PDF Downloads 249
1285 Wireless Information Transfer Management and Case Study of a Fire Alarm System in a Residential Building

Authors: Mohsen Azarmjoo, Mehdi Mehdizadeh Koupaei, Maryam Mehdizadeh Koupaei, Asghar Mahdlouei Azar

Abstract:

The increasing prevalence of wireless networks in our daily lives has made them indispensable. The aim of this research is to investigate the management of information transfer in wireless networks and the integration of renewable solar energy resources in a residential building. The focus is on the transmission of electricity and information through wireless networks, as well as the utilization of sensors and wireless fire alarm systems. The research employs a descriptive approach to examine the transmission of electricity and information on a wireless network with electric and optical telephone lines. It also investigates the transmission of signals from sensors and wireless fire alarm systems via radio waves. The methodology includes a detailed analysis of security, comfort conditions, and costs related to the use of wireless networks and renewable solar energy resources. The study finds that it is feasible to transmit electricity over a network cable using two of its wire pairs, without the need for separate power cabling. Additionally, the integration of renewable solar energy systems in residential buildings can reduce dependence on traditional energy carriers. The use of sensors and wireless remote information processing can enhance the safety and efficiency of energy usage in buildings and the surrounding spaces.

Keywords: renewable energy, intelligentization, wireless sensors, fire alarm system

Procedia PDF Downloads 38
1284 Perceiving Interpersonal Conflict and the Big Five Personality Traits

Authors: Emily Rivera, Toni DiDona

Abstract:

The Big Five personality traits form a hierarchical classification of personality that applies factor analysis to personality survey data in order to describe human personality using five broad dimensions: extraversion, agreeableness, conscientiousness, neuroticism, and openness (Fetvadjiev & Van de Vijer, 2015). Research shows that personality constructs underlie individual differences in processing conflict and interpersonal relations (Graziano et al., 1996). This research explores the understudied correlation between the Big Five personality traits and perceived interpersonal conflict in the workplace. It reviews the social psychological literature on the Big Five personality traits within a social context and discusses organizational development journal articles on the perceived efficacy of conflict tactics and approaches to interpersonal relationships. The study also presents research undertaken on a survey group of 867 subjects over the age of 18, recruited by means of convenience sampling through social media, email, and text messaging. The central finding of this study is that only two of the Big Five personality traits had a significant correlation with perceiving interpersonal conflict in the workplace: individuals who score higher on agreeableness and neuroticism perceive more interpersonal conflict in the workplace than those who score lower on each dimension. The relationship between the two constructs is worthy of research due to its everyday frequency and unique individual psycho-social consequences. This multimethod research associated the Big Five personality dimensions with interpersonal conflict; its findings can be utilized to further understand social cognition, person perception, complex social behavior, and social relationships in the work environment.
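The basic statistic behind the reported finding is a trait-by-conflict correlation; the sketch below simulates 867 responses with a built-in positive neuroticism-conflict link purely for illustration, not with the study's data.

```python
import numpy as np

# Illustrative correlation of one trait score with perceived conflict,
# the core statistic behind the reported findings; responses are simulated.
rng = np.random.default_rng(6)
n = 867                                                   # sample size as in the study
neuroticism = rng.normal(3.0, 0.7, n)                     # Likert-style 1-5 scores
conflict = 0.4 * neuroticism + rng.normal(0, 0.7, n)      # built-in positive link

r = np.corrcoef(neuroticism, conflict)[0, 1]              # Pearson correlation
```

A positive r here mirrors the direction of the reported effect: higher neuroticism co-occurring with more perceived workplace conflict.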

Keywords: five-factor model, interpersonal conflict, personality, The Big Five personality traits

Procedia PDF Downloads 142