Search results for: refractive errors
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1075

835 Efficient Principal Components Estimation of Large Factor Models

Authors: Rachida Ouysse

Abstract:

This paper proposes a constrained principal components (CnPC) estimator for efficient estimation of large-dimensional factor models when errors are cross-sectionally correlated and the number of cross-sections (N) may be larger than the number of observations (T). Although the principal components (PC) method is consistent for any path of the panel dimensions, it is inefficient because it treats the errors as homoskedastic and uncorrelated. The new CnPC exploits the assumption of bounded cross-sectional dependence, which defines Chamberlain and Rothschild’s (1983) approximate factor structure, as an explicit constraint and solves a constrained PC problem. The CnPC method is computationally equivalent to the PC method applied to a regularized form of the data covariance matrix. Unlike maximum likelihood type methods, the CnPC method does not require inverting a large covariance matrix and thus is valid for panels with N ≥ T. The paper derives a convergence rate and an asymptotic normality result for the CnPC estimators of the common factors. We provide feasible estimators and show in a simulation study that they are more accurate than the PC estimator, especially for panels with N larger than T, and than the generalized PC type estimators, especially for panels with N almost as large as T.
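The core mechanics described above — plain PC estimation, and a constrained variant obtained by running PC on a regularized covariance matrix — can be sketched as follows. This is a minimal illustration, not the paper's estimator: the shrinkage regularizer and the `alpha` weight are hypothetical stand-ins for the CnPC constraint.

```python
import numpy as np

def pc_factors(X, r):
    """Principal components factor estimates: sqrt(T) times the top-r
    eigenvectors of the T x T matrix X X' / (N T)."""
    T, N = X.shape
    S = X @ X.T / (N * T)
    _, vecs = np.linalg.eigh(S)          # eigenvalues in ascending order
    return vecs[:, ::-1][:, :r] * np.sqrt(T)

def cnpc_factors(X, r, alpha=0.1):
    """Illustrative 'constrained' variant: PC applied to a shrinkage-
    regularized covariance. The regularizer and alpha are hypothetical
    stand-ins for the paper's cross-sectional dependence constraint."""
    T, N = X.shape
    S = X @ X.T / (N * T)
    S_reg = (1 - alpha) * S + alpha * (np.trace(S) / T) * np.eye(T)
    _, vecs = np.linalg.eigh(S_reg)
    return vecs[:, ::-1][:, :r] * np.sqrt(T)

# Simulated one-factor panel with N > T, as in the paper's setting
rng = np.random.default_rng(0)
T, N, r = 50, 200, 1
f = rng.standard_normal((T, r))          # true factor
lam = rng.standard_normal((N, r))        # loadings
X = f @ lam.T + 0.5 * rng.standard_normal((T, N))

F_pc = pc_factors(X, r)
F_cn = cnpc_factors(X, r)
# estimated factors should track the true one up to sign
corr_pc = abs(np.corrcoef(F_pc[:, 0], f[:, 0])[0, 1])
corr_cn = abs(np.corrcoef(F_cn[:, 0], f[:, 0])[0, 1])
print(round(corr_pc, 3), round(corr_cn, 3))
```

Note that the shrinkage target here (a scaled identity) preserves the eigenvectors of S; the paper's actual regularization, built from the bounded cross-sectional dependence constraint, differs.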

Keywords: high dimensionality, unknown factors, principal components, cross-sectional correlation, shrinkage regression, regularization, pseudo-out-of-sample forecasting

Procedia PDF Downloads 124
834 Formulation of a Stress Management Program for Human Error Prevention in Nuclear Power Plants

Authors: Hyeon-Kyo Lim, Tong-il Jang, Yong-Hee Lee

Abstract:

As for any nuclear power plant, human error is one of the most dreaded factors that may result in unexpected accidents. Thus, for accident prevention, it is indispensable to analyze and manage the influence of any factor that may raise the possibility of human error. Among many factors, stress has been reported to have a significant influence on human performance. A person's stress level may fluctuate over time; to handle this variation, a robust stress management program is required, especially in nuclear power plants. Therefore, to reduce the possibility of human errors, this study aimed to develop a stress management program as part of a Fitness-for-Duty (FFD) program for workers in nuclear power plants. Since the meaning of FFD can differ by research objective, an appropriate definition of FFD was established in this study with special reference to human error prevention, and diverse stress factors were elicited for the management of human error susceptibility. In addition, with consideration of conventional FFD management programs, appropriate tests and interventions were introduced over the whole employment cycle, including selection and screening of workers, job allocation, job rotation, and separation, as well as an Employee Assistance Program (EAP). The results showed that most tools concentrated their weights on common organizational factors such as Demands, Supports, and Relationships, in that order, which were identified as major stress factors.

Keywords: human error, accident prevention, work performance, stress, fatigue

Procedia PDF Downloads 300
833 Phase Transition of Aqueous Ternary (THF + Polyvinylpyrrolidone + H2O) System as Revealed by Terahertz Time-Domain Spectroscopy

Authors: Hyery Kang, Dong-Yeun Koh, Yun-Ho Ahn, Huen Lee

Abstract:

Determining the behavior of clathrate hydrates with inhibitors in the THz region will provide useful information about hydrate plug control in the upstream oil and gas industry. In this study, terahertz time-domain spectroscopy (THz-TDS) revealed the inhibition of the THF clathrate hydrate system dosed with polyvinylpyrrolidone (PVP) of three different molecular weights. Distinct footprints of the phase transition in the THz region (0.4–2.2 THz) were analyzed, and the absorption coefficients and the real part of the refractive index were obtained in the temperature range of 253 K to 288 K. Along with the optical properties, the ring breathing and stretching modes for the different molecular weights of PVP in THF hydrate were analyzed by Raman spectroscopy.

Keywords: clathrate hydrate, terahertz spectroscopy, tetrahydrofuran, inhibitor

Procedia PDF Downloads 313
832 True Single SKU Script: Applying the Automated Test to Set Software Properties in a Global Software Development Environment

Authors: Antonio Brigido, Maria Meireles, Francisco Barros, Gaspar Mota, Fernanda Terra, Lidia Melo, Marcelo Reis, Camilo Souza

Abstract:

As the globalization of the software process advances, companies are increasingly committed to improving software development technologies across multiple locations. On the other hand, working with teams distributed across different locations also raises new challenges. In this sense, automated processes can help to improve the quality of process execution. Therefore, this work presents the development of a tool called TSS Script, which automates the sample preparation process for carrier requirements validation tests. The objective of the work is to obtain significant gains in execution time and to reduce errors in scenario preparation. To estimate the time gains, both the automated and the manual executions were timed. In addition, a questionnaire-based survey was conducted to discover new requirements and improvements to include in this automated support. The results show an average saving of 46.67% of the total hours worked on sample preparation. The use of the tool avoids human errors and therefore adds quality and speed to the process. Another relevant factor is that the tester can perform other activities in parallel with sample preparation.

Keywords: Android, GSD, automated testing tool, mobile products

Procedia PDF Downloads 275
831 Attention and Memory in the Music Learning Process in Individuals with Visual Impairments

Authors: Lana Burmistrova

Abstract:

Introduction: The influence of visual impairments on several cognitive processes used in the music learning process is an increasingly important area in special education and cognitive musicology. Many children have visual impairments due to refractive errors and irreversible inhibitors. However, owing to compensatory neuroplasticity and functional reorganization, congenitally blind (CB) and early blind (EB) individuals use several areas of the occipital lobe to perceive and process auditory and tactile information. CB individuals have greater memory capacity and memory reliability, rely less on false-memory mechanisms while executing several tasks, and have better working memory (WM) and short-term memory (STM). Blind individuals use several strategies while executing tactile and working memory n-back tasks: a verbalization strategy (mental recall), a tactile strategy (tactile recall), and combined strategies. Methods and design: The aim of the pilot study was to substantiate similar tendencies while executing attention, memory, and combined auditory tasks constructed for this study in blind and sighted individuals, and to investigate the attention, memory, and combined mechanisms used in the music learning process. For this study, eight (n=8) blind and eight (n=8) sighted individuals aged 13-20 were chosen. All respondents had more than five years of music performance and music learning experience. In the attention task, all respondents had to identify pitch changes in tonal and randomized melodic pairs. The memory task was based on the mismatch negativity (MMN) proportion theory: 80 percent standard (unchanged) and 20 percent deviant (changed) stimuli (sequences). Every sequence was named (na-na, ra-ra, za-za), and several items (pencil, spoon, tealight) were assigned to each sequence. Respondents had to recall the sequences, associate them with the items, and detect possible changes.
While executing the combined task, all respondents had to focus attention on the pitch changes and had to detect and describe them during the recall. Results and conclusion: The results support specific features in CB and EB individuals and similarities between late blind (LB) and sighted individuals. While executing attention and memory tasks, CB and EB individuals tended to use more precise execution tactics and more advanced periodic memory while focusing on auditory and tactile stimuli. While executing memory and combined tasks, CB and EB individuals used passive working memory to recall standard sequences, active working memory to recall deviant sequences, and combined strategies. Based on the observation results, the assessment of blind respondents, and recording specifics, the following attention and memory correlations were identified: reflective attention and STM, reflective attention and periodic memory, auditory attention and WM, tactile attention and WM, auditory-tactile attention and STM. The results and the summary of findings highlight the attention and memory features used in the music learning process in the context of blindness, and the tendency of several attention and memory types to correlate depending on the task, strategy, and individual features.

Keywords: attention, blindness, memory, music learning, strategy

Procedia PDF Downloads 160
830 AI-Based Technologies for Improving Patient Safety and Quality of Care

Authors: Tewelde Gebreslassie Gebreanenia, Frie Ayalew Yimam, Seada Hussen Adem

Abstract:

Patient safety and quality of care are essential goals of health care delivery, but they are often compromised by human errors, system failures, or resource constraints. In a variety of healthcare contexts, artificial intelligence (AI), a quickly developing field, can provide fresh approaches to enhancing patient safety and treatment quality. Artificial Intelligence (AI) has the potential to decrease errors and enhance patient outcomes by carrying out tasks that would typically require human intelligence. These tasks include the detection and prevention of adverse events, monitoring and warning patients and clinicians about changes in vital signs, symptoms, or risks, offering individualized and evidence-based recommendations for diagnosis, treatment, or prevention, and assessing and enhancing the effectiveness of health care systems and services. This study examines the state-of-the-art and potential future applications of AI-based technologies for enhancing patient safety and care quality, as well as the opportunities and problems they present for patients, policymakers, researchers, and healthcare providers. In order to ensure the safe, efficient, and responsible application of AI in healthcare, the paper also addresses the ethical, legal, social, and technical challenges that must be addressed and regulated.

Keywords: artificial intelligence, health care, human intelligence, patient safety, quality of care

Procedia PDF Downloads 45
829 An Approach for Detection Efficiency Determination of High Purity Germanium Detector Using Cesium-137

Authors: Abdulsalam M. Alhawsawi

Abstract:

Estimation of a radiation detector's efficiency plays a significant role in calculating the activity of radioactive samples. Detector efficiency is measured using sources that emit a variety of energies, from low- to high-energy photons along the energy spectrum. Some photon energies are hard to find in lab settings, either because check sources are hard to obtain or because the sources have short half-lives. This work aims to develop a method to determine the efficiency of a High Purity Germanium (HPGe) detector based on the 662 keV gamma-ray photon emitted by Cs-137. Cesium-137 is readily available in most labs with radiation detection and health physics applications and has a long half-life of ~30 years. Several photon efficiencies were calculated using the MCNP5 simulation code. The simulated efficiency of the 662 keV photon was used as a base to calculate other photon efficiencies for a point source and a Marinelli beaker geometry. In the case of a Marinelli beaker filled with water, the efficiency of the 59 keV low-energy photons from Am-241 was estimated with a 9% error compared to the MCNP5 simulated efficiency. The 1.17 and 1.33 MeV high-energy photons emitted by Co-60 had errors of 4% and 5%, respectively. The estimated errors are considered acceptable for calculating the activity of unknown samples, as they fall within the 95% confidence level.
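The anchoring idea described — scaling a simulated efficiency curve by a single measured Cs-137 point — can be sketched as below. All numerical efficiencies here are hypothetical placeholders, not the paper's MCNP5 results; they only illustrate the procedure.

```python
# All efficiencies below are hypothetical placeholders, not the paper's
# MCNP5 results; they only illustrate the anchoring procedure.
sim_eff = {59: 0.080, 662: 0.020, 1170: 0.012, 1330: 0.011}  # simulated peak efficiencies
meas_eff_662 = 0.019   # efficiency measured with the Cs-137 check source

def estimated_efficiency(energy_kev):
    """Scale the simulated efficiency curve by the measured-to-simulated
    ratio at the 662 keV anchor point."""
    scale = meas_eff_662 / sim_eff[662]
    return sim_eff[energy_kev] * scale

for e in (59, 662, 1170, 1330):
    print(e, round(estimated_efficiency(e), 5))
```

The 662 keV entry returns the measured value by construction; the reported 4-9% errors at 59, 1170, and 1330 keV then quantify how well this single-point scaling transfers across the spectrum.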

Keywords: MCNP5, MonteCarlo simulations, efficiency calculation, absolute efficiency, activity estimation, Cs-137

Procedia PDF Downloads 97
828 Designing a Dispersion Flattened Single Mode PCF for E-Band to U-Band with Less Effective Area

Authors: Shabbir Chowdhury

Abstract:

A signal broadens as it propagates through a channel; this phenomenon is known as dispersion. Because dispersion differs for different wavelengths, bandwidth becomes limited. Researchers have tried to design optical fibers with flattened dispersion in order to use more bandwidth and to enable wavelength division multiplexing. In this paper, a single-mode photonic crystal fiber with flattened dispersion and a small effective area is proposed, with silica as the fiber material. The effective dispersion varies from -1.996 to 0.1783 ps/(nm-km) across the entire E-band to U-band. The effective area of this fiber is only 3.048 µm² (at a 1.75 µm wavelength).
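As a reminder of how such dispersion curves are computed, chromatic dispersion follows from the wavelength-dependent index via D(λ) = -(λ/c) d²n/dλ². The sketch below applies this to the bulk fused-silica Sellmeier index as a stand-in; a real PCF's effective index n_eff would come from a mode solver, not this formula.

```python
import numpy as np

def n_silica(lam_um):
    """Malitson Sellmeier index of fused silica (wavelength in micrometres)."""
    l2 = lam_um ** 2
    return np.sqrt(1.0
        + 0.6961663 * l2 / (l2 - 0.0684043**2)
        + 0.4079426 * l2 / (l2 - 0.1162414**2)
        + 0.8974794 * l2 / (l2 - 9.896161**2))

def dispersion_ps_nm_km(lam_um, n_func, h=1e-3):
    """D = -(lambda / c) * d^2 n / d lambda^2 by central finite difference,
    converted from ps/um^2 to the usual ps/(nm km)."""
    d2n = (n_func(lam_um + h) - 2.0 * n_func(lam_um) + n_func(lam_um - h)) / h**2
    c_um_ps = 299.792458          # speed of light in um/ps
    return -(lam_um / c_um_ps) * d2n * 1e6

d155 = dispersion_ps_nm_km(1.55, n_silica)   # ~ +22 ps/(nm km): anomalous regime
d100 = dispersion_ps_nm_km(1.00, n_silica)   # negative: normal-dispersion regime
print(round(d155, 1), round(d100, 1))
```

Flattening the dispersion, as the abstract describes, amounts to engineering n_eff(λ) (via the air-hole geometry) so that this second derivative stays near zero across the band.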

Keywords: photonic crystal fiber, dispersion, bandwidth, chromatic dispersion, effective dispersion, dispersion compensation, effective area, effective refractive index

Procedia PDF Downloads 386
827 Estimating Estimators: An Empirical Comparison of Non-Invasive Analysis Methods

Authors: Yan Torres, Fernanda Simoes, Francisco Petrucci-Fonseca, Freddie-Jeanne Richard

Abstract:

Non-invasive samples are an alternative to collecting genetic samples directly: they are collected without manipulating the animal (e.g., scats, feathers, and hairs). Nevertheless, the use of non-invasive samples has some limitations. The main issue is degraded DNA, leading to poorer extraction efficiency and genotyping. These errors delayed widespread use of non-invasive genetic information for some years. Genotyping errors can be limited by using analysis methods that accommodate the errors and singularities of non-invasive samples. Genotype matching and population estimation algorithms can be highlighted as important analysis tools that have been adapted to deal with those errors. Despite this recent development of analysis methods, there is still a lack of empirical performance comparisons among them. Comparing methods on datasets differing in size and structure can be useful for future studies, since non-invasive samples are a powerful tool for obtaining information, especially for endangered and rare populations. For this comparison, four different datasets obtained from the Dryad digital repository were used. Three matching algorithms (Cervus, Colony, and Error Tolerant Likelihood Matching - ETLM) were used for matching genotypes, and two algorithms for population estimation (Capwire and BayesN). The three matching algorithms showed different patterns of results. ETLM produced fewer unique individuals and recaptures. A similarity in the matched genotypes between Colony and Cervus was observed; that is not surprising given the similarity of the likelihood pairwise and clustering algorithms underlying those methods. The matching of ETLM showed almost no similarity with the genotypes matched by the other methods.
The different clustering algorithm and error model of ETLM seem to lead to a more rigorous selection, although the processing time and interface friendliness of ETLM were the worst among the compared methods. The population estimators performed differently across the datasets; there was consensus between the estimators for only one dataset. BayesN showed both higher and lower estimations when compared with Capwire. Unlike Capwire, BayesN considers only the recapture events rather than the total number of recaptures, which makes the estimator sensitive to data heterogeneity, here meaning different capture rates between individuals. In these examples, tolerance for homogeneity seems to be crucial for BayesN to work properly. Both methods are user-friendly and have reasonable processing times. An expanded analysis with simulated genotype data could clarify the sensitivity of the algorithms. The present comparison of the matching methods indicates that Colony seems to be the most appropriate for general use, considering a time/interface/robustness balance. The heterogeneity of the recaptures strongly affected the BayesN estimations, leading to over- and underestimation of population numbers. Capwire is then advisable for general use, since it performs better in a wide range of situations.

Keywords: algorithms, genetics, matching, population

Procedia PDF Downloads 118
826 Sol-Gel Synthesis and Optical Characterisation of TiO2 Thin Films for Photovoltaic Application

Authors: Arabi Nour El Houda, Iratni Aicha, Talaighil Razika, Bruno Capoen, Mohamed Bouazaoui

Abstract:

TiO2 thin films have been prepared by the sol-gel dip-coating technique in order to elaborate antireflective thin films for monocrystalline silicon (mono-Si). Titanium isopropoxide was chosen as the precursor, with hydrochloric acid as a catalyst, to prepare a stable solution. The optical properties have been tailored by varying the solution concentration, the withdrawal speed, and the heat treatment. We show that using a single TiO2 layer 64.5 nm in thickness, heat-treated at 450°C or 300°C, reduces the mono-Si reflection to a level lower than 3% over the broadband spectral domains [669-834] nm and [786-1006] nm, respectively. These performances are similar to those obtained with double layers of low- and high-refractive-index glasses.
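A single-layer antireflection coating works by the quarter-wave condition: the film's optical thickness is λ/4, and at that wavelength the reflection vanishes when n_film² equals n_air·n_substrate. The sketch below uses hypothetical indices (sol-gel TiO2 ≈ 2.0, silicon ≈ 3.9 in the near infrared, both assumptions) to show why TiO2 is a good index match for silicon; the paper's measured indices and bands may differ.

```python
def quarter_wave_reflectance(n_film, n_sub, n0=1.0):
    """Normal-incidence reflectance of a lossless quarter-wave film at its
    design wavelength: R = ((n0*n_sub - n_film^2) / (n0*n_sub + n_film^2))^2."""
    return ((n0 * n_sub - n_film**2) / (n0 * n_sub + n_film**2)) ** 2

# Hypothetical indices: sol-gel TiO2 ~ 2.0, silicon ~ 3.9 in the near infrared
n_tio2, n_si = 2.0, 3.9
d = 64.5                         # nm, film thickness from the abstract
design_wl = 4 * n_tio2 * d       # wavelength at which the film is a quarter wave
R = quarter_wave_reflectance(n_tio2, n_si)
print(round(design_wl), round(100 * R, 2))   # design wavelength (nm), R (%)
```

Since √3.9 ≈ 1.97, a TiO2 index near 2.0 is close to the ideal single-layer value, which is why one layer suffices where lower-index materials would need a two-layer stack.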

Keywords: thin film, dip-coating, mono-crystalline silicon, titanium oxide

Procedia PDF Downloads 410
825 Time Series Forecasting (TSF) Using Various Deep Learning Models

Authors: Jimeng Shi, Mahek Jain, Giri Narasimhan

Abstract:

Time Series Forecasting (TSF) is used to predict target variables at a future time point based on learning from previous time points. To keep the problem tractable, learning methods use data from a fixed-length window in the past as an explicit input. In this paper, we study how the performance of predictive models changes as a function of different look-back window sizes and different amounts of time into the future to predict. We also consider the performance of the recent attention-based Transformer models, which have had good success in the image processing and natural language processing domains. In all, we compare four different deep learning methods (RNN, LSTM, GRU, and Transformer) along with a baseline method. The dataset we used is the hourly Beijing Air Quality Dataset from the UCI website, which includes a multivariate time series of many factors measured on an hourly basis for a period of 5 years (2010-14). For each model, we also report on the relationship between the performance and the look-back window sizes and the number of predicted time points into the future. Our experiments suggest that Transformer models have the best performance, with the lowest Mean Absolute Errors (MAE = 14.599, 23.273) and Root Mean Square Errors (RMSE = 23.573, 38.131), for most of our single-step and multi-step predictions. The best look-back window size to predict 1 hour into the future appears to be one day, while 2 or 4 days perform best to predict 3 hours into the future.
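The fixed-length look-back windowing described above, together with the MAE/RMSE metrics reported, can be sketched as follows. A toy sine series and a naive persistence baseline stand in for the actual dataset and models.

```python
import numpy as np

def make_windows(series, look_back, horizon):
    """Slice a 1-D series into supervised pairs: each X row holds `look_back`
    past values, y the value `horizon` steps after that window."""
    X, y = [], []
    for t in range(len(series) - look_back - horizon + 1):
        X.append(series[t:t + look_back])
        y.append(series[t + look_back + horizon - 1])
    return np.array(X), np.array(y)

def mae(a, b):
    return float(np.mean(np.abs(a - b)))

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

# toy hourly signal; look back 24 h ("one day"), predict 1 h ahead
series = np.sin(np.arange(200) * 0.1)
X, y = make_windows(series, look_back=24, horizon=1)

pred = X[:, -1]                # naive persistence baseline
print(X.shape, round(mae(y, pred), 3), round(rmse(y, pred), 3))
```

Varying `look_back` and `horizon` here reproduces exactly the experimental axes the paper sweeps; the deep models consume the same (X, y) pairs in place of the persistence rule.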

Keywords: air quality prediction, deep learning algorithms, time series forecasting, look-back window

Procedia PDF Downloads 128
824 Continuous Measurement of Spatial Exposure Based on Visual Perception in Three-Dimensional Space

Authors: Nanjiang Chen

Abstract:

In the backdrop of expanding urban landscapes, accurately assessing spatial openness is critical. Traditional visibility analysis methods grapple with discretization errors and inefficiencies, creating a gap in truly capturing the human experience of space. Addressing these gaps, this paper introduces a distinct continuous visibility algorithm, a leap in measuring urban spaces from a human-centric perspective. This study presents a methodological breakthrough by applying this algorithm to urban visibility analysis. Unlike conventional approaches, this technique allows for a continuous range of visibility assessment, closely mirroring human visual perception. By eliminating the need for predefined subdivisions in ray casting, it offers a more accurate and efficient tool for urban planners and architects. The proposed algorithm not only reduces computational errors but also demonstrates faster processing capabilities, validated through a case study in Beijing's urban setting. Its key distinction lies in its potential to benefit a broad spectrum of stakeholders, ranging from urban developers to public policymakers, aiding in the creation of urban spaces that prioritize visual openness and quality of life. This advancement in urban analysis methods could lead to more inclusive, comfortable, and well-integrated urban environments, enhancing the spatial experience for communities worldwide.
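One way to picture a continuous (subdivision-free) visibility measure is to merge the angular intervals that obstacles subtend at the viewpoint analytically, instead of casting discrete rays. The 2D sketch below, with circular obstacles and no wrap-around handling at ±π, is only an illustration of that idea, not the paper's algorithm.

```python
import math

def blocked_interval(cx, cy, r, px=0.0, py=0.0):
    """Angular interval (start, end) that a circular obstacle blocks,
    seen from the viewpoint (px, py)."""
    dx, dy = cx - px, cy - py
    d = math.hypot(dx, dy)
    theta = math.atan2(dy, dx)
    half = math.asin(min(1.0, r / d))
    return (theta - half, theta + half)

def openness(obstacles, px=0.0, py=0.0):
    """Fraction of the full 2*pi view not blocked by any obstacle.
    Overlapping intervals are merged analytically -- no ray-casting
    subdivision. Simplification: no wrap-around handling across +/-pi."""
    ivs = sorted(blocked_interval(cx, cy, r, px, py) for cx, cy, r in obstacles)
    blocked, cur_a, cur_b = 0.0, None, None
    for a, b in ivs:
        if cur_b is None or a > cur_b:
            if cur_b is not None:
                blocked += cur_b - cur_a
            cur_a, cur_b = a, b
        else:
            cur_b = max(cur_b, b)
    if cur_b is not None:
        blocked += cur_b - cur_a
    return 1.0 - blocked / (2 * math.pi)

# one circle of radius 1 at distance 2 subtends 2*asin(1/2) = pi/3 radians
print(round(openness([(2.0, 0.0, 1.0)]), 3))
```

Because the blocked angles are computed in closed form, the result has no sampling error, which is the qualitative advantage the abstract claims over discretized ray casting.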

Keywords: visual openness, spatial continuity, ray-tracing algorithms, urban computation

Procedia PDF Downloads 7
823 Preliminary Study of the Phonological Development in Three and Four Year Old Bulgarian Children

Authors: Tsvetomira Braynova, Miglena Simonska

Abstract:

The article presents the results of research on phonological processes in three- and four-year-old children. For the purpose of the study, an author's test was developed and conducted among 120 children. The study included three areas of research: at the level of words (96 words), at the level of sentence repetition (10 sentences), and at the level of generating own speech from a picture (15 pictures). The test also gives additional information about the articulation errors of the assessed children. The main purpose of the study is to analyze all phonological processes that occur at this age in Bulgarian children and to identify which are typical and atypical for this age. The results show that the most common phonological errors children make are: sound substitution, elision of a sound, metathesis of a sound, elision of a syllable, and elision of consonants clustered in a syllable. All examined children were identified with an articulation disorder of the bilabial lambdacism type. Measuring the correlation between the average length of repeated speech and the average length of generated speech, the analysis shows that the more words a child can repeat in the “repeated speech” part, the more words they can be expected to generate in the “sentence generation” part. The results of this study show that the word-naming task provides sufficient and representative information to assess the child's phonology.

Keywords: assessment, phonology, articulation, speech-language development

Procedia PDF Downloads 151
822 Investigation about Structural and Optical Properties of Bulk and Thin Film of 1H-CaAlSi by Density Functional Method

Authors: M. Babaeipour, M. Vejdanihemmat

Abstract:

Optical properties of bulk and thin-film 1H-CaAlSi were studied for two directions, (1,0,0) and (0,0,1). The calculations were carried out by the Density Functional Theory (DFT) method using full potential, with the GGA approximation used to calculate the exchange-correlation energy. The calculations were performed with the WIEN2k package. The results showed that the absorption edge is shifted back by 0.82 eV in the thin film relative to the bulk for both directions. The static values of the real part of the dielectric function were obtained for the four cases, as were the static values of the refractive index. The reflectivity graphs show an intense difference between the reflectivity of the thin film and the bulk in the ultraviolet region.

Keywords: 1H-CaAlSi, absorption, bulk, optical, thin film

Procedia PDF Downloads 500
821 A Numerical Investigation of Total Temperature Probes Measurement Performance

Authors: Erdem Meriç

Abstract:

Measuring the total temperature of an air flow accurately is a very important requirement in the development phases of many industrial products, including gas turbines and rockets. Thermocouples are very practical devices to measure temperature in such cases, but in high-speed and high-temperature flows, the temperature of the thermocouple junction may deviate considerably from the real flow total temperature due to the heat transfer mechanisms of convection, conduction, and radiation. To avoid errors in total temperature measurement, special probe designs that are experimentally characterized are used. In this study, a validation case, an experimental characterization of a specific class of total temperature probes, is selected from the literature to develop a numerical conjugate heat transfer analysis methodology for studying the total temperature probe flow field and solid temperature distribution. The validated conjugate heat transfer methodology is used to investigate the flow structures inside and around the probe and the effects of probe design parameters, such as the ratio between inlet and outlet hole areas and the probe tip geometry, on measurement accuracy. Lastly, a thermal model is constructed to account for errors in total temperature measurement for a specific class of probes in different operating conditions. The outcomes of this work can guide experimentalists in designing a very accurate total temperature probe and quantifying the possible error for their specific case.
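The measurement error such probes exhibit is commonly expressed through a recovery factor r: the junction reads T_static + r(T_total - T_static), so the shortfall from the true total temperature is (1 - r)(T_total - T_static). A small sketch with hypothetical numbers (Mach 0.8, r = 0.96 are illustrative choices, not values from the study):

```python
def total_temperature(t_static, mach, gamma=1.4):
    """Isentropic total temperature of a perfect gas."""
    return t_static * (1.0 + (gamma - 1.0) / 2.0 * mach**2)

def probe_reading(t_static, mach, recovery_factor, gamma=1.4):
    """Junction temperature indicated by a probe with a given recovery factor."""
    tt = total_temperature(t_static, mach, gamma)
    return t_static + recovery_factor * (tt - t_static)

# Hypothetical case: 300 K static air at Mach 0.8, recovery factor 0.96
tt = total_temperature(300.0, 0.8)       # 338.4 K
tp = probe_reading(300.0, 0.8, 0.96)     # 336.864 K
print(round(tt - tp, 3))                 # measurement shortfall, 1.536 K
```

The conjugate heat transfer analyses in the abstract effectively predict this recovery factor from the probe geometry and flow conditions instead of assuming it.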

Keywords: conjugate heat transfer, recovery factor, thermocouples, total temperature probes

Procedia PDF Downloads 107
820 Design of a Compact Herriott Cell for Heat Flux Measurement Applications

Authors: R. G. Ramírez-Chavarría, C. Sánchez-Pérez, V. Argueta-Díaz

Abstract:

In this paper, we present the design of an optical device based on a Herriott multi-pass cell fabricated on a small acrylic slab for heat flux measurements using the deflection of a laser beam propagating inside the cell. The beam deflection is produced by the heat flux conducted into the acrylic slab, which induces a gradient in the refractive index. The use of a long-path cell as the sensing element gives the possibility of high sensitivity within a small device. We present the optical design as well as some experimental results in order to validate the device's operating principle.

Keywords: heat flux, Herriott cell, optical beam deflection, thermal conductivity

Procedia PDF Downloads 621
819 I Don’t Want to Have to Wait: A Study Into the Origins of Rule Violations at Rail Pedestrian Level Crossings

Authors: James Freeman, Andry Rakotonirainy

Abstract:

Train-pedestrian collisions are common and are the most likely to result in severe injuries and fatalities when compared to other types of rail crossing accidents. However, limited research has focused on understanding why some pedestrians break level crossing rules, which limits the development of effective countermeasures. As a result, this study undertook a deeper exploration into the origins of risky pedestrian behaviour through structured interviews. A total of 40 pedestrians who admitted to either intentionally breaking crossing rules or making crossing errors participated in an in-depth telephone interview. Qualitative thematic analysis revealed that participants were more likely to report deliberately breaking rules (rather than making errors), particularly after the train had passed the crossing rather than before it arrived. Predominant reasons for such behaviours were identified as: calculated risk taking, impatience, poor knowledge of the rules, and a low likelihood of detection. The findings have direct implications for the development of effective countermeasures to improve crossing safety (and manage risk), such as increasing surveillance and transit officer presence, as well as installing appropriate barriers that either deter or prevent pedestrians from violating crossing rules. This paper further outlines the study findings with regard to the development of countermeasures and provides direction for future research efforts in this area.

Keywords: crossings, mistakes, risk, violations

Procedia PDF Downloads 392
818 Wrong Site Surgery Should Not Occur In This Day And Age!

Authors: C. Kuoh, C. Lucas, T. Lopes, I. Mechie, J. Yoong, W. Yoong

Abstract:

For all surgeons, there is one preventable but still frequently occurring complication: wrong site surgery. It can have potentially catastrophic, irreversible, or even fatal consequences for patients. With the exponential development of microsurgery and the use of advanced technological tools, operating on the wrong side, anatomical part, or even person is seen as the most visible and destructive of all surgical errors, and perhaps the error most dreaded by clinicians, as it threatens their licenses and arouses feelings of guilt. Despite the implementation of the WHO surgical safety checklist more than a decade ago, the incidence of wrong site surgery remains relatively high, leading to tremendous physical and psychological repercussions for the clinicians involved, as well as a financial burden for the healthcare institution. In this presentation, the authors explore the various factors that can lead to wrong site surgery, a combination of environmental and human factors, and evaluate their impact on patients, practitioners, their families, and the medical industry. Major contributing factors to these “never events” include deviations from checklists, excessive workload, and poor communication. Two real-life cases are discussed, and systems that can be implemented to prevent these errors are highlighted, alongside lessons learnt from other industries. The authors suggest that reinforcing speaking up, implementing professional training, and greater patient involvement can potentially improve safety in surgery and electrosurgery.

Keywords: wrong side surgery, never events, checklist, workload, communication

Procedia PDF Downloads 158
817 Estimation of the Road Traffic Emissions and Dispersion in the Developing Countries Conditions

Authors: Hicham Gourgue, Ahmed Aharoune, Ahmed Ihlal

Abstract:

We present in this work our model of road traffic emissions (line sources) and the dispersion of these emissions, named DISPOLSPEM (Dispersion of Poly Sources and Pollutants Emission Model). In its emission part, this model was designed to keep the bottom-up and top-down approaches consistent. It also makes it possible to generate emission inventories from reduced input parameters, adapted to the conditions existing in Morocco and in other developing countries. While several simplifications are made, the model's performance is preserved. A further important advantage of the model is that it allows calculating the uncertainty of the emission rate with respect to each input parameter. In the dispersion part of the model, an improved line source model has been developed, implemented, and tested against a reference solution. It improves on previous line source Gaussian plume formulas in accuracy without being too demanding in terms of computational resources. In the case study presented here, the biggest errors were associated with the ends of line source sections; these errors cancel between adjacent sections of line sources during the simulation of a road network. In cases where the wind is parallel to the source line, the combined use of discretized-source and analytical line source formulas remarkably minimizes the error. Because this combination is applied only for a small number of wind directions, it should not excessively increase the calculation time.
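The discretized-source approach mentioned at the end can be sketched by summing ground-level Gaussian point-source plumes along the line. The linear dispersion coefficients below are crude stand-ins for real stability-class curves, and the whole setup is an illustration of the general technique, not DISPOLSPEM's formulas.

```python
import math

def point_concentration(q, u, x, y, sy_coeff=0.08, sz_coeff=0.06):
    """Ground-level concentration from a ground-level Gaussian point source
    (includes the factor 2 for ground reflection). sigma_y and sigma_z grow
    linearly with downwind distance x here -- a crude stand-in for real
    stability-class dispersion curves."""
    if x <= 0.0:
        return 0.0
    sy, sz = sy_coeff * x, sz_coeff * x
    return q / (math.pi * u * sy * sz) * math.exp(-y**2 / (2.0 * sy**2))

def line_concentration(q_per_m, u, length, receptor, n=200):
    """Discretize a crosswind line source (along y at x = 0) into n point
    sources and sum their plumes at the receptor (x, y)."""
    rx, ry = receptor
    dl = length / n
    total = 0.0
    for i in range(n):
        ys = -length / 2.0 + (i + 0.5) * dl   # segment midpoint
        total += point_concentration(q_per_m * dl, u, rx, ry - ys)
    return total

c50 = line_concentration(q_per_m=1.0, u=2.0, length=100.0, receptor=(50.0, 0.0))
c100 = line_concentration(q_per_m=1.0, u=2.0, length=100.0, receptor=(100.0, 0.0))
print(round(c50, 4), round(c100, 4))
```

The analytical line-source formula the abstract describes replaces this summation in closed form; the discretized sum is what it is benchmarked against, and the combination of both is what handles wind parallel to the line.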

Keywords: air pollution, dispersion, emissions, line sources, road traffic, urban transport

Procedia PDF Downloads 417
816 Physico-Chemical, GC-MS Analysis and Cold Saponification of Onion (Allium cepa L) Seed Oil

Authors: A. A Warra, S. Fatima

Abstract:

The experimental investigation revealed that the hexane extract of onion seed oil has an acid value, iodine value, peroxide value, saponification value, relative density and refractive index of 0.03±0.01 mgKOH/g, 129.80±0.21 gI2/100g, 3.00±0.00 meq H2O2/kg, 203.00±0.71 mgKOH/g, 0.82±0.01 and 1.44±0.00, respectively. The percentage yield was 50.28±0.01%. The colour of the oil was light green. We restricted our GC-MS spectral interpretation to compound identification, particularly fatty acids, which were identified as palmitic acid, linolelaidic acid, oleic acid, stearic acid, behenic acid, linolenic acid and eicosatetraenoic acid. The pH, foam ability, total fatty matter, total alkali and percentage chloride of the onion oil soap were 11.03±0.02, 75.13±0.15 cm³, 36.66±0.02%, 0.92±0.02% and 0.53±0.15%, respectively. The texture was soft and the colour was a lighter green. The results indicate that the hexane extract of onion seed oil has potential for the cosmetics industry.
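The saponification value reported above (about 203 mgKOH/g) directly determines the alkali demand in cold-process soap making. The sketch below, with hypothetical helper names, converts a saponification value into an approximate mean triglyceride molar mass and into the NaOH required for a cold-process batch; the 5% superfat margin is an illustrative assumption, not a value from the study.

```python
KOH_MW = 56.1   # g/mol
NAOH_MW = 40.0  # g/mol

def mean_triglyceride_mw(sap_value):
    # Saponification value (mg KOH per g oil) -> approximate mean
    # triglyceride molar mass, assuming 3 mol KOH per mol triglyceride.
    return 3 * KOH_MW * 1000 / sap_value

def naoh_for_cold_process(oil_mass_g, sap_value, superfat=0.05):
    # NaOH (g) to saponify oil_mass_g of oil, converted from the KOH
    # basis of the saponification value, with a superfat safety margin.
    koh_g = oil_mass_g * sap_value / 1000
    naoh_g = koh_g * NAOH_MW / KOH_MW
    return naoh_g * (1 - superfat)
```

For this oil, the saponification value implies a mean triglyceride molar mass of roughly 830 g/mol, consistent with the C16-C22 fatty acids identified by GC-MS.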

Keywords: onion seeds, soxhlet extraction, physicochemical, GC-MS, cold saponification

Procedia PDF Downloads 286
815 Estimation of Normalized Glandular Doses Using a Three-Layer Mammographic Phantom

Authors: Kuan-Jen Lai, Fang-Yi Lin, Shang-Rong Huang, Yun-Zheng Zeng, Po-Chieh Hsu, Jay Wu

Abstract:

The normalized glandular dose (DgN) is used to estimate the energy deposition of mammography in clinical practice. Monte Carlo simulations frequently use a uniformly mixed phantom for calculating the conversion factor. However, breast tissues are not uniformly distributed, leading to errors in conversion factor estimation. This study constructed a three-layer phantom to estimate the normalized glandular dose more accurately. The MCNP code (Monte Carlo N-Particle code) was used to create the geometric structure. We simulated three target/filter combinations (Mo/Mo, Mo/Rh, Rh/Rh), six voltages (25 ~ 35 kVp), six HVL parameters and nine breast phantom thicknesses (2 ~ 10 cm) for the three-layer mammographic phantom. The conversion factor for 25%, 50% and 75% glandularity was calculated. The error of the conversion factors compared with the results of the American College of Radiology (ACR) was within 6%; for Rh/Rh, the difference was within 9%. The difference between 50% average glandularity and the uniform phantom was 7.1% ~ -6.7% for the Mo/Mo combination at a voltage of 27 kVp, a half value layer of 0.34 mmAl, and a breast thickness of 4 cm. According to the simulation results, regression analysis found that the three-layer mammographic phantom at 0% ~ 100% glandularity can be used to accurately calculate the conversion factors. Differences in glandular tissue distribution lead to errors in conversion factor calculation; the three-layer mammographic phantom can provide accurate estimates of glandular dose in clinical practice.
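In the DgN formalism referenced above, the mean glandular dose is the entrance air kerma multiplied by a conversion factor that depends on glandularity (as well as spectrum and breast thickness). The sketch below interpolates between tabulated glandularities; the DgN values in it are placeholders for illustration, not ACR data or results from this study.

```python
import numpy as np

# Placeholder DgN conversion factors (mGy per mGy entrance air kerma)
# for one fixed spectrum/thickness -- illustrative values only.
GLANDULARITY = np.array([0.25, 0.50, 0.75])
DGN = np.array([0.180, 0.160, 0.145])

def mean_glandular_dose(entrance_air_kerma_mgy, glandularity):
    # Interpolate DgN at the requested glandularity, then apply
    # MGD = K_air * DgN (the standard normalized-dose formalism).
    dgn = np.interp(glandularity, GLANDULARITY, DGN)
    return entrance_air_kerma_mgy * dgn
```

Note that DgN decreases with glandularity (glandular tissue attenuates more), so a uniform-phantom table applied to a layered breast can over- or under-estimate dose, which is the error the three-layer phantom is designed to reduce.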

Keywords: Monte Carlo simulation, mammography, normalized glandular dose, glandularity

Procedia PDF Downloads 162
814 Neuropsychological Aspects in Adolescents Victims of Sexual Violence with Post-Traumatic Stress Disorder

Authors: Fernanda Mary R. G. Da Silva, Adriana C. F. Mozzambani, Marcelo F. Mello

Abstract:

Introduction: Sexual assault against children and adolescents is a public health problem with serious consequences for their quality of life, especially for those who develop post-traumatic stress disorder (PTSD). The broad literature in this research area points to greater losses in verbal learning, explicit memory, speed of information processing, attention and executive functioning in PTSD. Objective: To compare the neuropsychological functions of adolescents from 14 to 17 years of age who are victims of sexual violence with PTSD with those of healthy controls. Methodology: Application of a neuropsychological battery composed of the following subtests: WASI vocabulary and matrix reasoning; Digit subtests (WISC-IV); the RAVLT auditory verbal learning test; the Spatial Span subtest of the WMS-III scale; the abbreviated version of the Wisconsin test; the D2 concentrated attention test; the prospective memory subtest of the NEUPSILIN scale; the five-digit test (FDT); and the Stroop test (Trenerry version) in adolescents with a history of sexual violence in the previous six months, referred to Prove (the Violence Care and Research Program of the Federal University of São Paulo) for further treatment. Results: The results showed a deficit in the word encoding process in the RAVLT test, with impairment in the A3 (p = 0.004) and A4 (p = 0.016) measures, which compromises the verbal learning process (p = 0.010) and verbal recognition memory (p = 0.012); the adolescents thus appear to perform worse in the acquisition of verbal information that depends on the support of the attentional system. Worse performance was also found on list B (p = 0.047), with a lower priming effect (p = 0.026), that is, a lower rate of evocation of the initial words presented, and less perseveration (p = 0.002), i.e., repeated words. There therefore seems to be a failure in the creation of strategies that support the mnemonic process of retaining the verbal information necessary for learning.
Sustained attention was found to be impaired, with greater loss of set in the Wisconsin test (p = 0.023), a lower rate of correct responses in stage C of the Stroop test (p = 0.023) and, consequently, a higher rate of erroneous responses in stage C of the Stroop test (p = 0.023), as well as more type II errors in the D2 test (p = 0.008). A higher incidence of total errors was observed in the reading stage of the FDT test (p = 0.002), which suggests fatigue in the execution of the task. Executive function was compromised in cognitive flexibility, as suggested by a higher rate of total errors in the alternating step of the FDT test (p = 0.009) and a greater number of perseverative errors in the Wisconsin test (p = 0.004). Conclusion: The data from this study suggest that sexual violence and PTSD cause significant impairment in the neuropsychological functions of adolescents, evidencing a risk to quality of life at stages that are fundamental for the development of learning and cognition.

Keywords: adolescents, neuropsychological functions, PTSD, sexual violence

Procedia PDF Downloads 109
813 Melanoma and Non-Melanoma, Skin Lesion Classification, Using a Deep Learning Model

Authors: Shaira L. Kee, Michael Aaron G. Sy, Myles Joshua T. Tan, Hezerul Abdul Karim, Nouar AlDahoul

Abstract:

Skin disease is considered the fourth most common illness, with melanoma and non-melanoma skin cancer the most common types of cancer in Caucasians. The alarming increase in skin cancer cases shows an urgent need for further research to improve diagnostic methods, as early diagnosis can significantly improve the 5-year survival rate. Machine learning algorithms for image pattern analysis in diagnosing skin lesions can dramatically increase the accuracy of detection and decrease possible human errors. Several studies have shown that the diagnostic performance of computer algorithms can outperform that of dermatologists. However, existing methods still need improvement to reduce diagnostic errors and generate efficient and accurate results. Our paper proposes an ensemble method to classify dermoscopic images into benign and malignant skin lesions. The experiments were conducted using International Skin Imaging Collaboration (ISIC) image samples. The dataset contains 3,297 dermoscopic images in benign and malignant categories. The results show improved performance, with an accuracy of 88% and an F1 score of 87%, outperforming existing models such as the support vector machine (SVM), ResNet50, EfficientNetB0, EfficientNetB4, and VGG16.
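An ensemble of the kind described can be as simple as soft voting: averaging each model's class probabilities and taking the argmax. The sketch below shows the idea with NumPy; the function name and optional weighting are illustrative, not necessarily the exact combination rule of the paper.

```python
import numpy as np

def ensemble_predict(prob_lists, weights=None):
    # Soft voting: average per-model class probabilities, optionally
    # weighted, and return the predicted class for each image.
    probs = np.asarray(prob_lists)            # (n_models, n_images, n_classes)
    avg = np.average(probs, axis=0, weights=weights)
    return avg.argmax(axis=1), avg
```

Averaging probabilities rather than hard labels lets a confident model outvote an uncertain one, which is often why an ensemble beats each of its members (here, the individual CNNs).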

Keywords: deep learning, VGG16, EfficientNet, CNN, ensemble, dermoscopic images, melanoma

Procedia PDF Downloads 55
812 Postpartum Depression and Its Association with Food Insecurity and Social Support among Women in Post-Conflict Northern Uganda

Authors: Kimton Opiyo, Elliot M. Berry, Patil Karamchand, Barnabas K. Natamba

Abstract:

Background: Postpartum depression (PPD) is a major psychiatric disorder that affects women soon after birth and, in some cases, is a continuation of antenatal depression. Food insecurity (FI) and social support (SS) are known to be associated with major depressive disorder, and vice versa. This study was conducted to examine the interrelationships among FI, SS, and PPD among postpartum women in Gulu, a post-conflict region in Uganda. Methods: Cross-sectional data from postpartum women on depression symptoms, FI and SS were, respectively, obtained using the Center for Epidemiologic Studies-Depression (CES-D) scale, the Individually Focused FI Access scale (IFIAS) and the Duke-UNC functional social support scale. Standard regression methods were used to assess associations among FI, SS, and PPD. Results: A total of 239 women were studied, and 40% were found to have PPD, i.e., depressive symptom scores of ≥ 17. The mean ± standard deviation (SD) of the FI and SS scores were 6.47 ± 5.02 and 19.11 ± 4.23, respectively. In adjusted analyses, PPD symptoms were positively associated with FI (unstandardized and standardized betas of 0.703 and 0.432, respectively; standard error = 0.093; p < 0.0001) and negatively associated with SS (unstandardized and standardized betas of -0.263 and -0.135, respectively; standard error = 0.111; p = 0.019). Conclusions: Many women in this post-conflict region reported experiencing PPD. In addition, these data suggest that food security and psychosocial support interventions may help mitigate women’s experience of PPD or its severity.
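The unstandardized and standardized betas quoted above are related by a simple rescaling, beta* = b · sd(x) / sd(y), which expresses the effect in standard-deviation units. A minimal sketch (with hypothetical data, not the study's):

```python
import numpy as np

def standardized_beta(b_unstd, x, y):
    # Convert an unstandardized OLS coefficient to a standardized one:
    # beta* = b * sd(x) / sd(y), using sample standard deviations.
    return b_unstd * np.std(x, ddof=1) / np.std(y, ddof=1)
```

With a perfectly linear toy relationship y = 2x, the unstandardized slope is 2 but the standardized beta is exactly 1, illustrating how standardization removes the units of measurement.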

Keywords: postpartum depression, food insecurity, social support, post-conflict region

Procedia PDF Downloads 140
811 Constructions of Linear and Robust Codes Based on Wavelet Decompositions

Authors: Alla Levina, Sergey Taranov

Abstract:

The classical approach to providing noise immunity and integrity for information processed in computing devices and communication channels is to use linear codes. Linear codes have fast and efficient algorithms for encoding and decoding information, but these codes concentrate their detection and correction abilities on certain error configurations. Robust codes can protect against any configuration of errors with a predetermined probability. This is accomplished through the use of perfect nonlinear and almost perfect nonlinear functions to calculate the code redundancy. The paper presents an error-correcting coding scheme using the biorthogonal wavelet transform. Wavelet transforms are applied in various fields of science; applications include cleaning signals of noise, data compression, and spectral analysis of signal components. The article suggests methods for constructing linear codes based on wavelet decomposition. For the developed constructions, we build generator and check matrices that contain the scaling function coefficients of the wavelet. Based on these linear wavelet codes, we develop robust codes that provide uniform protection against all errors. We propose two constructions of robust codes: the first is based on the multiplicative inverse in a finite field; in the second, the redundancy part is the cube of the information part. The paper also investigates the characteristics of the proposed robust and linear codes.
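The second construction, redundancy equal to the cube of the information part, can be illustrated in a small prime field (the paper works over GF(2^m); GF(p) is used here only to keep the sketch short). Because cubing is nonlinear, a fixed additive error is masked for only a small fraction of codewords, which is the defining property of a robust code.

```python
P = 257  # prime modulus for illustration; the paper uses GF(2^m)

def encode(x):
    # Codeword (x, x^3 mod P): information part plus cubic redundancy.
    return (x % P, pow(x, 3, P))

def check(word):
    # A word passes iff its redundancy equals the cube of its info part.
    x, r = word
    return pow(x, 3, P) == r % P

def masking_fraction(e1, e2):
    # Fraction of codewords for which the fixed additive error (e1, e2)
    # goes undetected: the error masking probability of this code.
    masked = sum(check(((x + e1) % P, (pow(x, 3, P) + e2) % P))
                 for x in range(P))
    return masked / P
```

For any e1 != 0, masking requires 3*e1*x^2 + 3*e1^2*x + e1^3 - e2 = 0 (mod P), a quadratic with at most two roots, so at most 2/P of codewords mask the error; a linear code, by contrast, masks its undetectable errors for every codeword.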

Keywords: robust code, linear code, wavelet decomposition, scaling function, error masking probability

Procedia PDF Downloads 465
810 Modeling and Simulations of Surface Plasmon Waveguide Structures

Authors: Moussa Hamdan, Abdulati Abdullah

Abstract:

This paper presents an investigation of the fabrication of optical devices in terms of their characteristics based on the use of electromagnetic waves. Planar waveguides are used to examine the field modes (bound modes) and the parameters required for this structure. The modifications are conducted on surface plasmon-based waveguides. A simple symmetric dielectric slab structure is used and analyzed in terms of the transverse electric mode (TE mode) and transverse magnetic mode (TM mode). The paper presents mathematical and numerical solutions for simple symmetric plasmons and provides simulations of surface plasmons for field confinement. Asymmetric TM-mode calculations for dielectric surface plasmons are also provided.
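For the single metal/dielectric interface underlying such structures, the bound TM (surface plasmon) mode has propagation constant beta = k0 * sqrt(eps_m * eps_d / (eps_m + eps_d)). A short numerical sketch follows; the silver permittivity used in the test is an approximate textbook value at 633 nm, not a value from this paper.

```python
import numpy as np

def spp_propagation_constant(wavelength_m, eps_metal, eps_dielectric):
    # Surface plasmon polariton dispersion at a metal/dielectric interface:
    # beta = k0 * sqrt(eps_m * eps_d / (eps_m + eps_d)),
    # with complex eps_metal so that loss appears in Im(beta).
    k0 = 2 * np.pi / wavelength_m
    return k0 * np.sqrt(eps_metal * eps_dielectric /
                        (eps_metal + eps_dielectric + 0j))
```

A bound mode requires Re(eps_metal) < -eps_dielectric, which makes the effective index beta/k0 exceed the dielectric's refractive index, i.e., the field is confined to the interface.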

Keywords: surface plasmons, optical waveguides, semiconductor lasers, refractive index, dielectric slab

Procedia PDF Downloads 281
809 Importance of Human Factors on Cybersecurity within Organizations: A Study of Attitudes and Behaviours

Authors: Elham Rajabian

Abstract:

The rise of cybersecurity incidents is a growing threat to most organizations in general, while the impact of an incident is unique to each organization. Behavioural science needs to concentrate on employees’ behaviour in order to prepare key security mitigation options against cybersecurity incidents. There are noticeable differences among users of a computer system in terms of complying with security behaviours. These differences can be discussed under several headings, such as procrastination on tasks that must be done, the tendency to act without thinking, forethought about the unexpected implications of present-day issues, and risk-taking behaviours in security policy compliance. In this article, we introduce high-profile cyber-attacks and their impact in weakening cyber resiliency in organizations. We also give attention to human errors that influence network security. Human errors are discussed as part of the psychological factors bearing on compliance with security policies. The organizational challenges are studied in order to shape a sustainable cyber risk management approach in the related work section. Insiders’ behaviours are viewed as a cybersecurity gap from which to draw proper cyber resiliency in section 3. We outline best cybersecurity practices by discussing four CIS challenges in section 4. In this regard, we provide a guideline and metrics to measure cyber resilience in organizations in section 5. In the end, we give some recommendations for building a cybersecurity culture based on individual behaviours.

Keywords: cyber resilience, human factors, cybersecurity behavior, attitude, usability, security culture

Procedia PDF Downloads 71
808 Evaluation of Vehicle Classification Categories: Florida Case Study

Authors: Ren Moses, Jaqueline Masaki

Abstract:

This paper addresses the need for an accurate and updated vehicle classification system through a thorough evaluation of vehicle class categories, identifying errors arising from the existing system and proposing modifications. Data collected from two permanent traffic monitoring sites in Florida were used to evaluate the performance of the existing vehicle classification table. The vehicle data were collected and classified by an automatic vehicle classifier (AVC), and a video camera was used to obtain ground truth data. The Federal Highway Administration (FHWA) vehicle classification definitions were used to define vehicle classes from the video and compare them with the data generated by the AVC in order to identify the sources of misclassification. Six types of errors were identified. Modifications were made to the classification table to improve classification accuracy. The results of this study include an updated vehicle classification table with a reduction in total error of 5.1%, a step-by-step procedure for evaluating vehicle classification studies, and recommendations to improve the FHWA 13-category rule set. The recommendations indicate that the vehicle classification definitions in this scheme need to be updated to reflect the distribution of current traffic. The presented results will be of interest to state transportation departments and to consultants, researchers, engineers, designers, and planners who require accurate vehicle classification information for the planning, design and maintenance of transportation infrastructure.
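Comparing AVC output against video ground truth reduces, in essence, to tallying (true class, assigned class) pairs and inspecting the most frequent confusions. A minimal sketch of such a tally (the function name and sample data are illustrative, not the study's):

```python
from collections import Counter

def classification_errors(ground_truth, avc_output):
    # Tally (true_class, assigned_class) pairs from paired observations
    # and report the overall misclassification rate plus the three
    # most common confusion pairs.
    pairs = list(zip(ground_truth, avc_output))
    wrong = [p for p in pairs if p[0] != p[1]]
    error_rate = len(wrong) / len(pairs)
    return error_rate, Counter(wrong).most_common(3)
```

Grouping the wrong pairs this way is what lets an evaluation like this one attribute the total error to distinct error types and target the classification table modifications accordingly.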

Keywords: vehicle classification, traffic monitoring, pavement design, highway traffic

Procedia PDF Downloads 162
807 Spectral Broadening in an InGaAsP Optical Waveguide with χ(3) Nonlinearity Including Two Photon Absorption

Authors: Keigo Matsuura, Isao Tomita

Abstract:

We have studied a method to widen the spectrum of optical pulses that pass through an InGaAsP waveguide, for application to broadband optical communication. In particular, we have investigated the competition between spectral broadening arising from nonlinear refraction (the optical Kerr effect) and spectral shrinking due to two-photon absorption in an InGaAsP waveguide with chi^(3) nonlinearity. The shrunk spectrum recovers its broadening through the enhancement of the nonlinear refractive index near the bandgap of InGaAsP (bandgap wavelength 1490 nm). The broadened spectral width at around 1525 nm (196.7 THz) becomes 10.7 times wider than that at around 1560 nm (192.3 THz) without the enhancement effect, where amplified optical pulses with a pulse width of 2 ps and a peak power of 10 W propagate through a 1-cm-long InGaAsP waveguide with a cross-section of 4 um^2.
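The Kerr-driven broadening described above scales with the peak self-phase-modulation phase phi = gamma * P * L_eff, where gamma = 2*pi*n2 / (lambda * A_eff). The sketch below gives an order-of-magnitude estimate; the n2 value used in the test is a generic semiconductor-scale assumption, not the measured InGaAsP value, and loss (including two-photon absorption) is lumped into a simple effective length.

```python
import numpy as np

def spm_phase(n2_m2_per_w, peak_power_w, length_m,
              area_m2, wavelength_m, alpha_per_m=0.0):
    # Peak nonlinear (SPM) phase phi = gamma * P * L_eff with
    # gamma = 2*pi*n2 / (lambda * A_eff); spectral broadening of a
    # smooth pulse scales roughly with phi.
    gamma = 2 * np.pi * n2_m2_per_w / (wavelength_m * area_m2)
    if alpha_per_m > 0:
        # Effective length shortened by (linear) propagation loss
        l_eff = (1 - np.exp(-alpha_per_m * length_m)) / alpha_per_m
    else:
        l_eff = length_m
    return gamma * peak_power_w * l_eff
```

The near-bandgap enhancement of n2 enters directly through gamma, which is why operating at 1525 nm (closer to the 1490 nm bandgap) broadens the spectrum so much more than at 1560 nm.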

Keywords: InGaAsP waveguide, chi^(3) nonlinearity, spectral broadening, two-photon absorption

Procedia PDF Downloads 619
806 Knowledge-Attitude-Practice Survey Regarding High Alert Medication in a Teaching Hospital in Eastern India

Authors: D. S. Chakraborty, S. Ghosh, A. Hazra

Abstract:

Objective: Medication errors are a reality in all settings where medicines are prescribed, dispensed and used. High Alert Medications (HAM) are those that bear a heightened risk of causing significant patient harm when used in error. We conducted a knowledge-attitude-practice survey among residents working in a teaching hospital to assess the ground situation with regard to the handling of HAM. Methods: We plan to approach 242 residents, through purposive sampling, from among the approximately 600 currently working in the hospital. Residents in all disciplines (clinical, paraclinical and preclinical) are being targeted. A structured questionnaire, pretested on 5 volunteer residents, is being used for data collection. The questionnaire is administered to residents individually through face-to-face interview by two raters, while the residents are on duty but not during rush hours. Results: Of the 156 residents approached so far, data from 140 have been analyzed, the rest having refused participation. Although background knowledge exists for the majority of respondents, awareness levels regarding HAM are moderate, and attitudes are non-uniform. Fewer than 70% of respondents were able to correctly identify most (> 80%) HAM in three common settings: accident and emergency, obstetrics, and the intensive care unit. Several potential errors in practice have been identified. The study is ongoing. Conclusions: The situation requires corrective action. There is an urgent need to improve awareness regarding HAM for the sake of patient safety. The pharmacology department can take the lead in designing an awareness campaign with support from the hospital administration.

Keywords: high alert medication, medication error, questionnaire, resident

Procedia PDF Downloads 106