Search results for: noisy forensic speaker verification
478 Wind Wave Modeling Using MIKE 21 SW Spectral Model
Authors: Pouya Molana, Zeinab Alimohammadi
Abstract:
Determining wind wave characteristics is essential for coastal and marine engineering projects such as designing coastal and marine structures and estimating sediment transport and coastal erosion rates. To predict significant wave height (H_s), this study applies the third-generation spectral wave model MIKE 21 SW along with the CEM model. Two data sets of meteorological and wave measurements are used for SW model calibration and verification. The model was driven by time-varying wind forcing, and the results showed that the difference ratio mean, the standard deviation of the difference ratio, and the correlation coefficient of the SW model for the H_s parameter are 1.102, 0.279, and 0.983, respectively, whereas the corresponding values for the CEM method are 0.869, 1.317, and 0.8359. Comparing these results reveals that the CEM method has larger errors than the MIKE 21 SW third-generation spectral wave model, and that a higher correlation coefficient does not necessarily mean higher accuracy.
Keywords: MIKE 21 SW, CEM method, significant wave height, difference ratio
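The three verification statistics quoted above can be reproduced with a short sketch. Note the assumptions: the difference ratio is taken here to mean the modelled-to-measured ratio of H_s, and the paired series below are illustrative placeholders, not the study's data.

```python
import numpy as np

# Hypothetical paired series: measured vs. modelled significant wave height H_s (m).
h_measured = np.array([1.2, 1.5, 0.9, 2.1, 1.8, 1.1])
h_modelled = np.array([1.3, 1.6, 1.0, 2.3, 2.0, 1.2])

# Assumed definition: difference ratio = modelled / measured, per observation.
ratio = h_modelled / h_measured

ratio_mean = ratio.mean()                         # difference ratio mean
ratio_std = ratio.std(ddof=1)                     # standard deviation of the ratio
corr = np.corrcoef(h_measured, h_modelled)[0, 1]  # correlation coefficient
```

A ratio mean above 1 with a small standard deviation indicates a slight but consistent over-prediction, which is the kind of pattern the SW-model figures (1.102, 0.279) describe.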
Procedia PDF Downloads 404
477 [Keynote Talk]: Quest for Sustainability in the Midst of Conflict Between Climate and Energy Security
Authors: Deepak L. Waikar
Abstract:
Unprecedented natural and human-made disasters have been responsible for the loss of hundreds of thousands of lives, the injury and displacement of millions of people, and damages worth billions of dollars in various parts of the world. Scientists, experts, associations, and the United Nations have been warning about the colossal disregard for human safety and the environment in exploiting natural resources to satisfy an insatiable appetite for economic growth and the rising lavish lifestyles of the rich. The usual blame game is routinely played at international forums and summits by vested interests in developing and developed nations, while billions of people continue to suffer in abject energy poverty. Energy security, on the other hand, is becoming elusive owing to the dominance of a few players in the market, poor energy governance mechanisms, volatile prices, and geopolitical conflicts in the supply chain. Such conflicting scenarios have been cited as a major barrier to the transformation to a low-carbon economy. Policy makers, researchers, academics, businesses, industries, and communities have been evaluating sustainable alternatives, albeit at a snail's pace. This presentation focuses on technologies, energy governance, policies and practices, economics, and public concerns regarding the safe, prudent, and sustainable harnessing of energy resources. Current trends and potential research and development projects in the power and energy sectors that students can undertake will be discussed. The speaker will highlight how youths can be engaged in meaningful, safe, enriching, inspiring, and value-added self-development programmes in our quest for sustainability in the midst of the conflict between climate and energy security.
Keywords: clean energy, energy policy, energy security, sustainable energy
Procedia PDF Downloads 488
476 The Comparison of the Reliability Margin Measure for the Different Concepts in the Slope Analysis
Authors: Filip Dodigovic, Kreso Ivandic, Damir Stuhec, S. Strelec
Abstract:
A general analysis of the differences between the former and new design concepts in geotechnical engineering is carried out. The application of the new regulations calls for a real adaptation of the computational principles of limit states, i.e. a uniform way of analyzing engineering tasks. In general, it is not possible to match the limit-state verification procedure unambiguously with that used in structural engineering. The reasons are the impossibility of a fully consistent common probabilistic basis for the analysis, and the fundamental effect of material properties on the values of actions and the influence of actions on resistance. Consequently, separate factorization with partial coefficients, as used in structural engineering, cannot be applied. The design procedures for slope stability analysis are examined in detail in the light of the use of limit states in relation to the concept of allowable stresses, and the safety margins in the slope stability analysis are quantified for both approaches. When analyzing the stability of a slope by strict application of the forms adopted in the new regulations for significant external temporary and/or seismic actions, the equivalent margin of safety increases; the consequence is the emergence of more conservative solutions.
Keywords: allowable pressure, Eurocode 7, limit states, slope stability
Procedia PDF Downloads 337
475 Geospatial Curve Fitting Methods for Disease Mapping of Tuberculosis in Eastern Cape Province, South Africa
Authors: Davies Obaromi, Qin Yongsong, James Ndege
Abstract:
Both exact and inexact methods exist for interpolating scattered or regularly distributed data; some are suited to data on a regular grid and others to an irregular grid. In spatial epidemiology, it is important to examine how disease prevalence rates are distributed in space and how they relate to each other within a defined distance and direction. In this study, linear and biharmonic spline methods were implemented in MATLAB for the geographic and graphic representation of disease prevalence, and used to identify, localize, and compare smoothing in the distribution patterns of tuberculosis (TB) in the Eastern Cape Province. The aim of this study is to produce a smoother graphical disease map of TB prevalence patterns using 3-D curve-fitting techniques, especially biharmonic splines, which readily suppress noise by seeking a least-squares fit rather than exact interpolation. The data sets are represented as XYZ triplets, where X and Y are the spatial coordinates and Z is the variable of interest, in this case TB counts in the province. The smoothing spline fits a smooth curve to a set of noisy observations using a spline function, and has become a conventional method owing to its high precision, simplicity, and flexibility. Surface and contour plots are produced for TB prevalence at the provincial level for 2012-2015. The general outlook of all the fits showed a systematic pattern in the distribution of TB cases in the province, consistent with spatial statistical analyses previously carried out there. This method is rarely used in disease-mapping applications, but it has the advantage that it can be evaluated at arbitrary locations rather than only on a rectangular grid, as in most traditional GIS methods of geospatial analysis.
Keywords: linear, biharmonic splines, tuberculosis, South Africa
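An analogous least-squares smoothing fit of scattered XYZ triplets can be sketched in Python; the thin-plate-spline radial basis function is the 2-D biharmonic spline, so SciPy's `RBFInterpolator` with a positive smoothing parameter behaves like the noise-suppressing fit described above. The synthetic surface below is a stand-in for the TB counts, not the study's data.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Illustrative XYZ triplets: (x, y) spatial coordinates, z a noisy count surface.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 10, size=(50, 2))
z = np.sin(xy[:, 0]) + np.cos(xy[:, 1]) + rng.normal(0, 0.1, 50)

# Thin-plate spline = 2-D biharmonic spline; smoothing > 0 gives a
# least-squares fit that suppresses noise rather than interpolating exactly.
fit = RBFInterpolator(xy, z, kernel='thin_plate_spline', smoothing=1.0)

# Evaluate on a regular grid, ready for surface/contour plotting.
gx, gy = np.meshgrid(np.linspace(0, 10, 25), np.linspace(0, 10, 25))
grid = np.column_stack([gx.ravel(), gy.ravel()])
z_smooth = fit(grid).reshape(gx.shape)
```

Because the fit is a smooth function rather than a gridded lookup, it can be evaluated at any location of interest, which is the advantage over grid-bound GIS interpolation noted in the abstract.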
Procedia PDF Downloads 240
474 Reliability Assessment Using Full Probabilistic Modelling for Carbonation and Chloride Exposures, Including Initiation and Propagation Periods
Authors: Frank Papworth, Inam Khan
Abstract:
Fib's Model Code 2020 has four approaches for design-life verification. Historically, 'deemed to satisfy' provisions have been the principal approach, but they offer limited options for materials and covers. The use of the equation in fib's Model Code for Service Life Design to predict the time to corrosion initiation has become increasingly popular as a way to justify further options, but in some cases the analysis approaches are incorrect, and even when the equations are computed using full probabilistic analysis, there are common mistakes. This paper reviews the work of recent fib commissions on implementing the service-life model to assess the reliability of durability designs, including both initiation and propagation periods. The paper goes on to consider the assessment of deemed-to-satisfy requirements in national codes and the influence of various options, including different steel types, various cement systems, and the quality of concrete and cover, on the reliability achieved. As modelling is based on achieving an agreed target reliability, consideration is given to how a project might determine an appropriate target reliability.
Keywords: chlorides, marine, exposure, design life, reliability, modelling
Procedia PDF Downloads 236
473 Fiber Based Pushover Analysis of Reinforced Concrete Frame
Authors: Shewangizaw Tesfaye Wolde
Abstract:
The engineering community has developed a method called performance-based seismic design, in which structures are designed to predefined performance levels set by the parties. Since structures are designed economically, they go beyond their elastic limit under the maximum actions expected during their life, which calls for nonlinear analysis. In this paper, conventional pushover analysis (nonlinear static analysis) is used for the performance assessment of a case-study reinforced concrete (RC) frame building located in Addis Ababa, Ethiopia, where the peak ground acceleration proposed by the RADIUS 1999 project and others is more than twice that of EBCS-8:1995, taking a critical planar frame. A fiber beam-column model is used to capture material nonlinearity with the tension-stiffening effect. The reliability of the fiber model and the validation of the software outputs are checked in the verification chapter. The aim of this paper is therefore to propose a way to assess the structural performance of existing reinforced concrete frame buildings, as well as a design check.
Keywords: seismic, performance, fiber model, tension stiffening, reinforced concrete
Procedia PDF Downloads 77
472 The Non-Existence of Perfect 2-Error Correcting Lee Codes of Word Length 7 over Z
Authors: Catarina Cruz, Ana Breda
Abstract:
Tiling problems have been capturing the attention of many mathematicians due to their real-life applications. In this study, we deal with tilings of Zⁿ by Lee spheres, where n is a positive integer, these tilings being related to error-correcting codes for the transmission of information over a noisy channel. We focus on the question 'for what values of n and r does the n-dimensional Lee sphere of radius r tile Zⁿ?'. It is conjectured that the n-dimensional Lee sphere of radius r does not tile Zⁿ for n ≥ 3 and r ≥ 2. Here, we prove that it is not possible to tile Z⁷ with Lee spheres of radius 2, presenting a proof based on a combinatorial method and faithful to the geometric idea of the problem. The non-existence of such tilings has been studied by several authors, the cases in which the radius of the Lee spheres is equal to 2 being considered the most difficult. The relation between these tilings and error-correcting codes is established by considering the center of a Lee sphere as a codeword and the other elements of the sphere as words which are decoded to the central codeword. When the Lee spheres of radius r centered at the elements of a set M ⊂ Zⁿ tile Zⁿ, M is a perfect r-error-correcting Lee code of word length n over Z, denoted by PL(n, r). Our strategy to prove the non-existence of PL(7, 2) codes is based on assuming the existence of such a code M. Without loss of generality, we suppose that O ∈ M, where O = (0, ..., 0). Since we are dealing with Lee spheres of radius 2, O covers all words at distance two or fewer units from it. By the definition of a PL(7, 2) code, each word at distance three units from O must be covered by a unique codeword of M, and these words have to be covered by codewords at distance five units from O.
We prove the non-existence of PL(7, 2) codes by showing that it is not possible to cover all the referred words without superposition of Lee spheres whose centers are at distance five units from O, contradicting the definition of a PL(7, 2) code. We achieve this contradiction by combining the cardinalities of particular subsets of codewords at distance five units from O. There is an extensive literature on codes in the Lee metric; here, we present a new approach to proving the non-existence of PL(7, 2) codes.
Keywords: Golomb-Welch conjecture, Lee metric, perfect Lee codes, tilings
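The counting quantities the argument rests on are easy to verify by brute force: the size of the Lee sphere of radius 2 in Z⁷ (the words one codeword covers) and the number of words at Lee distance exactly 3 from the origin (each of which must be covered by exactly one other codeword). This sketch is only a numerical check, not the combinatorial proof itself.

```python
from itertools import product

def lee_sphere_size(n, r):
    """Count points of Z^n within L1 (Lee) distance r of the origin."""
    return sum(1 for p in product(range(-r, r + 1), repeat=n)
               if sum(abs(c) for c in p) <= r)

def words_at_distance(n, d):
    """Count points of Z^n at L1 distance exactly d from the origin."""
    return sum(1 for p in product(range(-d, d + 1), repeat=n)
               if sum(abs(c) for c in p) == d)

sphere = lee_sphere_size(7, 2)      # words covered by one codeword of PL(7, 2)
boundary = words_at_distance(7, 3)  # words that force codewords at distance 5
print(sphere, boundary)             # 113 462
```

These match the closed form |B(n, r)| = Σᵢ 2ⁱ C(n, i) C(r, i): for n = 7, r = 2 this gives 1 + 28 + 84 = 113.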
Procedia PDF Downloads 161
471 Evaluation of the Analytic for Hemodynamic Instability as a Prediction Tool for Early Identification of Patient Deterioration
Authors: Bryce Benson, Sooin Lee, Ashwin Belle
Abstract:
Unrecognized or delayed identification of patient deterioration is a key cause of in-hospital adverse events. Clinicians rely on vital-signs monitoring to recognize patient deterioration. However, owing to ever-increasing nursing workloads and the manual effort required, vital signs tend to be measured and recorded intermittently and inconsistently, causing large gaps in patient monitoring. Additionally, during deterioration, the body's autonomic nervous system activates compensatory mechanisms, making the vital signs lagging indicators of the underlying hemodynamic decline. This study analyzes the predictive efficacy of the Analytic for Hemodynamic Instability (AHI) system, an automated tool designed to help clinicians identify deteriorating patients early. The lead-time analysis in this retrospective observational study assesses how far in advance AHI predicted deterioration before an episode of hemodynamic instability (HI) became evident through vital signs. Results indicate that of the 362 episodes of HI in this study, 308 (85%) were correctly predicted by the AHI system, with a median lead time of 57 minutes and a mean of 4 hours (240.5 minutes). Of the 54 episodes not predicted, AHI detected 45 while the episode of HI was ongoing. Of the 9 undetected episodes, 5 were missed due to missing or noisy input ECG data during the episode. In total, AHI was able to either predict or detect 98.9% of all episodes of HI in this study. These results suggest that AHI could provide an additional 'pair of eyes' on patients, continuously filling the monitoring gaps and consequently giving the patient care team the ability to be far more proactive in patient monitoring and adverse-event management.
Keywords: clinical deterioration prediction, decision support system, early warning system, hemodynamic status, physiologic monitoring
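The headline figures can be reproduced from the reported counts. One caveat: the abstract does not state the denominator of the 98.9% figure explicitly; excluding the 5 episodes with unusable ECG input is our reading, since (308 + 45) / (362 − 5) matches it.

```python
# Counts reported in the abstract.
total_episodes = 362
predicted_in_advance = 308   # predicted before onset of hemodynamic instability
detected_during = 45         # detected while the episode was ongoing
missed_bad_data = 5          # missed due to missing/noisy ECG input

prediction_rate = predicted_in_advance / total_episodes
# Assumed reading: episodes with unusable input data excluded from denominator.
overall_rate = (predicted_in_advance + detected_during) / (total_episodes - missed_bad_data)
print(f"predicted: {prediction_rate:.1%}, predicted or detected: {overall_rate:.1%}")
```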
Procedia PDF Downloads 190
470 Postmortem Analysis of Lidocaine in Women Died of Criminal Abortion
Authors: Mohammed A. Arishy, Sultan M. Alharbi, Mohammed A. Hakami, Farid M. Abualsail, Mohammad A. Attafi, Riyadh M. Tobaiqi, Hussain M. Alsalem, Ibraheem M. Attafi
Abstract:
Lidocaine is the most common local anesthetic used for paracervical block to reduce the pain associated with surgical abortion. A 25-year-old pregnant woman died before reaching hospital while undergoing a criminal abortion during the first trimester. Post-mortem investigation and autopsy showed no clear findings; therefore, toxic substances had to be suspected and searched for in routine toxicological analysis. In this case report, the postmortem concentration of lidocaine was determined in blood, brain, liver, kidney, and stomach. For lidocaine identification and quantification, samples were extracted using solid-phase extraction and analyzed by GC-MS (Shimadzu, Japan). Initial screening and confirmatory analysis showed that only lidocaine was detected in all collected samples; no other toxic substances or alcohol were found. The concentrations of lidocaine were 19, 17, 14, 7, and 3 µg/ml in the brain, blood, kidney, liver, and stomach, respectively. The lidocaine blood concentration (17 µg/ml) was at a toxic level and may have resulted in death. Among the tissues, the brain showed the highest level of lidocaine, followed by the kidney, liver, and stomach.
Keywords: forensic toxicology, GC-MS, lidocaine, postmortem
Procedia PDF Downloads 210
469 Bilingualism: A Case Study of Assamese and Bodo Classifiers
Authors: Samhita Bharadwaj
Abstract:
This is an empirical study of classifiers in Assamese and Bodo, two genetically unrelated languages of India. The objective of the paper is to address the language contact between Assamese and Bodo as reflected in classifiers. The Bodo data were collected through fieldwork, recording narratives and folk tales and eliciting specific data from speakers; the Assamese data are self-produced, the author being a native speaker of the language. Assamese is the easternmost New Indo-Aryan (henceforth NIA) language, mainly spoken in the Brahmaputra valley of Assam and some other north-eastern states of India. It is the lingua franca of Assam and is creolized in the neighbouring state of Nagaland. Bodo, on the other hand, is a Tibeto-Burman (henceforth TB) language of the Bodo-Garo group. It has the highest number of speakers among the TB languages of Assam. However, compared to Assamese, it is still a less documented language, and owing to the prestige of Assamese, all Bodo speakers are fluent bilinguals in Assamese, though the opposite is not the case. In this context, classifiers, a characteristic phenomenon of TB languages but much less so of NIA languages, present an interesting case study of language contact caused by bilingualism. As a result of its contact with the classifier-rich TB languages, Assamese has developed the richest classifier system among the IA languages of India. Yet, as part of the rampant borrowing of Assamese words and patterns into Bodo, Bodo is seen to borrow even Assamese classifiers into its system. This paper analyses the borrowed classifiers of Bodo and traces the route of this borrowing to the number systems of the languages: as Bodo speakers replace the higher numerals, from five upwards, with Assamese ones, they also choose the Assamese classifiers to attach to these numbers.
Thus, the partial loss of numerals in Bodo as a result of language contact and bilingualism in Assamese is found to be the reason behind the borrowing of classifiers into Bodo. The significance of the study lies in exploring an interesting aspect of language contact in Assam, and it is hoped that it will attract further research on bilingualism and classifiers in Assam.
Keywords: Assamese, bi-lingual, Bodo, borrowing, classifier, language contact
Procedia PDF Downloads 224
468 Application of Relative Regional Total Energy in Rotary Drums with Axial Segregation Characteristics
Authors: Qiuhua Miao, Peng Huang, Yifei Ding
Abstract:
Particles with different properties tend to be unevenly distributed along the axial direction of a rotating drum, which is usually ignored. It is therefore important to study the relationship between axial segregation characteristics and particle crushing efficiency in longer drums. In this paper, a relative regional total energy (RRTE) index is proposed, which aims to evaluate the overall distribution characteristics of the crushing energy. Numerical simulation verification shows that the proposed RRTE index reflects the overall grinding effect more comprehensively, clearly representing the crushing energy distribution in different drum regions. Furthermore, the proposed method is applied to the relation between axial segregation and crushing energy in drums. Compared with a radial section, the collision energy loss over an axial section better reflects the overall crushing effect in long drums. The axial segregation characteristics directly affect the total energy distribution between medium and abrasive, reducing the overall crushing efficiency; they should therefore be avoided as much as possible in crushing with long rotary drums.
Keywords: relative regional total energy, crushing energy, axial segregation characteristics, rotary drum
Procedia PDF Downloads 90
467 Optimization of SWL Algorithms Using Alternative Adder Module in FPGA
Authors: Tayab D. Memon, Shahji Farooque, Marvi Deshi, Imtiaz Hussain Kalwar, B. S. Chowdhry
Abstract:
Recently, the hardware synthesis in FPGA of a single-bit ternary FIR-like filter (SBTFF) was reported and compared with a multi-bit FIR filter of similar spectral characteristics; the results showed that the SBTFF dominates the multi-bit filter overall. In this paper, an optimized adder module for ternary quantized sigma-delta modulated signals is presented. The adder is simulated using ModelSim for functional verification; the area-performance figures of the proposed adder were obtained through synthesis in Xilinx and compared to conventional adder trees. The synthesis results show that the proposed adder tree achieves higher clock rates and lower chip area for larger numbers of inputs to the adder block, whereas the conventional adder tree achieves better performance and lower chip area for smaller numbers of inputs to the same adder block. These results enhance the usefulness of existing short-word-length DSP algorithms for fast and efficient mobile communication.
Keywords: short word length (SWL), DSP algorithms, FPGA, SBTFF, VHDL
Procedia PDF Downloads 348
466 Detection of Cyberattacks on the Metaverse Based on First-Order Logic
Authors: Sulaiman Al Amro
Abstract:
There are currently considerable challenges concerning data security and privacy, particularly in relation to modern technologies. This includes the virtual world known as the Metaverse, a virtual space that integrates various technologies and is therefore susceptible to cyber threats such as malware, phishing, and identity theft. This has led recent studies to propose the development of Metaverse forensic frameworks and the integration of advanced technologies, including machine learning, for intrusion detection and security. In this context, the application of first-order logic offers a formal and systematic approach to defining the conditions of cyberattacks, thereby contributing to the development of effective detection mechanisms. In addition, formalizing the rules and patterns of cyber threats has the potential to enhance the overall security posture of the Metaverse and, thus, the integrity and safety of this virtual environment. The current paper focuses on the primary actions employed by avatars in potential attacks, using Interval Temporal Logic (ITL) and behavior-based detection to detect an avatar's abnormal activities within the Metaverse. The research established that the proposed framework attained an accuracy of 92.307%, the experimental results demonstrating the efficacy of ITL, including its superior performance in addressing the threats posed by avatars within the Metaverse domain.
Keywords: security, privacy, metaverse, cyberattacks, detection, first-order logic
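The idea of expressing attack conditions as logical rules over avatar actions can be sketched as follows. This is only an illustrative behavior-based rule in the spirit of the abstract, not the paper's ITL framework; the event fields, the `asset_transfer` action, and the thresholds are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class AvatarEvent:
    avatar_id: str
    action: str       # e.g. "teleport", "asset_transfer", "login" (hypothetical)
    timestamp: float  # seconds

# First-order-logic-style rule: "for all avatars a, if a performs more than
# MAX_TRANSFERS asset transfers within WINDOW seconds, flag a as suspicious."
MAX_TRANSFERS = 3
WINDOW = 60.0

def suspicious_avatars(events):
    flagged = set()
    recent = {}  # avatar_id -> timestamps of transfers in the sliding window
    for e in sorted(events, key=lambda e: e.timestamp):
        if e.action != "asset_transfer":
            continue
        times = recent.setdefault(e.avatar_id, [])
        times.append(e.timestamp)
        # Keep only transfers inside the window ending at this event.
        times[:] = [t for t in times if e.timestamp - t <= WINDOW]
        if len(times) > MAX_TRANSFERS:
            flagged.add(e.avatar_id)
    return flagged
```

A temporal-logic engine would generalize this hand-written sliding window to arbitrary interval formulas over the event stream.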
Procedia PDF Downloads 41
465 Identification of Disease Causing DNA Motifs in Human DNA Using Clustering Approach
Authors: G. Tamilpavai, C. Vishnuppriya
Abstract:
Studying DNA (deoxyribonucleic acid) sequences is useful for understanding biological processes and is applied in fields such as diagnostic and forensic research. DNA carries the hereditary information of humans and almost all other organisms and is passed on to their generations. Early detection of a defective DNA sequence may lead to many developments in bioinformatics, yet various tedious techniques are currently used to identify defective DNA. The proposed work analyzes a given sequence to identify cancer-causing DNA motifs. Initially, the human DNA sequence is separated into k-mers using a k-mer separation rule. The separated k-mers are clustered using a Self-Organizing Map (SOM). Using the Levenshtein distance measure, the cancer-associated DNA motif is identified from the k-mer clusters. Experimental results indicate the presence or absence of a cancer-causing DNA motif: if the cancer-associated motif is found, the input is declared a cancer-disease-causing DNA sequence; otherwise, it is declared a normal sequence. Finally, the elapsed time for finding the cancer-causing DNA motif via cluster formation is calculated and compared with the normal process of finding the motif; locating the cancer-associated motif is easier with the cluster-formation process. The proposed work will be an initial aid for research related to genetic diseases.
Keywords: bioinformatics, cancer motif, DNA, k-mers, Levenshtein distance, SOM
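The two core primitives of the pipeline, k-mer separation and Levenshtein matching, can be sketched directly (the SOM clustering step is omitted here). The motif and sequence below are hypothetical placeholders, not cancer-associated motifs from the study.

```python
def kmers(seq, k):
    """Split a DNA sequence into overlapping k-mers."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def levenshtein(a, b):
    """Classic dynamic-programming edit distance, row by row."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

# Flag k-mers within one edit of a (hypothetical) disease-associated motif.
motif = "GATTACA"
sequence = "ACGGATTACATTG"
hits = [km for km in kmers(sequence, len(motif)) if levenshtein(km, motif) <= 1]
print(hits)  # ['GATTACA']
```

In the clustered approach, the distance computation would run only against the k-mers in the nearest SOM cluster instead of the whole sequence, which is the source of the reported speedup.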
Procedia PDF Downloads 188
464 Investigation of the Effect of Quenching Media on Abrasive Wear in Grade Medium Carbon Steel
Authors: Abbas S. Alwan, Waleed K. Hussan
Abstract:
In this paper, a general evaluation of possible heat treatments of steel has been carried out with a view to the conditions of real abrasive wear of rotavator blades in soil. This technique appears promising for improving the quality of agricultural components working in soil in dry conditions. Abrasive wear resistance is very important in many applications, and in most cases it is directly correlated with the hardness of the material's surface. Heat treatments with quenching in various media (still air, cottonseed oil, and 10% brine), followed by low-temperature tempering (250°C), were applied to AISI 1030 steel. After heat treatment, the specimens were subjected to wear in soil during tillage to determine their actual wear rate by the weight-loss method. It was found that wear resistance increases with hardness, which varies with the quenching medium as follows: 30 HRC, 45 HRC, 52 HRC, and 60 HRC for the non-treated (as-received) condition and for cooling in still air, cottonseed oil, and 10% brine, respectively. A martensitic structure with retained austenite can be obtained depending on the quenching medium. Wear was evident on the worn surfaces of the steels used in this work.
Keywords: microstructures, hardness, abrasive wear, heat treatment, soil texture
Procedia PDF Downloads 389
463 Contaminated Sites Prioritization Process Promoting and Redevelopment Planning
Authors: Che-An Lin, Wan-Ying Tsai, Ying-Shin Chen, Yu-Jen Chung
Abstract:
With the number and area of contaminated sites continuing to increase in Taiwan, the government has to make a priority list for screening contaminated sites under limited funds and information. This study investigated the 261 contaminated sites announced by the Taiwan EPA (excluding agricultural lands); after preliminary screening, 211 valid records were used to propose a screening system, and the removed contaminated sites were used to check its accuracy. The system comprises two dimensions, which generate a ranking and, plotted on an XY axis, construct four quadrants: one dimension covers environmental and social priority, and the other economic priority. The evaluated items include population density, land value, proximity to traffic hubs, pollutant compounds, pollutant concentrations, pollutant transport pathways, land use, site area, and water conductivity. The classification results of this screening are: 1. priority promoting sites (10%); 2. environmental and social priority sites (17%); 3. economic priority sites (30%); 4. non-priority sites (43%). Finally, this study used three of the removed contaminated sites to verify the screening system; each of them fell, as expected, into the priority-promoting or economic-priority categories.
Keywords: contaminated sites, redevelopment, environmental, economics
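The two-dimensional quadrant classification can be sketched as follows. The threshold and the normalized scores are placeholders; the study's actual item weights and cut-offs are not given in the abstract.

```python
# Each site gets two normalized scores in [0, 1]: environmental/social and
# economic. The 0.5 threshold below is an assumed, illustrative cut-off.
def classify(env_social, economic, threshold=0.5):
    """Assign a site to one of the four screening quadrants."""
    if env_social >= threshold and economic >= threshold:
        return "priority promoting site"
    if env_social >= threshold:
        return "environmental/social priority"
    if economic >= threshold:
        return "economic priority"
    return "non-priority"

print(classify(0.8, 0.9))  # priority promoting site
```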
Procedia PDF Downloads 483
462 Use of Vegetative Coverage for Slope Stability in the Brazilian Midwest: Case Study
Authors: Weber A. R. Souza, Andre A. N. Dantas, Marcio A. Medeiros, Rafaella F. Costa
Abstract:
Erosive processes are natural phenomena that continuously change the soil through the action of natural erosive agents; their speed can be intensified or retarded by factors such as climate, slope, type of parent rock, vegetation, and anthropic activities, the latter being very relevant in areas occupied without planning and urban infrastructure. Inadequate housing sites, associated with an inefficient urban drainage network and a lack of vegetation cover, intensify the erosive processes, which over time reach alarming proportions, as in the case of the erosion in Planaltina, in the Federal District, in the Brazilian central west. The aim of this work was therefore to compare the use of vetiver grass and alfalfa as vegetation cover for slope protection. To that end, the scientific literature was surveyed on the improvement in soil properties they provide, and the safety factor was verified through the simulation of slopes of different heights and inclinations using the SLOPE/W software. Vetiver grass presented somewhat more satisfactory results than alfalfa, but alfalfa achieved results close to those of vetiver grass in a shorter planting time.
Keywords: erosive processes, planting, slope protection, vegetation cover
Procedia PDF Downloads 180
461 The Effect of Speech-Shaped Noise and Speaker’s Voice Quality on First-Grade Children’s Speech Perception and Listening Comprehension
Authors: I. Schiller, D. Morsomme, A. Remacle
Abstract:
Children's ability to process spoken language develops until the late teenage years. At school, where efficient spoken-language processing is key to academic achievement, listening conditions are often unfavorable: high background noise and a poor teacher's voice are typical sources of interference. These factors can be assumed to affect primary-school children in particular, because their language and literacy skills are still low. While it is generally accepted that background noise and an impaired voice impede spoken-language processing, there is an increasing need to analyze their impact on specific linguistic areas. Against this background, the aim of the study was to investigate the effect of speech-shaped noise and an imitated dysphonic voice on first-grade primary-school children's speech perception and sentence comprehension. Via headphones, 5- to 6-year-old children recruited within the French-speaking community of Belgium listened to and performed a minimal-pair discrimination task and a sentence-picture matching task. Stimuli were randomly presented according to four experimental conditions: (1) normal voice / no noise, (2) normal voice / noise, (3) impaired voice / no noise, and (4) impaired voice / noise. The primary outcome measure was the task score: how did performance vary with listening condition? Preliminary results for speech perception and sentence comprehension will be presented and carefully interpreted in the light of past findings. This study supports our understanding of children's language-processing skills under adverse conditions, and its results shall serve as a starting point for probing new measures to optimize children's learning environment.
Keywords: impaired voice, sentence comprehension, speech perception, speech-shaped noise, spoken language processing
Procedia PDF Downloads 193
460 Developing an Integrated Clinical Risk Management Model
Authors: Mohammad H. Yarmohammadian, Fatemeh Rezaei
Abstract:
Introduction: Improving patient safety in health systems is one of the main priorities in healthcare systems, so clinical risk management in organizations has become increasingly significant. Although several tools have been developed for clinical risk management, each has its own limitations. Aims: This study aims to develop a comprehensive tool that can complete the limitations of each risk assessment and management tools with the advantage of other tools. Methods: Procedure was determined in two main stages included development of an initial model during meetings with the professors and literature review, then implementation and verification of final model. Subjects and Methods: This study is a quantitative − qualitative research. In terms of qualitative dimension, method of focus groups with inductive approach is used. To evaluate the results of the qualitative study, quantitative assessment of the two parts of the fourth phase and seven phases of the research was conducted. Purposive and stratification sampling of various responsible teams for the selected process was conducted in the operating room. Final model verified in eight phases through application of activity breakdown structure, failure mode and effects analysis (FMEA), healthcare risk priority number (RPN), root cause analysis (RCA), FT, and Eindhoven Classification model (ECM) tools. This model has been conducted typically on patients admitted in a day-clinic ward of a public hospital for surgery in October 2012 to June. Statistical Analysis Used: Qualitative data analysis was done through content analysis and quantitative analysis done through checklist and edited RPN tables. Results: After verification the final model in eight-step, patient's admission process for surgery was developed by focus discussion group (FDG) members in five main phases. Then with adopted methodology of FMEA, 85 failure modes along with its causes, effects, and preventive capabilities was set in the tables. 
The tables developed to calculate the RPN index contain three criteria for severity, two for probability, and two for preventability. Three failure modes were above the determined significant-risk threshold (RPN > 250). Over a 3-month period, patient misidentification incidents were the most frequently reported events. Each RPN criterion of the misidentification events was compared, and the RPN numbers obtained for the three reported misidentification events could be checked against the scores predicted in the previous phase. Root causes identified through the fault tree were categorized with the ECM. The wrong-side surgery event was selected by the focus discussion group for a proposed improvement action. The most important causes were a lack of planning for the number and priority of surgical procedures. After prioritization of the suggested interventions, a computerized registration system in the health information system (HIS) was adopted to prepare the action plan in the final phase. Conclusion: The complexity of the healthcare industry requires risk managers to have a multifaceted vision. Applying only retrospective or only prospective tools for risk management therefore does not work, and each organization must create the conditions for applying both kinds of method. The results of this study showed that the integrated clinical risk management model can be used in hospitals as an efficient tool to improve clinical governance.
Keywords: failure modes and effects analysis, risk management, root cause analysis, model
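The RPN screening described above reduces to a product of criterion scores compared against the significance threshold (RPN > 250). A minimal sketch in Python; the scoring scales and example failure modes below are hypothetical, not the study's edited RPN tables:

```python
# Illustrative RPN screening for FMEA failure modes.
# Scores and modes are invented; the threshold (250) is from the abstract.

def rpn(severity, probability, preventability):
    """Risk Priority Number as the product of the three criterion scores."""
    return severity * probability * preventability

def significant_modes(failure_modes, threshold=250):
    """Return names of failure modes whose RPN exceeds the threshold."""
    return [fm["name"] for fm in failure_modes
            if rpn(fm["severity"], fm["probability"], fm["preventability"]) > threshold]

modes = [
    {"name": "patient misidentification", "severity": 9,  "probability": 6, "preventability": 5},
    {"name": "wrong side surgery",        "severity": 10, "probability": 4, "preventability": 7},
    {"name": "delayed documentation",     "severity": 3,  "probability": 5, "preventability": 4},
]
print(significant_modes(modes))  # ['patient misidentification', 'wrong side surgery']
```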
Procedia PDF Downloads 250
459 Application of Rapid Prototyping to Create Additive Prototype Using Computer System
Authors: Meftah O. Bashir, Fatma A. Karkory
Abstract:
Rapid prototyping is a new group of manufacturing processes that allows fabrication of physical parts of any complexity using a layer-by-layer deposition technique driven directly from a computer system. Rapid prototyping greatly reduces the time and cost necessary to bring a new product to market. The prototypes made by these systems are used in a range of industrial applications, including design evaluation, verification, testing, and as patterns for casting processes. These processes employ a variety of materials and mechanisms to build up the layers that form the part. The present work was to build an FDM prototyping machine that could control X-Y motion and material deposition, to generate two-dimensional and three-dimensional complex shapes. The study focused on the deposition of wax material and set out to determine the properties of the wax used, in order to enable better control of the FDM process. It examines the integration of a computer-controlled electro-mechanical system with the traditional FDM additive prototyping process. The characteristics of the wax were also analysed in order to optimize the model production process; these included the wax phase-change temperature, wax viscosity, and wax droplet shape during processing.
Keywords: rapid prototyping, wax, manufacturing processes, shape
Procedia PDF Downloads 466
458 The Use of the Flat Field Panel for the On-Ground Calibration of Metis Coronagraph on Board of Solar Orbiter
Authors: C. Casini, V. Da Deppo, P. Zuppella, P. Chioetto, A. Slemer, F. Frassetto, M. Romoli, F. Landini, M. Pancrazzi, V. Andretta, E. Antonucci, A. Bemporad, M. Casti, Y. De Leo, M. Fabi, S. Fineschi, F. Frassati, C. Grimani, G. Jerse, P. Heinzel, K. Heerlein, A. Liberatore, E. Magli, G. Naletto, G. Nicolini, M.G. Pelizzo, P. Romano, C. Sasso, D. Spadaro, M. Stangalini, T. Straus, R. Susino, L. Teriaca, M. Uslenghi, A. Volpicelli
Abstract:
Solar Orbiter, launched on February 9th 2020, is an ESA/NASA mission conceived to study the Sun. The payload is composed of 10 instruments, among which is the Metis coronagraph. A coronagraph takes images of the solar corona: its occulter element simulates a total solar eclipse. This work presents some of the results obtained in the visible light band (580-640 nm) using a flat field panel source. The flat field panel gives uniform illumination; consequently, it was used during the on-ground calibration for several purposes: evaluating the response of each pixel of the detector (linearity) and characterizing the field of view of the coronagraph. A major result is the verification that the requirement on the Metis field of view (FoV) is fulfilled. Investigations are in progress to verify that the performance measured on-ground did not change after launch.
Keywords: Solar Orbiter, Metis, coronagraph, flat field panel, calibration, on-ground, performance
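The per-pixel response calibration that a uniform flat field panel enables can be sketched as a standard flat-field correction, dividing each raw pixel by the normalized flat response. The pixel values and normalization scheme below are illustrative, not Metis flight data:

```python
# Minimal flat-field correction sketch: a uniform source imaged through a
# detector whose right column is 20% less sensitive is restored to a flat level.

def flat_field_correct(raw, flat):
    """Divide each raw pixel by the flat response, normalized to the flat's mean."""
    n = len(flat) * len(flat[0])
    mean_flat = sum(sum(row) for row in flat) / n
    return [[raw[i][j] * mean_flat / flat[i][j]
             for j in range(len(raw[0]))] for i in range(len(raw))]

flat = [[1.0, 0.8], [1.0, 0.8]]      # measured pixel response to uniform light
raw  = [[100.0, 80.0], [100.0, 80.0]]  # science frame with the same non-uniformity
corrected = flat_field_correct(raw, flat)
print(corrected)  # every pixel restored to ~90 (the flat's mean-normalized level)
```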
Procedia PDF Downloads 106
457 Immunization-Data-Quality in Public Health Facilities in the Pastoralist Communities: A Comparative Study Evidence from Afar and Somali Regional States, Ethiopia
Authors: Melaku Tsehay
Abstract:
The Consortium of Christian Relief and Development Associations (CCRDA) and the CORE Group Polio Partners (CGPP) Secretariat have been working with the Global Alliance for Vaccines and Immunization (GAVI) to improve immunization data quality in the Afar and Somali Regional States. The main aim of this study was to compare the quality of immunization data before and after the above interventions in health facilities in the pastoralist communities in Ethiopia. To this end, a comparative cross-sectional study was conducted on 51 health facilities. The baseline data were collected in May 2019 and the end-line data in August 2021. The WHO data quality self-assessment tool (DQS) was used to collect data. A significant improvement was seen in the accuracy of pentavalent vaccine (PT)1 data (p = 0.012) at the health posts (HP), and of PT3 (p = 0.010) and measles (p = 0.020) data at the health centers (HC). Besides, a highly significant improvement was observed in the accuracy of tetanus toxoid (TT)2 data at the HP (p < 0.001). The level of over- or under-reporting was found to be < 8% at the HP and < 10% at the HC for PT3. Data completeness also increased from 72.09% to 88.89% at the HC. Nearly 74% of the health facilities reported their immunization data on time, much better than at baseline (7.1%) (p < 0.001). These findings may provide some hints for policies and programs targeting improvement of immunization data quality in the pastoralist communities.
Keywords: data quality, immunization, verification factor, pastoralist region
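The over- and under-reporting figures rest on a verification factor comparing doses recounted from facility records with doses reported upward. A minimal sketch with illustrative counts (the exact DQS formula and cut-offs are assumptions here):

```python
# Verification-factor sketch for immunization reporting accuracy.
# Counts are invented for illustration.

def verification_factor(recounted, reported):
    """VF > 1 suggests under-reporting; VF < 1 suggests over-reporting."""
    return recounted / reported

# Hypothetical PT3 doses: 92 found in tally sheets, 100 reported upward.
vf_pt3 = verification_factor(recounted=92, reported=100)
print(f"PT3 VF = {vf_pt3:.2f}")               # 0.92
print(f"over-reporting = {1 - vf_pt3:.0%}")   # 8%, within the < 10% HC level above
```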
Procedia PDF Downloads 125
456 CVOIP-FRU: Comprehensive VoIP Forensics Report Utility
Authors: Alejandro Villegas, Cihan Varol
Abstract:
Voice over Internet Protocol (VoIP) products are an emerging technology that can contain forensically important information about criminal activity. Even without user names and passwords, this information can still be gathered by investigators. Although a few VoIP forensic investigative applications are available in the literature, most of them are designed specifically to collect evidence from the Skype product. Therefore, to assist law enforcement in collecting forensically important information from a variety of Betamax VoIP tools, the CVOIP-FRU framework was developed. CVOIP-FRU provides a data gathering solution that retrieves usernames, contact lists, and call and SMS logs from Betamax VoIP products. It is a scripting utility that searches for data within the registry, logs, and user roaming profiles on Windows and Mac OSX operating systems, and then parses the output into readable text and HTML formats. One advantage of CVOIP-FRU over the other applications is that, thanks to its intelligent data filtering capabilities and cross-platform scripting back end, it is expandable to include other VoIP solutions as well. Overall, this paper reveals the exploratory analysis performed to find the key data paths and locations, the development stages of the framework, and the empirical testing and quality assurance of CVOIP-FRU.
Keywords: Betamax, digital forensics, report utility, VoIP, VoIPBuster, VoIPWise
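The artifact sweep such a utility performs can be sketched as a walk over a user profile looking for known artifact file names, followed by parsing. The file names, directory layout, and log format below are hypothetical stand-ins, not the actual Betamax artifact paths CVOIP-FRU targets:

```python
# Hypothetical VoIP-artifact sweep: find candidate files under a profile
# directory, then extract call records from a made-up log format.
import os
import re
import tempfile

ARTIFACT_NAMES = {"contacts.xml", "calls.log", "sms.log"}  # hypothetical names

def find_artifacts(profile_root):
    """Return paths of candidate VoIP artifact files under a profile directory."""
    hits = []
    for dirpath, _dirs, files in os.walk(profile_root):
        hits.extend(os.path.join(dirpath, f) for f in files if f in ARTIFACT_NAMES)
    return hits

def parse_calls(log_text):
    """Pull (number, duration) pairs from hypothetical 'CALL <number> <secs>s' lines."""
    return re.findall(r"CALL\s+(\+?\d+)\s+(\d+)s", log_text)

# Demonstrate on a throwaway profile tree:
with tempfile.TemporaryDirectory() as root:
    logdir = os.path.join(root, "VoIPBuster")
    os.makedirs(logdir)
    with open(os.path.join(logdir, "calls.log"), "w") as fh:
        fh.write("CALL +15550100 62s\nCALL +15550199 8s\n")
    artifacts = find_artifacts(root)
    with open(artifacts[0]) as fh:
        records = parse_calls(fh.read())
    print(records)  # [('+15550100', '62'), ('+15550199', '8')]
```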
Procedia PDF Downloads 298
455 Acoustic Induced Vibration Response Analysis of Honeycomb Panel
Authors: Po-Yuan Tung, Jen-Chueh Kuo, Chia-Ray Chen, Chien-Hsing Li, Kuo-Liang Pan
Abstract:
The main-body structure of a satellite is constructed mainly from lightweight materials, yet it must withstand substantial vibration loads during launch. Because conditions in space can vary in many ways, studying the random vibration response of the satellite structure is extremely important. Based on the reciprocity relationship between sound and structural response, this paper evaluates the dynamic response of the satellite main body under random acoustic load excitation. It lays out the technical process and verifies the feasibility of sonic-borne vibration analysis. A simple plate exposed to a uniform acoustic field is used to obtain the key parameters and to validate the acoustic field model of the reverberation chamber. Both the structure and acoustic field chamber models are then imported into vibro-acoustic coupling analysis software to predict the structural response. During the modeling process, experimental verification is performed to ensure the quality of the numerical models. Finally, the surface vibration level is calculated through the modal participation factors, and the analysis results are presented as PSD spectra.
Keywords: vibration, acoustic, modal, honeycomb panel
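Before a full vibro-acoustic coupling analysis, the random-vibration response of a single panel mode is often bounded with Miles' equation. A sketch of that first-order check; the modal frequency, amplification factor Q, and input PSD below are illustrative values, not the paper's:

```python
# Miles' equation: RMS acceleration of a single-DOF resonant mode
# under a flat random-vibration input PSD.
import math

def miles_grms(f_n, q, psd_input):
    """g_rms = sqrt(pi/2 * f_n * Q * PSD), with f_n in Hz and PSD in g^2/Hz."""
    return math.sqrt(math.pi / 2.0 * f_n * q * psd_input)

# A hypothetical 120 Hz panel mode with Q = 10 under a 0.04 g^2/Hz input:
g_rms = miles_grms(f_n=120.0, q=10.0, psd_input=0.04)
print(f"{g_rms:.1f} g RMS")  # ~8.7 g
```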
Procedia PDF Downloads 556
454 Basic Study of Mammographic Image Magnification System with Eye-Detector and Simple EEG Scanner
Authors: Aika Umemuro, Mitsuru Sato, Mizuki Narita, Saya Hori, Saya Sakurai, Tomomi Nakayama, Ayano Nakazawa, Toshihiro Ogura
Abstract:
Mammography requires the detection of very small calcifications, and physicians search for microcalcifications by magnifying the images as they read them. A mouse is necessary to zoom in on the images, but this can be tiring and distracting when many images are read in a single day. Therefore, an image magnification system combining an eye-detector and a simple electroencephalograph (EEG) scanner was devised, and its operability was evaluated. Two experiments were conducted in this study: the measurement of eye-detection error using the eye-detector, and the measurement of the time required for image magnification using the simple EEG scanner. Eye-detector validation showed that the mean distance of eye-detection error ranged from 0.64 cm to 2.17 cm, with an overall mean of 1.24 ± 0.81 cm across the observers. The results showed that the eye-detection error was small enough for the magnified area of the mammographic image. In the verification of the simple EEG scanner, the average time required for point magnification ranged from 5.85 to 16.73 seconds, with individual differences observed. The reason for this may be that the size of the simple EEG scanner used was not adjustable, so it did not fit some subjects well; a simple EEG scanner with size adjustment would solve this problem. Therefore, the image magnification system using the eye-detector and the simple EEG scanner is useful.
Keywords: EEG scanner, eye-detector, mammography, observers
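The eye-detection error quoted above is a mean Euclidean distance between where the detector says the observer is looking and the true target point. A minimal sketch with made-up coordinates (in cm):

```python
# Mean eye-detection error as mean Euclidean distance over paired points.
import math

def mean_detection_error(detected, targets):
    """Mean Euclidean distance between paired detected and target points."""
    dists = [math.dist(d, t) for d, t in zip(detected, targets)]
    return sum(dists) / len(dists)

# Hypothetical gaze samples vs. true fixation targets (cm on the display):
detected = [(10.3, 5.1), (20.0, 8.4), (15.2, 12.0)]
targets  = [(10.0, 5.0), (19.0, 8.0), (15.2, 11.0)]
print(f"{mean_detection_error(detected, targets):.2f} cm")  # 0.80 cm
```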
Procedia PDF Downloads 215
453 Exploration of Abuse of Position for Sexual Gain by UK Police
Authors: Terri Cole, Fay Sweeting
Abstract:
Abuse of position for sexual gain by police is defined as behavior in which individuals take advantage of their role to pursue a sexual or improper relationship. Previous research has considered whether it involves 'bad apples' (individuals with a poor moral ethos) or 'bad barrels' (broader organizational flaws that may unconsciously allow, minimize, or fail to deal effectively with such behavior). Low-level sexual misconduct (e.g., consensual sex on duty) is more common than more serious offences (e.g., rape), yet the impact of such behavior can be severe not only for those involved but also in undermining public confidence in the police. This ongoing, collaborative research project has identified variables from 514 historic case files from 35 UK police forces in order to identify potential risk indicators that may lead to such behavior. Quantitative analysis using logistic regression and the Cox proportional hazards model has identified specific risk factors that are significant in prediction. Factors relating to the perpetrator's background, such as a history of intimate partner violence, debt, and substance misuse, coupled with in-work behavior such as misusing police systems, increase the risk. The findings provide pragmatic recommendations for those tasked with identifying potential perpetrators of misconduct or investigating suspected ones.
Keywords: abuse of position, forensic psychology, misconduct, sexual abuse
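The logistic-regression scoring described above amounts to a weighted sum of risk indicators passed through a sigmoid. A schematic sketch; the coefficients and indicator set here are invented for illustration and are not the study's fitted model:

```python
# Schematic logistic risk score over binary indicators.
# Intercept and weights are hypothetical.
import math

INTERCEPT = -3.0
COEFFS = {"ipv_history": 1.2, "debt": 0.8, "system_misuse": 1.5}

def risk_probability(indicators):
    """Logistic model: p = 1 / (1 + exp(-(b0 + sum(b_i * x_i))))."""
    z = INTERCEPT + sum(COEFFS[k] * v for k, v in indicators.items())
    return 1.0 / (1.0 + math.exp(-z))

low  = risk_probability({"ipv_history": 0, "debt": 0, "system_misuse": 0})
high = risk_probability({"ipv_history": 1, "debt": 1, "system_misuse": 1})
print(f"low: {low:.3f}, high: {high:.3f}")  # low: 0.047, high: 0.622
```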
Procedia PDF Downloads 196
452 Forensic Study on Personal Identification of Pakistani Population by Individualizing Characteristics of Footprints
Authors: Muneeba Butt
Abstract:
Footprints are among the most important pieces of physical evidence that suspects leave at a crime scene. Analysis of footprints can provide useful information for personal identification and is helpful in crime scene investigation. For the current study, 200 samples (144 male and 56 female) were collected from the Pakistani population with a consent form. The footprints were taken using black ink with an ink pad. All samples were photographed, and a magnifying glass was then used to examine individual characteristics, including details of the toes, humps, phalange marks, and flat-foot cracks in the footprint patterns. The descriptive results for these individualizing characteristics are presented in tabular form with their frequencies and percentages. In the male population, the prevalence of the tibialis type (T-type) is highest; in the female population, the midularis type (M-type) is most prevalent. Humps on the first toe are found more often in the male population than other humps, whereas in the female population humps on the third toe are the most common. In the male population, the prevalence of the phalange mark is highest for toe 1, followed by toes 3, 5, 2, and 4; in the female population it is highest for toe 1, followed by toes 5, 4, 3, and 2. Crease marks are found most frequently in the male population compared to the female population.
Keywords: footprints, toes, humps, cracks
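The frequency-and-percentage tables such a study reports reduce to a simple tally. A sketch with hypothetical pattern-type counts (not the study's data):

```python
# Frequency and percentage table for categorical observations.
from collections import Counter

def frequency_table(observations):
    """Return {category: (count, percentage)} for a list of observations."""
    counts = Counter(observations)
    total = len(observations)
    return {k: (n, round(100.0 * n / total, 1)) for k, n in counts.items()}

# Hypothetical pattern-type calls for 10 male footprints
# (T = tibialis, M = midularis, F = fibularis):
patterns = ["T"] * 6 + ["M"] * 3 + ["F"] * 1
print(frequency_table(patterns))  # {'T': (6, 60.0), 'M': (3, 30.0), 'F': (1, 10.0)}
```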
Procedia PDF Downloads 167
451 Temperature-Related Alterations to Mineral Levels and Crystalline Structure in Porcine Long Bone: Intense Heat vs. Open Flame
Authors: Caighley Logan
Abstract:
Research into fire-related fatalities, along with other work, has found that fires can have a detrimental effect on the mineral and crystalline structures within bone. This study focused on the mineral and crystalline structures within porcine bone samples to analyse the changes caused, with the intent of effectively 'reverse engineering' the data collected from burned bone samples to discover what may have happened. Fourier transform infrared (FT-IR) spectroscopy and X-ray fluorescence (XRF) were used to analyse data from a controlled source of intense heat (a muffle furnace) and from an open fire of similar temperature with a known ignition source, a gasoline lighter, set in a living-room arrangement inside a standard-size shipping container (8.5 ft x 8 ft). This approach compares the changes to the samples and how they differ depending on the heat source. Results show significant differences in the levels of remaining minerals for each type of heat/burning (p < 0.001), particularly phosphorus and calcium; this also includes notable additions of elements absorbed from the surrounding materials, i.e., cerium (Ce), bromine (Br), and neodymium (Nd). The analysis techniques included provide results validated in conjunction with previous studies.
Keywords: forensic anthropology, thermal alterations, porcine bone, FTIR, XRF
Procedia PDF Downloads 85
450 Analysis of Dynamics Underlying the Observation Time Series by Using a Singular Spectrum Approach
Authors: O. Delage, H. Bencherif, T. Portafaix, A. Bourdier
Abstract:
The main purpose of time series analysis is to learn about the dynamics behind time-ordered measurement data. Two approaches are used in the literature to gain a better knowledge of the dynamics contained in observation data sequences. The first concerns time series decomposition, an important analysis step in which patterns and behaviors are extracted as components, providing insight into the mechanisms producing the time series. In many cases, however, time series are short, noisy, and non-stationary. To provide components that are physically meaningful, methods such as Empirical Mode Decomposition (EMD), the Empirical Wavelet Transform (EWT) or, more recently, Empirical Adaptive Wavelet Decomposition (EAWD) have been proposed. The second approach is to reconstruct the dynamics underlying the time series as a trajectory in state space by mapping the time series into a set of Rᵐ lag vectors using the method of delays (MOD). Takens proved that the trajectory obtained with the MOD technique is equivalent to the trajectory representing the dynamics behind the original time series. This work introduces singular spectrum decomposition (SSD), a new adaptive method for decomposing non-linear and non-stationary time series into narrow-band components. The method originates from singular spectrum analysis (SSA), a nonparametric spectral estimation method used for the analysis and prediction of time series. As the first step of SSD is to constitute a trajectory matrix by embedding a one-dimensional time series into a set of lagged vectors, SSD can also be seen as a reconstruction method like MOD. We first give a brief overview of the existing decomposition methods (EMD, EWT, EAWD). The SSD method is then described in detail and applied to experimental time series of observations resulting from total column ozone measurements.
The results obtained will be compared with those provided by the previously mentioned decomposition methods. We will also compare the reconstruction qualities of the observed dynamics obtained from the SSD and MOD methods.
Keywords: time series analysis, adaptive time series decomposition, wavelet, phase space reconstruction, singular spectrum analysis
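Both SSD and the method of delays begin by embedding the one-dimensional series into a trajectory matrix of lagged vectors. A minimal sketch of that embedding step (SSD would go on to decompose this matrix; the window length here is arbitrary):

```python
# Method-of-delays / SSA-style embedding: map a 1-D series into rows of
# `window` consecutive samples (lag 1), giving the trajectory matrix.

def trajectory_matrix(series, window):
    """Embed a 1-D series into overlapping lagged vectors of length `window`."""
    n = len(series) - window + 1
    if n < 1:
        raise ValueError("window longer than series")
    return [series[i:i + window] for i in range(n)]

x = [1, 2, 3, 4, 5, 6]
X = trajectory_matrix(x, window=3)
print(X)  # [[1, 2, 3], [2, 3, 4], [3, 4, 5], [4, 5, 6]]
```

Note the Hankel structure (constant anti-diagonals), which SSA exploits when averaging reconstructed components back into one-dimensional series.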
Procedia PDF Downloads 106
449 Image Features Comparison-Based Position Estimation Method Using a Camera Sensor
Authors: Jinseon Song, Yongwan Park
Abstract:
In this paper, we propose a method that estimates a user's position from a single camera using a pre-built image database. Previous positioning approaches calculate distance from signal arrival times, as in GPS (Global Positioning System) or RF (Radio Frequency) systems. However, these methods have a weakness: signal interference gives them a large error range. Our solution estimates position with a camera sensor instead. Yet a single camera makes it difficult to obtain relative position data, and a stereo camera struggles to provide real-time position data because of the volume of image data. First, we build an image database of the space in which the positioning service is to be provided, using a single camera. Next, we judge similarity by matching the database images against the image transmitted by the user. Finally, we take the user's position to be that of the most similar database image. To verify the proposed method, we experimented in real indoor and outdoor environments. The proposed method has a wide positioning range and can recover not only the user's position but also their direction.
Keywords: positioning, distance, camera, features, SURF (Speeded-Up Robust Features), database, estimation
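The core lookup, finding the database image most similar to the query and returning its stored position, can be sketched with toy descriptor vectors standing in for SURF features (the descriptors, positions, and similarity metric below are all illustrative):

```python
# Nearest-descriptor position lookup: return the position and direction
# stored with the database entry closest to the query descriptor.
import math

def estimate_position(query_desc, database):
    """Pick the database entry with minimum Euclidean distance to the query."""
    best = min(database, key=lambda e: math.dist(query_desc, e["descriptor"]))
    return best["position"], best["direction"]

# Hypothetical database: one descriptor per reference image, with known pose.
database = [
    {"descriptor": [0.9, 0.1, 0.4], "position": (2.0, 5.0), "direction": "north"},
    {"descriptor": [0.2, 0.8, 0.6], "position": (7.0, 1.0), "direction": "east"},
]
pos, heading = estimate_position([0.85, 0.15, 0.42], database)
print(pos, heading)  # (2.0, 5.0) north
```

In practice the comparison would run over many SURF keypoint descriptors per image with a ratio test, but the position decision reduces to this nearest-match step.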
Procedia PDF Downloads 350