Search results for: minimum data set
21740 Bringing Together Student Collaboration and Research Opportunities to Promote Scientific Understanding and Outreach Through a Seismological Community
Authors: Michael Ray Brunt
Abstract:
China has been the site of some of the most significant earthquakes in history; however, earthquake monitoring has long been the province of universities and research institutions. The China Digital Seismographic Network (CDSN) was initiated in 1983 and improved significantly during 1992-1993. Data from the CDSN is widely used by government and research institutions, and, generally, this data is not readily accessible to middle and high school students. An educational seismic network in China is needed to provide collaboration and research opportunities for students, engage students around the country in scientific understanding of earthquake hazards and risks, and promote community awareness. In 2022, the Tsinghua International School (THIS) Seismology Team, made up of enthusiastic students and facilitated by two experienced teachers, was established. As a group, the team’s objective is to install seismographs in schools throughout China, thus creating an educational seismic network that shares data from the THIS Educational Seismic Network (THIS-ESN) and facilitates collaboration. The THIS-ESN initiative will enhance education and outreach in China about earthquake risks and hazards, introduce seismology to a wider audience, stimulate interest in research among students, and develop students’ programming, data collection and analysis skills. It will also encourage and inspire young minds to pursue science, technology, engineering, the arts, and math (STEAM) career fields. The THIS-ESN utilizes small, low-cost RaspberryShake seismographs as a powerful tool linked into a global network, giving schools and the public access to real-time seismic data from across China, increasing earthquake monitoring capabilities in the respective areas and adding to the available data sets regionally and worldwide, helping create a denser seismic network. The RaspberryShake seismograph is compatible with free seismic data viewing platforms such as SWARM, and the RaspberryShake web programs and mobile apps are designed specifically for teaching seismology and seismic data interpretation, providing opportunities to enhance understanding. The RaspberryShake is powered by an operating system embedded in the Raspberry Pi, which makes it an easy platform for teaching students basic computer communication concepts by utilizing processing tools to investigate, plot, and manipulate data. The THIS Seismology Team believes strongly in creating opportunities for committed students to become part of the seismological community by engaging in analysis of real-time scientific data with tangible outcomes. Students will feel proud of the important work they are doing to understand the world around them and become advocates spreading their knowledge back into their homes and communities, helping to improve overall community resilience. We trust that, in studying the results seismograph stations yield, students will not only grasp how subjects like physics and computer science apply in real life but also, by spreading information, help students across the country appreciate how and why earthquakes bear on their lives, develop practical skills in STEAM, and engage in the global seismic monitoring effort. By providing such an opportunity to schools across the country, we are confident that we will be an agent of change for society.
Keywords: collaboration, outreach, education, seismology, earthquakes, public awareness, research opportunities
Procedia PDF Downloads 72
21739 Modelling and Control of Milk Fermentation Process in Biochemical Reactor
Authors: Jožef Ritonja
Abstract:
The biochemical industry is one of the most important modern industries. Biochemical reactors are crucial devices of the biochemical industry. The essential bioprocess carried out in bioreactors is the fermentation process. A thorough insight into the fermentation process and knowledge of how to control it are essential for effective use of bioreactors to produce high-quality products in sufficient quantities. The development of the control system starts with the determination of a mathematical model that describes the steady-state and dynamic properties of the controlled plant satisfactorily and is suitable for the development of the control system. The paper analyses the fermentation process in bioreactors thoroughly, using existing mathematical models. Most existing mathematical models do not allow the design of a control system for controlling the fermentation process in batch bioreactors. Due to this, a mathematical model was developed and presented that allows the development of a control system for batch bioreactors. Based on the developed mathematical model, a control system was designed to ensure optimal response of the biochemical quantities in the fermentation process. Due to the time-varying and non-linear nature of the controlled plant, a conventional control system with a proportional-integral-derivative (PID) controller with constant parameters does not provide the desired transient response. An improved adaptive control system was proposed to improve the dynamics of the fermentation. The use of adaptive control is suggested because the parameter variations of the fermentation process are very slow. The developed control system was tested in the production of dairy products in the laboratory bioreactor. The carbon dioxide concentration was chosen as the controlled variable, as it correlates well with the other quantities significant for the quality of the fermentation process. The level of the carbon dioxide concentration gives important information about the fermentation process. The obtained results showed that the designed control system provides a minimal error between the reference and actual values of carbon dioxide concentration during the transient response and in steady state. The recommended control system makes reference signal tracking much more efficient than the currently used conventional control systems, which are based on linear control theory. The proposed control system represents a very effective solution for the improvement of the milk fermentation process.
Keywords: biochemical reactor, fermentation process, modelling, adaptive control
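As a minimal point of reference for the control structure discussed above, the sketch below simulates a fixed-gain discrete PID loop on a toy first-order plant in Python. The plant model, gains and setpoint are invented for illustration; this is the conventional constant-parameter baseline the paper improves upon, not the authors' bioreactor model or adaptive scheme.

```python
# Minimal sketch: fixed-gain discrete PID loop on a toy first-order plant.
# The plant model, gains and setpoint are illustrative assumptions only,
# not the authors' bioreactor model or adaptive controller.

def simulate_pid(kp=2.0, ki=0.5, kd=0.1, dt=0.1, steps=300, setpoint=1.0):
    y = 0.0                      # controlled variable (arbitrary units)
    integral = 0.0
    prev_error = setpoint - y
    history = []
    for _ in range(steps):
        error = setpoint - y
        integral += error * dt
        derivative = (error - prev_error) / dt
        u = kp * error + ki * integral + kd * derivative   # control action
        prev_error = error
        # Toy first-order plant: dy/dt = (-y + u) / tau
        tau = 2.0
        y += dt * (-y + u) / tau
        history.append(y)
    return history

if __name__ == "__main__":
    response = simulate_pid()
    print(f"final value: {response[-1]:.3f} (setpoint 1.0)")
```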
Procedia PDF Downloads 130
21738 Design of a Real Time Closed Loop Simulation Test Bed on a General Purpose Operating System: Practical Approaches
Authors: Pratibha Srivastava, Chithra V. J., Sudhakar S., Nitin K. D.
Abstract:
A closed-loop system comprises a controller, a response system, and an actuating system. The controller, which is the system under test for us, excites the actuators based on feedback from the sensors in a periodic manner. The sensors should provide the feedback to the System Under Test (SUT) within a deterministic time after excitation of the actuators. Any delay or miss in the generation of the response or the acquisition of excitation pulses may lead to controller computation errors in the control loop, which can be catastrophic in certain cases. Such systems are categorised as hard real-time systems and need special strategies. The real-time operating systems available in the market may be the best solutions for such kinds of simulations, but they pose limitations regarding the availability of the X Window System, graphical interfaces, and other user tools. In this paper, we present strategies that can be used on a general purpose operating system (bare Linux kernel) to achieve deterministic deadlines and hence gain the added advantages of a GPOS with real-time features. Techniques are discussed for making the time-critical application run with the highest priority in an uninterrupted manner, reducing network latency for distributed architectures, and handling real-time data acquisition, data storage and retrieval, user interactions, etc.
Keywords: real time data acquisition, real time kernel preemption, scheduling, network latency
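One of the techniques mentioned above, running the time-critical task uninterrupted at the highest priority on a dedicated core of a stock Linux kernel, is sketched below in Python. The chosen core, priority value and loop period are illustrative assumptions, and the scheduling calls require root privileges (or CAP_SYS_NICE) to take effect.

```python
# Sketch: run a time-critical loop under SCHED_FIFO on a dedicated CPU core
# of a general-purpose Linux kernel. Core number, priority and loop period
# are illustrative assumptions; requires root (or CAP_SYS_NICE) to succeed.
import os
import time

def enter_realtime(core=2, priority=None):
    os.sched_setaffinity(0, {core})                 # pin this process to one core
    if priority is None:
        priority = os.sched_get_priority_max(os.SCHED_FIFO)
    try:
        os.sched_setscheduler(0, os.SCHED_FIFO, os.sched_param(priority))
    except PermissionError:
        print("warning: need root/CAP_SYS_NICE for SCHED_FIFO; staying SCHED_OTHER")

def control_loop(period_s=0.001, iterations=1000):
    next_deadline = time.monotonic()
    worst_overrun = 0.0
    for _ in range(iterations):
        next_deadline += period_s
        # ... acquire sensor feedback and compute actuator excitation here ...
        overrun = time.monotonic() - next_deadline
        worst_overrun = max(worst_overrun, overrun)
        if overrun < 0:
            time.sleep(-overrun)                    # wait for the next period
    print(f"worst deadline overrun: {worst_overrun * 1e6:.1f} us")

if __name__ == "__main__":
    enter_realtime()
    control_loop()
```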
Procedia PDF Downloads 147
21737 Electric Load Forecasting Based on Artificial Neural Network for Iraqi Power System
Authors: Afaneen Anwer, Samara M. Kamil
Abstract:
Load forecasting requires good prediction accuracy as a basis for optimal operation and maintenance. Good accuracy is the basis of economic dispatch, unit commitment, and system reliability. A good load forecasting system offers fast computation, automatic bad-data detection, and the ability to access the system automatically to obtain the needed data. In this paper, the formulation of the load forecasting problem is discussed, and the solution is obtained by using the artificial neural network method. The MATLAB environment has been used to solve the load forecasting schedule of the Iraqi super grid network, considering the daily load over three years. The obtained results showed good accuracy in predicting the forecasted load.
Keywords: load forecasting, neural network, back-propagation algorithm, Iraqi power system
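For orientation, a compact sketch of neural-network load forecasting is given below; synthetic daily-load data and scikit-learn's MLPRegressor stand in for the authors' MATLAB back-propagation network and the Iraqi grid measurements, which are not reproduced here.

```python
# Sketch: daily load forecasting with a small feed-forward network.
# Synthetic load data and scikit-learn's MLPRegressor are stand-ins for the
# authors' MATLAB back-propagation network and the Iraqi grid measurements.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
days = np.arange(3 * 365)
load = (1000 + 200 * np.sin(2 * np.pi * days / 365)   # seasonal component
        + 50 * np.sin(2 * np.pi * days / 7)           # weekly component
        + rng.normal(0, 20, days.size))               # measurement noise

# Features: previous day's load plus day-of-week / day-of-year encodings.
X = np.column_stack([load[:-1],
                     np.sin(2 * np.pi * days[1:] / 7), np.cos(2 * np.pi * days[1:] / 7),
                     np.sin(2 * np.pi * days[1:] / 365), np.cos(2 * np.pi * days[1:] / 365)])
y = load[1:]

split = int(0.8 * len(y))                             # keep chronological order
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(20,), max_iter=3000, random_state=0))
model.fit(X[:split], y[:split])
mape = np.mean(np.abs((model.predict(X[split:]) - y[split:]) / y[split:])) * 100
print(f"MAPE on held-out days: {mape:.2f}%")
```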
Procedia PDF Downloads 583
21736 Interculturalizing Ethiopian Universities: Between Initiation and Institutionalization
Authors: Desta Kebede Ayana, Lies Sercu, Demelash Mengistu
Abstract:
The study is set in Ethiopia, a sub-Saharan multilingual, multiethnic African country, which has seen a significant increase in the number of universities in recent years. The aim of this growth is to provide access to education for all cultural and linguistic groups across the country. However, there are challenges in promoting intercultural competence among students in this diverse context. The aim of the study is to investigate the interculturalization of Ethiopian Higher Education Institutions as perceived by university lecturers and administrators. In particular, the study aims to determine the level of support for this educational innovation and gather suggestions for its implementation and institutionalization. The researchers employed semi-structured interviews with administrators and lecturers from two large Ethiopian universities to gather data, which allowed for an in-depth exploration of the participants' views on interculturalization in higher education. Thematic analysis was utilized for coding and analyzing the interview data, with the assistance of the NVIVO software, enabling the identification and organization of recurring themes and patterns. The findings obtained from the grounded analysis of the interview data reveal that while there are opportunities for interculturalization in the curriculum and campus life, support for this educational innovation remains low. Administrators and lecturers also emphasize the government's responsibility to prioritize interculturalization over other educational innovation goals. The study contributes to the existing literature by examining an under-researched population in an under-researched context. Additionally, the study explores whether Western perspectives of intercultural competence align with the African context, adding to the theoretical understanding of intercultural education. The study addresses the extent to which administrators and lecturers support the interculturalization of Ethiopian Higher Education Institutions. It also explores their suggestions for implementing and institutionalizing intercultural education, as well as their perspectives on the current level of institutionalization. The study highlights the challenges in interculturalizing Ethiopian universities and emphasizes the need for greater support and prioritization of intercultural education. It also underscores the importance of considering the African context when conceptualizing intercultural competence. This research contributes to the understanding of intercultural education in diverse contexts and provides valuable insights for policymakers and educational institutions aiming to promote intercultural competence in higher education settings.
Keywords: administrators, educational change, Ethiopia, intercultural competence, lecturers
Procedia PDF Downloads 98
21735 Weighted-Distance Sliding Windows and Cooccurrence Graphs for Supporting Entity-Relationship Discovery in Unstructured Text
Authors: Paolo Fantozzi, Luigi Laura, Umberto Nanni
Abstract:
The problem of entity-relation discovery from unstructured data, a well-covered topic in the literature, consists in searching within unstructured sources (typically, text) in order to find connections among entities. These can be a whole dictionary, or a specific collection of named items. In many cases machine learning and/or text mining techniques are used for this goal. These approaches might be unfeasible in computationally challenging problems, such as processing massive data streams. A faster approach consists in collecting the cooccurrences of any two words (entities) in order to create a graph of relations - a cooccurrence graph. Indeed, each cooccurrence highlights some degree of semantic correlation between the words, because it is more common to have related words close to each other than on opposite sides of the text. Some authors have used sliding windows for this problem: they count all the cooccurrences within a sliding window running over the whole text. In this paper we generalise this technique, arriving at a Weighted-Distance Sliding Window, where each cooccurrence of two named items within the window is accounted with a weight depending on the distance between the items: a closer distance implies stronger evidence of a relationship. We develop an experiment in order to support this intuition, by applying this technique to a data set consisting of the text of the Bible, split into verses.
Keywords: cooccurrence graph, entity relation graph, unstructured text, weighted distance
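A minimal sketch of the weighted-distance sliding window is given below in Python. The linear decay weight (window_size - distance) / window_size is an assumption made for illustration, since the abstract only requires that closer pairs receive larger weights.

```python
# Sketch of the weighted-distance sliding window: every pair of items that
# falls inside the window adds a weight that decays with their distance.
# The linear decay (window_size - distance) / window_size is an assumed
# choice; the abstract only requires closer pairs to receive larger weights.
from collections import defaultdict

def weighted_cooccurrence_graph(tokens, window_size=5):
    graph = defaultdict(float)                      # edge (a, b) -> accumulated weight
    for i, a in enumerate(tokens):
        for j in range(i + 1, min(i + window_size, len(tokens))):
            b = tokens[j]
            if a == b:
                continue
            distance = j - i
            weight = (window_size - distance) / window_size
            graph[tuple(sorted((a, b)))] += weight
    return graph

if __name__ == "__main__":
    verse = "in the beginning god created the heaven and the earth".split()
    edges = weighted_cooccurrence_graph(verse)
    for (a, b), w in sorted(edges.items(), key=lambda kv: -kv[1])[:5]:
        print(f"{a} -- {b}: {w:.2f}")
```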
Procedia PDF Downloads 154
21734 Structural Development and Multiscale Design Optimization of Additively Manufactured Unmanned Aerial Vehicle with Blended Wing Body Configuration
Authors: Malcolm Dinovitzer, Calvin Miller, Adam Hacker, Gabriel Wong, Zach Annen, Padmassun Rajakareyar, Jordan Mulvihill, Mostafa S.A. ElSayed
Abstract:
The research work presented in this paper is developed by the Blended Wing Body (BWB) Unmanned Aerial Vehicle (UAV) team, a fourth-year capstone project at the Carleton University Department of Mechanical and Aerospace Engineering. Here, a clean-sheet UAV with BWB configuration is designed and optimized using a Multiscale Design Optimization (MSDO) approach employing lattice materials, taking into consideration design-for-additive-manufacturing constraints. The BWB-UAV is being developed with a mission profile designed for surveillance purposes with a minimum payload of 1000 grams. To demonstrate the design methodology, a single design loop of a sample rib from the airframe is shown in detail. This includes presentation of the conceptual design, materials selection, experimental characterization and residual thermal stress distribution analysis of additively manufactured materials, manufacturing constraint identification, critical load computations, stress analysis and design optimization. A dynamic turbulent critical load case was identified, composed of a 1-g static maneuver with an incremental Power Spectral Density (PSD) gust, which was used as a deterministic design load case for the design optimization. A 2D flat-plate Doublet Lattice Method (DLM) was used to simulate aerodynamics in the aeroelastic analysis. The aerodynamic results were verified against a 3D CFD analysis applying Spalart-Allmaras and SST k-omega turbulence models to the rigid UAV, and against a vortex lattice method applied in the OpenVSP environment. Design optimization of a single rib was conducted using topology optimization as well as MSDO. Compared to a solid rib, weight savings of 36.44% and 59.65% were obtained for the topology optimization and the MSDO, respectively. These results suggest that MSDO is an acceptable alternative to topology optimization in weight-critical applications while preserving the functional requirements.
Keywords: blended wing body, multiscale design optimization, additive manufacturing, unmanned aerial vehicle
Procedia PDF Downloads 376
21733 Analyzing of the Urban Landscape Configurations and Expansion of Dire Dawa City, Ethiopia Using Satellite Data and Landscape Metrics Approaches
Authors: Berhanu Keno Terfa
Abstract:
To realize the consequences of urbanization, an accurate and up-to-date representation of urban landscape patterns is critical for urban planners and policymakers. Thus, the study quantitatively characterized the spatiotemporal composition and configuration of the urban landscape and the urban expansion process in Dire Dawa City, Ethiopia, from 2006 to 2018. An integrated approach combining satellite data from different sensors, SPOT (2006) and Sentinel-2 (2018), with landscape metrics analysis was employed to explore the pattern, process, and overall growth status of the city. The result showed that the built-up area had increased by 62% between 2006 and 2018, at an average annual increment of 3.6%, while the other land covers declined significantly due to urban expansion. The highest urban expansion occurred in the northwest direction, whereas the most fragmented landscape pattern was recorded in the west direction. Overall, the analysis showed that Dire Dawa City experienced accelerated urban expansion with a fragmented and complicated spatiotemporal urban landscape pattern, suggesting a strong tendency towards sprawl over the past 12 years. The findings of the study could help planners and policy developers gain insight into the historical dynamics of the urban region for sustainable development.
Keywords: zonal metrics, multi-temporal, multi-resolution, urban growth, remote sensing data
Procedia PDF Downloads 201
21732 A Dynamic Solution Approach for Heart Disease Prediction
Authors: Walid Moudani
Abstract:
The healthcare environment is generally perceived as being information rich yet knowledge poor, and there is a lack of effective analysis tools to discover hidden relationships and trends in the data. In fact, valuable knowledge can be discovered from the application of data mining techniques in the healthcare system. In this study, a proficient methodology is presented for the extraction of significant patterns from coronary heart disease data warehouses for the prediction of heart attack, which unfortunately continues to be a leading cause of mortality worldwide. For this purpose, we propose to enumerate dynamically the optimal subsets of the reduced features of high interest by using the rough sets technique combined with dynamic programming. We then propose to validate the classification using a Random Forest (RF) decision tree ensemble to identify risky heart disease cases. This work is based on a large amount of data collected from several clinical institutions, based on the medical profiles of patients. Moreover, experts’ knowledge in this field has been taken into consideration in order to define the disease, its risk factors, and to establish significant knowledge relationships among the medical factors. A computer-aided system is developed for this purpose based on a population of 525 adults. The performance of the proposed model is analyzed and evaluated based on a set of benchmark techniques applied to this classification problem.
Keywords: multi-classifier decisions tree, features reduction, dynamic programming, rough sets
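To illustrate the final classification step described above, the sketch below trains a Random Forest on a reduced feature subset. The synthetic data and the hard-coded "reduced" feature indices are stand-ins for the clinical records and the rough-set/dynamic-programming reduction used in the study.

```python
# Sketch: validating a reduced feature subset with a Random Forest, as in the
# final classification step described above. Synthetic data and pre-chosen
# "reduced" feature indices stand in for the authors' clinical records and
# rough-set/dynamic-programming feature reduction.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=525, n_features=20, n_informative=6,
                           weights=[0.7, 0.3], random_state=42)
reduced_features = [0, 1, 2, 3, 4, 5]          # placeholder for the rough-set reduct

rf = RandomForestClassifier(n_estimators=200, random_state=42)
scores_full = cross_val_score(rf, X, y, cv=5, scoring="f1")
scores_reduced = cross_val_score(rf, X[:, reduced_features], y, cv=5, scoring="f1")
print(f"F1 all features:     {scores_full.mean():.3f}")
print(f"F1 reduced features: {scores_reduced.mean():.3f}")
```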
Procedia PDF Downloads 410
21731 Identification of Hepatocellular Carcinoma Using Supervised Learning Algorithms
Authors: Sagri Sharma
Abstract:
Analysis of diseases integrating multiple factors increases the complexity of the problem; therefore, the development of frameworks for the analysis of diseases is currently a topic of intense research. Due to the inter-dependence of the various parameters, the use of traditional methodologies has not been very effective. Consequently, newer methodologies are being sought to deal with the problem. Supervised learning algorithms are commonly used for performing prediction on previously unseen data. These algorithms are used for applications in fields ranging from image analysis to protein structure and function prediction, and they are trained using a known dataset to come up with a predictor model that generates reasonable predictions for the response to new data. Gene expression profiles generated by DNA analysis experiments can be quite complex since these experiments can involve hypotheses spanning entire genomes. The well-known machine learning algorithm Support Vector Machine is thus applied to analyze the expression levels of thousands of genes simultaneously in a timely, automated and cost-effective way. The objectives of the presented work are the development of a methodology to identify genes relevant to Hepatocellular Carcinoma (HCC) from a gene expression dataset utilizing supervised learning algorithms and statistical evaluations, along with the development of a predictive framework that can perform classification tasks on new, unseen data.
Keywords: artificial intelligence, biomarker, gene expression datasets, hepatocellular carcinoma, machine learning, supervised learning algorithms, support vector machine
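A compact sketch of the kind of pipeline described above, univariate gene filtering followed by an SVM classifier with cross-validation, is given below. The synthetic expression matrix and the ANOVA-based SelectKBest filter are illustrative stand-ins for the HCC dataset and the statistical evaluations used in the work.

```python
# Sketch: gene selection plus SVM classification on expression-like data.
# The synthetic matrix and ANOVA-based filter are stand-ins for the HCC
# gene expression dataset and statistical evaluations described above.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# ~100 samples, 2000 "genes", few of them informative (typical microarray shape)
X, y = make_classification(n_samples=100, n_features=2000, n_informative=20,
                           random_state=0)
clf = make_pipeline(StandardScaler(),
                    SelectKBest(f_classif, k=50),   # keep the 50 most relevant genes
                    SVC(kernel="linear", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```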
Procedia PDF Downloads 429
21730 Impact of Mixing Parameters on Homogenization of Borax Solution and Nucleation Rate in Dual Radial Impeller Crystallizer
Authors: A. Kaćunić, M. Ćosić, N. Kuzmanić
Abstract:
Interaction between mixing and crystallization is often ignored despite the fact that it affects almost every aspect of the operation, including nucleation, growth, and maintenance of the crystal slurry. This is especially pronounced in multiple impeller systems, where flow complexity is increased. By choosing proper mixing parameters, which closely depends on knowledge of the hydrodynamics in the mixing vessel, the process of batch cooling crystallization may be considerably improved. The values that render useful information when making this choice are mixing time and power consumption. The predominant motivation for this work was to investigate the extent to which a radial dual impeller configuration influences mixing time, power consumption and, consequently, the values of metastable zone width and nucleation rate. In this research, crystallization of borax was conducted in a 15 dm3 baffled batch cooling crystallizer with an aspect ratio (H/T) of 1.3. Mixing was performed using two straight blade turbines (4-SBT) mounted on the same shaft that generated radial fluid flow. Experiments were conducted at different values of the N/NJS ratio (impeller speed/minimum impeller speed for complete suspension), D/T ratio (impeller diameter/crystallizer diameter), c/D ratio (lower impeller off-bottom clearance/impeller diameter), and s/D ratio (spacing between impellers/impeller diameter). The mother liquor was saturated at 30°C and was cooled at the rate of 6°C/h. Its concentration was monitored in line by a Na-ion selective electrode. From the values of supersaturation monitored continuously over the process time, it was possible to determine the metastable zone width and subsequently the nucleation rate using Mersmann’s nucleation criterion. For all applied dual impeller configurations, the mixing time was determined by a potentiometric method using a pulse technique, while the power consumption was determined using a torque meter produced by Himmelstein & Co. Results obtained in this investigation show that the dual impeller configuration significantly influences the values of mixing time, power consumption as well as the metastable zone width and nucleation rate. Special attention should be paid to the impeller spacing, considering the flow interaction that can be more or less pronounced depending on the spacing value.
Keywords: dual impeller crystallizer, mixing time, power consumption, metastable zone width, nucleation rate
Procedia PDF Downloads 296
21729 The Unspoken Learning Landscape of Indigenous Peoples (IP) Learners: A Process Documentation and Analysis
Authors: Ailene B. Anonuevo
Abstract:
The aim of the study was to evaluate the quality of life presently available to the IP students in selected schools in the Division of Panabo City. It further explores their future dreams and current status in classes and examines some implications relative to their studies. The study adopted a mixed methodology and used a survey research design as the operational framework for data gathering. Data were collected by self-administered questionnaires and interviews with sixty students from three schools in Panabo City. In addition, the study describes the learners’ background and school climate as variables that might influence their performance in school. The study revealed that IP students need extra attention due to their unfavorable learning environment. The study also found that, like any other students, IP learners yearn for a brighter future with the support of the government.
Keywords: IP learners, learning landscape, school climate, quality of life
Procedia PDF Downloads 224
21728 The Interactive Effects among Supervisor Support, Academic Emotion, and Positive Mental Health: An Evidence Based on Longitudinal Cross-Lagged Panel Data Analysis on Postgraduates in China
Authors: Jianzhou Ni, Hua Fan
Abstract:
It has been determined that supervisor support has a major influence on postgraduate students' academic emotions and is considered a way of successfully anticipating postgraduates' positive psychological well-being. As a result, by assessing the mediating influence of academic emotions for contemporary postgraduates in China, this study investigated the tight reciprocal relationship between psychological empowerment and positive mental well-being among postgraduates. To that end, a theoretical analysis of role clarity, academic emotion, and positive psychological health was developed, and its validity and reliability were demonstrated for the first time using the normalized postgraduate relationship with supervisor scale, the academic emotion scale, and the positive mental health scale, as well as questionnaire data from Chinese postgraduate students. This study used an autoregressive cross-lagged (ARCL) panel model to longitudinally analyze 798 valid responses from two survey waves taken in 2019 (T1) and 2021 (T2), in order to investigate the link between supervisor support and positive graduate student mental well-being in a bidirectional relationship of influence. The study discovered that mentor assistance could have a considerable beneficial impact on graduate students' academic emotions and, as a result, indirectly help learners attain positive mental health development. This verifies the theoretical premise that academic emotions partially mediate the effect of mentor support on positive mental health development and argues for the coexistence of the two. The outcomes of this study can help researchers gain a better understanding of the dynamic interplay among three different research variables: supervisor support, academic emotions, and positive mental health, as well as fill gaps in previous research. In this regard, the study indicated that mentor assistance directly stimulates students' academic drive and assists graduate students in developing good academic emotions, which contributes to the development of positive mental health. However, given the restricted measurement window of this study's cross-lagged panel data and the potential influence of moderating factors other than academic emotion on graduate students' mental health, the results of this study need to be interpreted with caution and further validated.
Keywords: supervisor support, academic emotions, positive mental health, interaction effects, longitudinal cross-lagged measurements
Procedia PDF Downloads 87
21727 Photoplethysmography-Based Device Designing for Cardiovascular System Diagnostics
Authors: S. Botman, D. Borchevkin, V. Petrov, E. Bogdanov, M. Patrushev, N. Shusharina
Abstract:
In this paper, we report the development of a device for diagnostics of the cardiovascular system state and an associated automated workstation for large-scale medical measurement data collection and analysis. It was shown that the optimal design for the monitoring device is a wristband, as it represents an engineering trade-off between accuracy and usability. The monitoring device is based on an infrared reflective photoplethysmographic sensor, which allows collecting multiple physiological parameters, such as heart rate and pulse wave characteristics. The developed device uses a BLE interface for medical and supplementary data transmission to the coupled mobile phone, which processes the data and sends them to the doctor's automated workstation. Results of the evaluation of this experimental model confirmed the applicability of the proposed approach.
Keywords: cardiovascular diseases, health monitoring systems, photoplethysmography, pulse wave, remote diagnostics
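As a small illustration of the signal processing behind such a device, the sketch below estimates heart rate from a photoplethysmographic waveform by peak detection. The synthetic waveform, sampling rate and peak-detection thresholds are assumptions for illustration, not the device's actual firmware.

```python
# Sketch: estimating heart rate from a PPG waveform by peak detection.
# The synthetic waveform and threshold settings are illustrative assumptions;
# the real device streams raw sensor samples over BLE to the phone instead.
import numpy as np
from scipy.signal import find_peaks

fs = 100                                   # sampling rate, Hz (assumed)
t = np.arange(0, 30, 1 / fs)               # 30 s of signal
heart_rate_true = 72 / 60                  # 72 bpm expressed in Hz
ppg = np.sin(2 * np.pi * heart_rate_true * t) ** 3 + 0.05 * np.random.randn(t.size)

# One pulse per cardiac cycle: enforce a refractory distance of ~0.4 s
peaks, _ = find_peaks(ppg, distance=int(0.4 * fs), prominence=0.5)
intervals = np.diff(peaks) / fs            # inter-beat intervals in seconds
bpm = 60.0 / intervals.mean()
print(f"estimated heart rate: {bpm:.1f} bpm")
```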
Procedia PDF Downloads 492
21726 Finding Data Envelopment Analysis Target Using the Multiple Objective Linear Programming Structure in Full Fuzzy Case
Authors: Raziyeh Shamsi
Abstract:
In this paper, we present a multiple objective linear programming (MOLP) problem in the full fuzzy case and find Data Envelopment Analysis (DEA) targets. In the presented model, we seek the least inputs and the most outputs in the production possibility set (PPS) under the variable returns to scale (VRS) assumption, so that the efficiency projection is obtained for all decision-making units (DMUs). Then, we provide an algorithm for finding DEA targets interactively in the full fuzzy case, which solves the full fuzzy problem without defuzzification. Owing to the use of interactive methods, the targets obtained by our algorithm are more applicable, more realistic, and more in accordance with the wishes of the decision maker. Finally, an application of the algorithm to 21 educational institutions is provided.
Keywords: DEA, MOLP, full fuzzy, target
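For context, the sketch below solves the classical crisp input-oriented VRS (BCC) envelopment model per DMU with scipy and reads off the projected input/output targets. The full fuzzy interactive MOLP formulation that is the paper's contribution is not reproduced here, and the small data set is invented.

```python
# Sketch: crisp input-oriented VRS (BCC) envelopment model solved per DMU with
# scipy's linprog, returning the efficiency score and the projected targets.
# This is only the classical baseline; the paper's full fuzzy interactive MOLP
# formulation is not reproduced here. The data below are invented.
import numpy as np
from scipy.optimize import linprog

X = np.array([[4.0, 7.0, 8.0, 4.0, 2.0],      # inputs  (m x n DMUs)
              [3.0, 3.0, 1.0, 2.0, 4.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0, 1.0]])     # outputs (s x n DMUs)

def vrs_targets(o):
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                         # minimise theta
    A_ub = np.vstack([np.c_[-X[:, [o]], X],             # X @ lam <= theta * x_o
                      np.c_[np.zeros((s, 1)), -Y]])     # Y @ lam >= y_o
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1)        # sum(lam) = 1  (VRS)
    res = linprog(c, A_ub, b_ub, A_eq, [1.0],
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    theta, lam = res.x[0], res.x[1:]
    return theta, X @ lam, Y @ lam                      # efficiency, input/output targets

for o in range(X.shape[1]):
    theta, x_t, y_t = vrs_targets(o)
    print(f"DMU {o}: efficiency {theta:.3f}, input target {np.round(x_t, 2)}")
```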
Procedia PDF Downloads 302
21725 Data-Driven Performance Evaluation of Surgical Doctors Based on Fuzzy Analytic Hierarchy Processes
Authors: Yuguang Gao, Qiang Yang, Yanpeng Zhang, Mingtao Deng
Abstract:
To enhance the safety, quality and efficiency of healthcare services provided by surgical doctors, we propose a comprehensive approach to the performance evaluation of individual doctors by incorporating insights from performance data as well as views of different stakeholders in the hospital. Exploratory factor analysis was first performed on collective multidimensional performance data of surgical doctors, where key factors were extracted that encompass assessment of professional experience and service performance. A two-level indicator system was then constructed, for which we developed a weighted interval-valued spherical fuzzy analytic hierarchy process to analyze the relative importance of the indicators while handling subjectivity and disparity in the decision-making of the multiple parties involved. Our analytical results reveal that, among the key factors identified as instrumental for evaluating surgical doctors’ performance, clinical workload and complexity of service are valued more than capacity of service and professional experience, while efficiency of resource consumption ranks comparatively the lowest in importance. We also provide a retrospective case study to illustrate the effectiveness and robustness of our quantitative evaluation model by assigning meaningful performance ratings to individual doctors based on the weights developed through our approach.
Keywords: analytic hierarchy processes, factor analysis, fuzzy logic, performance evaluation
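For orientation, the sketch below computes classical (crisp) AHP priority weights from a pairwise comparison matrix using the geometric-mean method and Saaty's consistency ratio. The interval-valued spherical fuzzy extension developed in the paper is not reproduced, and the comparison judgments are invented.

```python
# Sketch: classical (crisp) AHP priority weights from a pairwise comparison
# matrix via the geometric-mean method, plus Saaty's consistency ratio.
# The interval-valued spherical fuzzy extension used in the paper is not
# reproduced here; the comparison matrix below is invented.
import numpy as np

def ahp_weights(A):
    n = A.shape[0]
    w = np.prod(A, axis=1) ** (1.0 / n)          # row geometric means
    w /= w.sum()
    lam_max = (A @ w / w).mean()                 # principal eigenvalue estimate
    ci = (lam_max - n) / (n - 1)                 # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n] # Saaty's random index values
    return w, ci / ri                            # weights, consistency ratio

# Pairwise comparisons of 4 indicators (illustrative judgments only)
A = np.array([[1,   3,   5,   7],
              [1/3, 1,   3,   5],
              [1/5, 1/3, 1,   3],
              [1/7, 1/5, 1/3, 1]], dtype=float)
w, cr = ahp_weights(A)
print("weights:", np.round(w, 3), "consistency ratio:", round(cr, 3))
```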
Procedia PDF Downloads 58
21724 Artificial Neural Network in FIRST Robotics Team-Based Prediction System
Authors: Cedric Leong, Parth Desai, Parth Patel
Abstract:
The purpose of this project was to develop a neural network based on qualitative team data to predict alliance scores and thereby determine the winners of matches in the FIRST Robotics Competition (FRC). The game for the competition changes every year, with different objectives and game objects; however, the idea was to create a prediction system that can be reused year after year using some of the statistics that are constant across different games, making our system adaptable to future games as well. Aerial Assist, the FRC game for 2014, is played in alliances of 3 teams going against one another, namely the Red and Blue alliances. This application takes any 6 teams paired into 2 alliances of 3 teams and generates the prediction for the final score between them.
Keywords: artificial neural network, prediction system, qualitative team data, FIRST Robotics Competition (FRC)
Procedia PDF Downloads 514
21723 Retrospective Data Analysis of Penetrating Injuries Admitted to Jigme Dorji Wangchuck National Referral Hospital (JDWNRH), Thimphu, Bhutan, Due to Traditional Sports over a Period of 3 Years
Authors: Sonam Kelzang
Abstract:
Background: Penetrating injuries as a result of traditional sports (Archery and Khuru) are commonly seen in Bhutan. To our knowledge, no study has been carried out looking into the data on penetrating injuries due to traditional sports. Aim: This is a retrospective analysis of cases of penetrating injuries as a result of traditional sports admitted to JDWNRH over the last 3 years, carried out to draw inferences on the pattern of injury and the associated morbidity and mortality. Method: Data on penetrating injuries related to traditional sports (Archery and Khuru) were collected and reviewed over a period of 3 years. Assault cases were excluded. For each year we analysed age, sex, parts of the body affected, agent of injury and whether admission was required or not. Results: A total of 44 victims of penetrating injury by traditional sports (Archery and Khuru) were recorded between 2013 and 2015, an average of about 15 cases of penetrating injuries per year. Eighty-five percent were male and 15% were female. Their ages ranged from 4 to 62 years. Sixty-one percent of the victims were in the working age group of 19-58 years; 30% of the victims were referred from various district hospitals; 38% of the victims needed admission; 42% of the victims suffered injury to the head; and 54% of the injuries were caused by Khuru. Conclusion: The number of penetrating injuries due to traditional sports admitted to JDWNRH, Thimphu, remained the same over the three-year period despite safety regulations being in place. Although there were no deaths during the last three years, morbidity still remains high.
Keywords: archery, Bhutan, Khuru, darts
Procedia PDF Downloads 166
21722 Adaptive Training Methods Designed to Improve a Shorter Resident Curriculum in Obstetrics and Gynecology
Authors: Philippe Judlin, Olivier Morel
Abstract:
Background: In France, the resident curriculum (RC) in Obstetrics and Gynecology (OBGYN) takes five years. In the course of the last 15 years, this RC has undergone major changes, characterized mainly by successive reductions of work hours. The program used to comprise long and frequent shifts, a huge workload, poor supervision and erratic theoretical teaching. A decade ago, the French Ministry of Health recommended a limitation of shift duration to 24 hours and a minimum of 11 hours off duty between shifts. Last year, in order to comply with European Union directives, new recommendations further limited residents’ work hours to 48 hours per week. Methods: Assessment of the residency program adjustments recently made to accommodate the recommendations while improving training quality by resorting to new methods. Results: The challenge facing program directors was to provide an all-encompassing curriculum to OBGYN residents despite fewer work hours. The program has been dramatically redesigned, and several measures have been put in place: -The resident rotation system has been redesigned. Residents used to make 6-month rotations between 10-12 Departments of OBGYN or Surgery. Fewer Departments, those providing the best teaching, have been kept in the new RC. -Extensive in-house supervision has been implemented for all kinds of clinical activities. Effective supervision of residents has proved to be a valuable tool to improve the quality of training. -The tutorship system, with academic members individually overseeing residents during their curriculum, has been perfected. It allows a better follow-up of residents’ progress during the 5-year program. -An extensive program of lectures encompassing all matters in Obstetrics & Gynecology has been set up. These mandatory lectures are available online on a dedicated website; therefore, face-to-face lectures have been limited in order to fit within the 48-hour limit. -The use of simulation has been significantly increased in obstetrics, materno-fetal medicine and surgery (stressing especially laparoscopic training). -Residents’ feedback has been taken into account in the setup of the new RC. Conclusion: This extensive overhaul of the Obstetrics and Gynecology RC has been in place for only a year. Nevertheless, the new program seems to adequately take into account the new recommendations while providing better and more consistent teaching to the OBGYN residents.
Keywords: education, laparoscopy, residency, simulation
Procedia PDF Downloads 187
21721 Comprehensive Analysis of Power Allocation Algorithms for OFDM Based Communication Systems
Authors: Rakesh Dubey, Vaishali Bahl, Dalveer Kaur
Abstract:
The spiralling demand for high-rate data transmission over wireless media calls for intelligent use of electromagnetic resources, considering restrictions like power consumption, spectral efficiency, robustness against multipath propagation and implementation complexity. Orthogonal frequency division multiplexing (OFDM) is a capable technique for next-generation wireless communication systems. For such high-rate data transfers, there is a requirement for proper allocation of resources like power and capacity amongst the subchannels. This paper illustrates various available methods of allocating power and capacity among the subchannels under the constraint of the Shannon limit.
Keywords: Additive White Gaussian Noise, Multi-Carrier Modulation, Orthogonal Frequency Division Multiplexing (OFDM), Signal to Noise Ratio (SNR), Water Filling
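The classic water-filling solution referred to in the keywords can be sketched as follows: find the water level mu such that the allocated powers max(mu - n_i, 0) sum to the power budget, where n_i is the noise-to-channel-gain ratio of subchannel i. The subchannel values and power budget in the sketch are illustrative.

```python
# Sketch of water-filling power allocation across OFDM subchannels: bisect on
# the water level mu so that sum(max(mu - n_i, 0)) equals the power budget,
# where n_i is the noise-to-channel-gain ratio of subchannel i.
import numpy as np

def water_filling(noise_over_gain, total_power, iters=100):
    n = np.asarray(noise_over_gain, dtype=float)
    lo, hi = n.min(), n.max() + total_power          # bracket the water level
    for _ in range(iters):                           # bisection on mu
        mu = 0.5 * (lo + hi)
        p = np.maximum(mu - n, 0.0)
        if p.sum() > total_power:
            hi = mu
        else:
            lo = mu
    p = np.maximum(0.5 * (lo + hi) - n, 0.0)
    capacity = np.sum(np.log2(1.0 + p / n))          # bits/s/Hz over all subchannels
    return p, capacity

if __name__ == "__main__":
    p, c = water_filling(noise_over_gain=[0.1, 0.5, 1.0, 2.0], total_power=4.0)
    print("power per subchannel:", np.round(p, 3), "total capacity:", round(c, 3))
```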
Procedia PDF Downloads 554
21720 A Survey on Genetic Algorithm for Intrusion Detection System
Authors: Prikhil Agrawal, N. Priyanka
Abstract:
With millions of users joining the Internet day by day, it is essential to maintain highly reliable and secure data communication between various corporations. Although there are various traditional security techniques such as antivirus software, password protection, data encryption, biometrics and firewalls, network security has still become the main issue in various leading companies. IDSs have therefore become an essential security component, as they can detect various network attacks and respond quickly to such occurrences. IDSs are used to detect unauthorized access to a computer system. This paper describes various intrusion detection techniques using the GA approach. The intrusion detection problem has become a challenging task due to the proliferation of miscellaneous computer networks with various vulnerabilities. Thus the damage caused to various organizations by malicious intrusions can be mitigated and even deterred by using this powerful tool.
Keywords: genetic algorithm (GA), intrusion detection system (IDS), dataset, network security
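A bare-bones genetic algorithm of the kind surveyed here is sketched below, evolving a binary feature mask through selection, crossover and mutation. The toy fitness function (agreement with a hidden "ideal" mask) is an assumption standing in for a real detection-rate/false-alarm objective computed on an intrusion dataset.

```python
# Sketch: a bare-bones genetic algorithm evolving a binary feature mask, the
# kind of GA building block used for IDS feature/rule selection. The toy
# fitness (match against a hidden "ideal" mask) stands in for a real
# detection-rate/false-alarm objective computed on an intrusion dataset.
import random

random.seed(1)
N_FEATURES, POP, GENERATIONS = 20, 30, 40
IDEAL = [random.randint(0, 1) for _ in range(N_FEATURES)]   # hidden optimum (toy)

def fitness(mask):
    return sum(m == i for m, i in zip(mask, IDEAL))

def crossover(a, b):
    cut = random.randrange(1, N_FEATURES)                   # single-point crossover
    return a[:cut] + b[cut:]

def mutate(mask, rate=0.05):
    return [1 - g if random.random() < rate else g for g in mask]

population = [[random.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(POP)]
for gen in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[:POP // 2]                          # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children
best = max(population, key=fitness)
print("best fitness:", fitness(best), "of", N_FEATURES)
```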
Procedia PDF Downloads 297
21719 Bayesian Inference for High Dimensional Dynamic Spatio-Temporal Models
Authors: Sofia M. Karadimitriou, Kostas Triantafyllopoulos, Timothy Heaton
Abstract:
Reduced-dimension Dynamic Spatio-Temporal Models (DSTMs) jointly describe the spatial and temporal evolution of a function observed subject to noise. A basic state space model is adopted for the discrete temporal variation, while a continuous autoregressive structure describes the continuous spatial evolution. Application of such a DSTM relies upon the pre-selection of a suitable reduced set of basis functions, and this can present a challenge in practice. In this talk, we propose an online estimation method for high-dimensional spatio-temporal data based upon DSTMs, and we attempt to resolve this issue by allowing the basis to adapt to the observed data. Specifically, we present a wavelet decomposition in order to obtain a parsimonious approximation of the continuous spatial process. This parsimony can be achieved by placing a Laplace prior distribution on the wavelet coefficients. The aim of using the Laplace prior is to filter out wavelet coefficients with low contribution, and thus achieve the dimension reduction with significant computational savings. We then propose a hierarchical Bayesian state space model, for the estimation of which we offer an appropriate particle filter algorithm. The proposed methodology is illustrated using real environmental data.
Keywords: multidimensional Laplace prior, particle filtering, spatio-temporal modelling, wavelets
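To make the particle filtering step concrete, the sketch below runs a bootstrap particle filter on a toy one-dimensional linear-Gaussian state-space model. The hierarchical Bayesian DSTM with wavelet basis and Laplace prior is far richer, so this only illustrates the generic predict/weight/resample cycle.

```python
# Sketch: a bootstrap particle filter for a toy one-dimensional state-space
# model (random-walk state, noisy observations). This illustrates only the
# generic predict/weight/resample cycle, not the hierarchical Bayesian DSTM
# with wavelet basis and Laplace prior described above.
import numpy as np

rng = np.random.default_rng(0)
T, N = 100, 500                      # time steps, particles
sig_state, sig_obs = 0.5, 1.0

# Simulate data from the toy model
x_true = np.cumsum(rng.normal(0, sig_state, T))
y = x_true + rng.normal(0, sig_obs, T)

particles = rng.normal(0, 1, N)
estimates = []
for t in range(T):
    particles += rng.normal(0, sig_state, N)                  # predict
    weights = np.exp(-0.5 * ((y[t] - particles) / sig_obs) ** 2)
    weights /= weights.sum()                                   # weight by likelihood
    estimates.append(np.sum(weights * particles))
    idx = rng.choice(N, size=N, p=weights)                     # multinomial resampling
    particles = particles[idx]

rmse = np.sqrt(np.mean((np.array(estimates) - x_true) ** 2))
print(f"filtering RMSE: {rmse:.3f}")
```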
Procedia PDF Downloads 428
21718 Satisfaction of the Training at ASEAN Camp: E-Learning Knowledge and Application at Chantanaburi Province, Thailand
Authors: Sinchai Poolklai
Abstract:
The purpose of this research paper was to examine the level of satisfaction of the faculty members who participated in the ASEAN camp in Chantaburi, Thailand. The population of this study included all the faculty members of Suan Sunandha Rajabhat University who participated in the training and activities of the ASEAN camp during March 2014. Data from a total of 200 faculty members who answered the questionnaire were compiled using the SPSS program. Percentage, mean and standard deviation were utilized in analyzing the data. The findings revealed that the overall mean satisfaction was 4.37, with a standard deviation of 0.7810. Moreover, the means can be used to rank the level of satisfaction with each of the following factors: lower cost, less time consumption, faster delivery, more effective learning, and lower environmental impact.
Keywords: ASEAN camp, e-learning, satisfaction, application
Procedia PDF Downloads 391
21717 Reduction of Defects Using Seven Quality Control Tools for Productivity Improvement at Automobile Company
Authors: Abdul Sattar Jamali, Imdad Ali Memon, Maqsood Ahmed Memon
Abstract:
Production quality with near-zero defects is an objective of every manufacturing and service organization. In order to maintain and improve quality through the reduction of defects, statistical tools are used by many organizations, and many such tools are available to assess quality. Keeping in view their importance, the traditional 7QC tools have been widely used in the manufacturing and automobile industry; they were therefore applied at an automobile company in Pakistan. A preliminary survey was carried out for the implementation of the 7QC tools in the assembly line. During the preliminary survey, two inspection points were selected for data collection: the chassis line and the trim line. Defect data at the chassis line and the trim line were collected with the aim of reducing defects and ultimately improving productivity. Each of the 7QC tools has its benefits, as observed from the results. Flow charts were developed for a better understanding of the inspection points for data collection. Check sheets were developed to help with defect data collection. Histograms represent the severity level of defects. Pareto charts show the cumulative effect of defects. Cause-and-effect diagrams were developed for finding the root causes of each defect. Scatter diagrams were developed to show whether defects are increasing or decreasing. P-control charts were developed to show out-of-control points beyond the limits so that corrective actions can be taken. The successful implementation of the 7QC tools at the inspection points resulted in a considerable reduction in defect levels: in the chassis line, defects were reduced from 132 to 13, a total reduction of 90%, and in the trim line, defects were reduced from 157 to 28, a total reduction of 82%. As the automobile company had exercised only a few of the 7QC tools, it was not getting the full benefit of their application. Therefore, it is suggested that the company establish a mechanism for the application of the 7QC tools in every section.
Keywords: check sheet, cause and effect diagram, control chart, histogram
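As a small worked example of the control-chart step, the sketch below computes the centre line and 3-sigma limits of a p-chart from inspection samples and flags out-of-control points. The defect counts and sample sizes are invented, not the chassis/trim line figures reported above.

```python
# Sketch: centre line and 3-sigma limits of a p-control chart from inspection
# data, flagging out-of-control points. The defect counts and sample sizes
# below are invented, not the chassis/trim line figures reported in the study.
import math

defects = [12, 9, 15, 7, 22, 11, 8, 14, 30, 10]    # nonconforming units per sample
sample_sizes = [200] * len(defects)                # units inspected per sample

p_bar = sum(defects) / sum(sample_sizes)           # centre line (overall fraction)
for i, (d, n) in enumerate(zip(defects, sample_sizes), start=1):
    p = d / n
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    ucl, lcl = p_bar + 3 * sigma, max(p_bar - 3 * sigma, 0.0)
    flag = "OUT OF CONTROL" if not (lcl <= p <= ucl) else ""
    print(f"sample {i:2d}: p={p:.3f}  limits=({lcl:.3f}, {ucl:.3f}) {flag}")
```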
Procedia PDF Downloads 326
21716 WormHex: Evidence Retrieval Tool of Social Media from Volatile Memory
Authors: Norah Almubairik, Wadha Almattar, Amani Alqarni
Abstract:
Social media applications are increasingly being used in our everyday communications. These applications utilise end-to-end encryption mechanisms, which make them suitable tools for criminals to exchange messages. These messages are preserved in the volatile memory until the device is restarted. Therefore, volatile forensics has become an important branch of digital forensics. In this study, the WormHex tool was developed to inspect the memory dump files of Windows- and Mac-based workstations. The tool supports digital investigators in extracting valuable data written in Arabic and English through the web-based WhatsApp and Twitter applications. The results verify that social media applications write their data into memory regardless of the operating system running the application, with no major differences between Windows and Mac.
Keywords: volatile memory, REGEX, digital forensics, memory acquisition
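In the spirit of the regex-driven extraction described above, the sketch below scans a raw memory dump for an ASCII keyword, its UTF-16LE form and e-mail-like strings. The patterns, file name and keyword are assumptions for illustration and are not WormHex's actual signatures.

```python
# Sketch: scanning a raw memory dump for artefacts with regular expressions.
# The byte patterns below (an ASCII keyword, its UTF-16LE form, and e-mail-like
# strings) are illustrative assumptions, not WormHex's actual signatures.
import re
import sys

def scan_dump(path, keyword=b"whatsapp"):
    patterns = {
        "ascii keyword": re.compile(re.escape(keyword), re.IGNORECASE),
        "utf16 keyword": re.compile(re.escape(keyword.decode().encode("utf-16-le")),
                                    re.IGNORECASE),
        "email-like": re.compile(rb"[\w.+-]{3,40}@[\w-]{2,30}\.[a-z]{2,6}"),
    }
    with open(path, "rb") as fh:
        data = fh.read()                      # for very large dumps, read in chunks
    for name, pattern in patterns.items():
        hits = [m.start() for m in pattern.finditer(data)]
        print(f"{name}: {len(hits)} hit(s), first offsets {hits[:5]}")

if __name__ == "__main__":
    scan_dump(sys.argv[1] if len(sys.argv) > 1 else "memdump.raw")
```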
Procedia PDF Downloads 191
21715 Mathematics as the Foundation for the STEM Disciplines: Different Pedagogical Strategies Addressed
Authors: Marion G. Ben-Jacob, David Wang
Abstract:
There is a mathematics requirement for entry-level college and university students, especially those who plan to study STEM (Science, Technology, Engineering and Mathematics). Most of them take College Algebra, and to continue their studies, they need to succeed in this course. Different pedagogical strategies are employed to promote the success of our students. There is, of course, the Traditional Method of teaching: lecture, examples, and problems for students to solve. The Emporium Model, another pedagogical approach, replaces traditional lectures with a learning resource center model featuring interactive software and on-demand personalized assistance. This presentation will compare these two methods of pedagogy and report the results of a study on this comparison. Math is the foundation for science, technology, and engineering. It is generally used in STEM to find patterns in data. These patterns can be used to test relationships, draw general conclusions about data, and model the real world. In STEM, solutions to problems are analyzed, reasoned, and interpreted using math abilities in an assortment of real-world scenarios. Math becomes practical in science when it is used to model natural and artificial experiments to identify a problem and develop a solution for it. As we analyze data, we are using math to find the statistical correlation between a cause and its effect. Scientists who use math include data scientists, biologists and geologists. Without math, most technology would not be possible. Math is the basis of binary, and without programming, you just have the hardware. Addition, subtraction, multiplication, and division are also used in almost every program written. Mathematical algorithms are inherent in software as well. Mechanical engineers analyze scientific data to design robots by applying math and using software. Electrical engineers use math to help design and test electrical equipment. They also use math when creating computer simulations and designing new products. Chemical engineers often use mathematics in the lab. Advanced computer software is used to aid in their research and production processes, to model theoretical synthesis techniques and properties of chemical compounds. Mathematics mastery is crucial for success in the STEM disciplines. Pedagogical research on formative strategies and the necessary topics to be covered is essential.
Keywords: emporium model, mathematics, pedagogy, STEM
Procedia PDF Downloads 75
21714 Viability of EBT3 Film in Small Dimensions to Be Use for in-Vivo Dosimetry in Radiation Therapy
Authors: Abdul Qadir Jangda, Khadija Mariam, Usman Ahmed, Sharib Ahmed
Abstract:
The Gafchromic EBT3 film has the characteristics of high spatial resolution, weak energy dependence and near tissue equivalence, which make it viable for in-vivo dosimetry in External Beam and Brachytherapy applications. The aim of this study is to assess the smallest film dimension that may be feasible for use in in-vivo dosimetry. To evaluate the viability, film sizes from 3 x 3 mm to 20 x 20 mm were calibrated with 6 MV photon and 6 MeV electron beams. The Gafchromic EBT3 film (Lot no. A05151201, Make: ISP) was cut into five different sizes in order to establish the relationship between absorbed dose and film dimension. The film dimensions were 3 x 3, 5 x 5, 10 x 10, 15 x 15, and 20 x 20 mm. The films were irradiated on a Varian Clinac® 2100C linear accelerator for a dose range from 0 to 1000 cGy using a PTW solid water phantom. The irradiation was performed as per the clinical absolute dose rate calibration setup, i.e., 100 cm SAD, 5.0 cm depth and a field size of 10x10 cm2 for photons, and 100 cm SSD, 1.4 cm depth and a 15x15 cm2 applicator for electrons. The irradiated films were scanned in landscape orientation with a post-development time of 48 hours (minimum). Film scanning was accomplished using an Epson Expression 10000 XL flatbed scanner, and quantitative analysis was carried out with the ImageJ freeware. Results show that the dose variation across film dimensions ranging from 3 x 3 mm to 20 x 20 mm is very minimal, with a maximum standard deviation of 0.0058 in optical density for a dose level of 3000 cGy; the standard deviation increases with the increase in dose level, so precaution must be taken while using small-dimension films for higher doses. The analysis shows that there is insignificant variation in the absorbed dose with a change in the dimension of the EBT3 film. The study concludes that film dimensions as small as 3 x 3 mm can safely be used up to a dose level of 3000 cGy without the need for recalibration of the particular dimension in use for dosimetric applications. However, for higher dose levels, one may need to calibrate the films for the particular dimension in use for higher accuracy. It was also noticed that the crystalline structure of the film got damaged at the edges while cutting the film, which can contribute to an incorrect dose reading if the region of interest includes the damaged area of the film.
Keywords: external beam radiotherapy, film calibration, film dosimetry, in-vivo dosimetry
Procedia PDF Downloads 494
21713 Breast Cancer Survivability Prediction via Classifier Ensemble
Authors: Mohamed Al-Badrashiny, Abdelghani Bellaachia
Abstract:
This paper presents a classifier ensemble approach for predicting the survivability of breast cancer patients using the latest database version of the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute. The system consists of two main components: a feature selection component and a classifier ensemble component. The feature selection component divides the features in the SEER database into four groups. After that, it tries to find the most important features among the four groups that maximize the weighted average F-score of a certain classification algorithm. The ensemble component uses three different classifiers, each of which models a different set of features from SEER selected through the feature selection module. On top of them, another classifier is used to give the final decision based on the output decisions and confidence scores from each of the underlying classifiers. Different classification algorithms have been examined; the best setup found uses the decision tree, Bayesian network, and Naïve Bayes algorithms for the underlying classifiers and Naïve Bayes for the classifier ensemble step. The system outperforms all published systems to date when evaluated against the exact same SEER data (period of 1973-2002). It gives an 87.39% weighted average F-score compared to 85.82% and 81.34% for the other published systems. By increasing the data size to cover the whole database (period of 1973-2014), the overall weighted average F-score jumps to 92.4% on the held-out unseen test set.
Keywords: classifier ensemble, breast cancer survivability, data mining, SEER
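A minimal sketch of such a stacked ensemble is shown below, with a decision tree and Naive Bayes as base learners and Naive Bayes combining their probability outputs. The Bayesian-network base learner is omitted because scikit-learn has no such estimator, and the synthetic data stand in for the SEER records.

```python
# Sketch: a stacked classifier ensemble in the spirit of the system above,
# with a decision tree and Naive Bayes as base learners and Naive Bayes on
# top combining their probability outputs. The Bayesian-network base learner
# is omitted (scikit-learn has no such estimator); the data are synthetic.
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=16, n_informative=8,
                           weights=[0.65, 0.35], random_state=7)
ensemble = StackingClassifier(
    estimators=[("tree", DecisionTreeClassifier(max_depth=6, random_state=7)),
                ("nb", GaussianNB())],
    final_estimator=GaussianNB(),
    stack_method="predict_proba", cv=5)
scores = cross_val_score(ensemble, X, y, cv=5, scoring="f1_weighted")
print(f"weighted F1: {scores.mean():.3f} +/- {scores.std():.3f}")
```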
Procedia PDF Downloads 329
21712 Pre-Service Teachers’ Opinions on Disabled People
Authors: Sinem Toraman, Aysun Öztuna Kaplan, Hatice Mertoğlu, Esra Macaroğlu Akgül
Abstract:
This study aims to examine pre-service teachers’ opinions on disabled people, taking various variables into consideration. The participants of the study were 170 pre-service teachers, first-year students from different branches at the Education Departments of Yıldız Technical, Yeditepe, Marmara and Sakarya Universities. The research data were collected in the 2013-2014 fall term. This study was designed as a phenomenological study, in line with the qualitative research paradigm. Pre-service teachers’ opinions about disabled people were examined in this study; an open-ended question form prepared by the researcher and focus group interview techniques were used as data collection tools. The study presents the pre-service teachers’ opinions about disabled people and offers suggestions about teacher education.
Keywords: pre-service teachers, disabled people, teacher education, teachers' opinions
Procedia PDF Downloads 459
21711 On Modeling Data Sets by Means of a Modified Saddlepoint Approximation
Authors: Serge B. Provost, Yishan Zhang
Abstract:
A moment-based adjustment to the saddlepoint approximation is introduced in the context of density estimation. First applied to univariate distributions, this methodology is extended to the bivariate case. It then entails estimating the density function associated with each marginal distribution by means of the saddlepoint approximation and applying a bivariate adjustment to the product of the resulting density estimates. The connection to the distribution of empirical copulas will be pointed out. As well, a novel approach is proposed for estimating the support of a distribution. As these results solely rely on sample moments and empirical cumulant-generating functions, they are particularly well suited for modeling massive data sets. Several illustrative applications will be presented.
Keywords: empirical cumulant-generating function, endpoints identification, saddlepoint approximation, sample moments, density estimation
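For reference, the standard (unadjusted) saddlepoint density approximation on which the proposed moment-based adjustment builds can be written as below, with the cumulant-generating function K replaced by its empirical counterpart in the data-driven setting.

```latex
% Standard saddlepoint approximation to the density f at a point x,
% with K(t) = \ln E[e^{tX}] the cumulant-generating function:
\hat{f}(x) = \bigl(2\pi\, K''(\hat{s})\bigr)^{-1/2}
             \exp\bigl\{K(\hat{s}) - \hat{s}\,x\bigr\},
\qquad \text{where } \hat{s} \text{ solves } K'(\hat{s}) = x.
% In the empirical setting, K is replaced by the empirical CGF
% K_n(t) = \ln\Bigl(\tfrac{1}{n}\sum_{i=1}^{n} e^{t x_i}\Bigr).
```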
Procedia PDF Downloads 162