Search results for: data management system
38540 Clustering for Detection of the Population at Risk of Anticholinergic Medication
Authors: A. Shirazibeheshti, T. Radwan, A. Ettefaghian, G. Wilson, C. Luca, Farbod Khanizadeh
Abstract:
Anticholinergic medication has been associated with events such as falls, delirium, and cognitive impairment in older patients. To further assess this, anticholinergic burden scores have been developed to quantify risk. A risk model based on clustering was deployed in a healthcare management system to cluster patients into multiple risk groups according to the anticholinergic burden scores of the medicines prescribed to them, in order to facilitate clinical decision-making. To do so, the anticholinergic burden scores of drugs were extracted from the literature, which categorizes the risk on a scale of 1 to 3. Given the patients’ prescription data in the healthcare database, a weighted anticholinergic risk score was derived per patient based on the prescription of multiple anticholinergic drugs. This study was conducted on over 300,000 records of patients currently registered with a major regional UK-based healthcare provider. The weighted risk scores were used as inputs to an unsupervised learning algorithm (mean-shift clustering) that groups patients into clusters representing different levels of anticholinergic risk. To further evaluate the performance of the model, the association between the average risk score within each group and other factors, such as socioeconomic status (i.e., Index of Multiple Deprivation) and an index of health and disability, was investigated. The clustering identifies a group of 15 patients at the highest risk from multiple anticholinergic medications. Our findings also show that this group of patients is located within more deprived areas of London compared to the populations of the other risk groups. Furthermore, the prescription of anticholinergic medicines is more skewed toward female than male patients, indicating that females are at greater risk from this kind of multiple medication.
The risk may be monitored and controlled in healthcare management systems that are well equipped with artificial intelligence.
Keywords: anticholinergic medicines, clustering, deprivation, socioeconomic status
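As a rough illustration of the clustering step, a flat-kernel mean shift over one-dimensional weighted risk scores can be sketched as below. The bandwidth, the example scores, and the mode-merging rule are illustrative assumptions, not the study's actual parameters:

```python
import numpy as np

def mean_shift_1d(scores, bandwidth, iters=100, tol=1e-6):
    """Flat-kernel mean shift on 1-D risk scores: each point climbs to the
    mean of its neighbours until it settles on a density mode."""
    x = np.asarray(scores, dtype=float)
    modes = x.copy()
    for _ in range(iters):
        shifted = np.array([x[np.abs(x - m) <= bandwidth].mean() for m in modes])
        done = np.max(np.abs(shifted - modes)) < tol
        modes = shifted
        if done:
            break
    # points whose modes ended up close together share a cluster
    centers, labels = [], np.empty(len(x), dtype=int)
    for i, m in enumerate(modes):
        for j, c in enumerate(centers):
            if abs(m - c) < bandwidth / 2:
                labels[i] = j
                break
        else:
            centers.append(m)
            labels[i] = len(centers) - 1
    return labels, centers

# toy weighted risk scores: two lower-risk groups and one extreme-risk tail
labels, centers = mean_shift_1d([0.1, 0.2, 0.15, 5.0, 5.1, 9.8, 9.9, 10.0],
                                bandwidth=1.0)
```

Unlike k-means, mean shift does not require fixing the number of risk groups in advance, which matches the exploratory use described above.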
Procedia PDF Downloads 212
38539 Opening up Government Datasets for Big Data Analysis to Support Policy Decisions
Authors: K. Hardy, A. Maurushat
Abstract:
Policy makers are increasingly looking to make evidence-based decisions. Evidence-based decisions have historically used the rigorous methodologies of empirical studies by research institutes, as well as less reliable immediate surveys/polls, often with limited sample sizes. As we move into the era of Big Data analytics, policy makers are looking to different methodologies to deliver reliable empirics in real time. The question is no longer why people did something over the last 10 years, but why they are doing it now, whether it is undesirable, and how we can have an impact to promote change immediately. Big Data analytics rely heavily on government data that has been released into the public domain. The open data movement promises greater productivity and more efficient delivery of services; however, Australian government agencies remain reluctant to release their data to the general public. This paper considers the barriers to releasing government data as open data, and how these barriers might be overcome.
Keywords: big data, open data, productivity, data governance
Procedia PDF Downloads 371
38538 Design Systems and the Need for a Usability Method: Assessing the Fitness of Components and Interaction Patterns in Design Systems Using Atmosphere Methodology
Authors: Patrik Johansson, Selina Mardh
Abstract:
The present study proposes a usability test method, Atmosphere, to assess the fitness of components and interaction patterns of design systems. The method covers the user’s perception of the components of the system, the efficiency of the logic of the interaction patterns, perceived ease of use, as well as the user’s understanding of the intended outcome of interactions. These aspects are assessed by combining measures of first impression, visual affordance, and expectancy. The method was applied to a design system developed for the design of an electronic health record system. The study was conducted involving 15 healthcare personnel. It could be concluded that the Atmosphere method provides tangible data that enable human-computer interaction practitioners to analyze and categorize components and patterns based on perceived usability, the success rate of identifying interactive components, and the success rate of understanding the intended outcomes of components and interaction patterns.
Keywords: atomic design, atmosphere methodology, design system, expectancy testing, first impression testing, usability testing, visual affordance testing
Procedia PDF Downloads 180
38537 Automated Facial Symmetry Assessment for Orthognathic Surgery: Utilizing 3D Contour Mapping and Hyperdimensional Computing-Based Machine Learning
Authors: Wen-Chung Chiang, Lun-Jou Lo, Hsiu-Hsia Lin
Abstract:
This study aimed to improve the evaluation of facial symmetry, which is crucial for planning and assessing outcomes in orthognathic surgery (OGS). Facial symmetry plays a key role in both aesthetic and functional aspects of OGS, making its accurate evaluation essential for optimal surgical results. To address the limitations of traditional methods, a different approach was developed, combining three-dimensional (3D) facial contour mapping with hyperdimensional (HD) computing to enhance precision and efficiency in symmetry assessments. The study was conducted at Chang Gung Memorial Hospital, where data were collected from 2018 to 2023 using 3D cone beam computed tomography (CBCT), a highly detailed imaging technique. A large and comprehensive dataset was compiled, consisting of 150 normal individuals and 2,800 patients, totaling 5,750 preoperative and postoperative facial images. These data were critical for training a machine learning model designed to analyze and quantify facial symmetry. The machine learning model was trained to process 3D contour data from the CBCT images, with HD computing employed to power the facial symmetry quantification system. This combination of technologies allowed for an objective and detailed analysis of facial features, surpassing the accuracy and reliability of traditional symmetry assessments, which often rely on subjective visual evaluations by clinicians. In addition to developing the system, the researchers conducted a retrospective review of 3D CBCT data from 300 patients who had undergone OGS. The patients’ facial images were analyzed both before and after surgery to assess the clinical utility of the proposed system. The results showed that the facial symmetry algorithm achieved an overall accuracy of 82.5%, indicating its robustness in real-world clinical applications. Postoperative analysis revealed a significant improvement in facial symmetry, with an average score increase of 51%. 
The mean symmetry score rose from 2.53 preoperatively to 3.89 postoperatively, demonstrating the system's effectiveness in quantifying improvements after OGS. These results underscore the system's potential for providing valuable feedback to surgeons and aiding in the refinement of surgical techniques. The study also led to the development of a web-based system that automates facial symmetry assessment. This system integrates HD computing and 3D contour mapping into a user-friendly platform that allows for rapid and accurate evaluations. Clinicians can easily access this system to perform detailed symmetry assessments, making it a practical tool for clinical settings. Additionally, the system facilitates better communication between clinicians and patients by providing objective, easy-to-understand symmetry scores, which can help patients visualize the expected outcomes of their surgery. In conclusion, this study introduced a valuable and highly effective approach to facial symmetry evaluation in OGS, combining 3D contour mapping, HD computing, and machine learning. The resulting system achieved high accuracy and offers a streamlined, automated solution for clinical use. The development of the web-based platform further enhances its practicality, making it a valuable tool for improving surgical outcomes and patient satisfaction in orthognathic surgery.
Keywords: facial symmetry, orthognathic surgery, facial contour mapping, hyperdimensional computing
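The paper's HD-computing pipeline is not described in enough detail to reproduce here, but the underlying notion of a symmetry score can be illustrated with a simple landmark-mirroring sketch: reflect one side of the face across an assumed midsagittal plane at x = 0 and measure the residual distance to the other side. The landmarks and the scoring rule are illustrative assumptions, not the authors' method:

```python
import numpy as np

def asymmetry_score(left_landmarks, right_landmarks):
    """RMS distance between right-side landmarks and left-side landmarks
    mirrored across the x = 0 midsagittal plane (perfect symmetry -> 0)."""
    left = np.asarray(left_landmarks, dtype=float)
    right = np.asarray(right_landmarks, dtype=float)
    mirrored = left * np.array([-1.0, 1.0, 1.0])   # reflect the x coordinate
    return float(np.sqrt(np.mean(np.sum((mirrored - right) ** 2, axis=1))))

# two hypothetical 3D landmarks on the left side of a face
left = np.array([[-1.0, 0.0, 0.0], [-2.0, 1.0, 0.5]])
right_sym = left * np.array([-1.0, 1.0, 1.0])       # perfectly mirrored face
right_off = right_sym + np.array([0.3, 0.0, 0.0])   # 0.3-unit lateral shift
```

A real pipeline would first locate the midsagittal plane from the CBCT data rather than assume x = 0.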
Procedia PDF Downloads 27
38536 Studies on the Prevalence and Determination of Associated Risk Factors of Babesia in Goats of District Toba Tek Singh, Punjab, Pakistan
Authors: Tauseef-ur-Rehman, Rao Zahid Abbas, Wasim Babar, Arbab Sikandar
Abstract:
Babesiosis is an infection due to the multiplication of the tick-borne parasite Babesia sp. in the erythrocytes of the host (a variety of vertebrates), including small ruminants, and is responsible for decreased livestock output and hence economic losses. A cross-sectional study was designed to evaluate the prevalence of Babesia and its relation with various associated factors in district Toba Tek Singh, Central Punjab, Pakistan in 2009-2010. In total, 10.84% (50/461) of the examined cases were found positive for Babesia infection. The month-wise peak prevalence was observed in July (17.95%), while no positive case was recorded in December 2009 and January 2010. The difference in prevalence of Babesia infection among goat breeds was found to be non-significant (P > 0.05). The prevalence of Babesia was found to be significantly (P < 0.05) dependent on goat age and sex. The feeding system, housing system, floor type, and herd size revealed a strong correlation with Babesia prevalence, while the watering system and body condition were found to be non-significant (P > 0.05); hence, it is suggested that babesiosis can be avoided by improving management practices.
Keywords: Babesia, goat, prevalence, Pakistan, risk factors
Procedia PDF Downloads 520
38535 A Non-Parametric Analysis of District Disaster Management Authorities in Punjab, Pakistan
Authors: Zahid Hussain
Abstract:
The Provincial Disaster Management Authority (PDMA) Punjab was established under the NDM Act 2010 and now works under the Senior Member Board of Revenue; it deals with the whole spectrum of disasters, including preparedness, mitigation, early warning, response, relief, rescue, recovery, and rehabilitation. The District Disaster Management Authorities (DDMAs) act as the implementing arms of PDMA in the districts to respond to any disaster. The DDMAs' role is very important in disaster mitigation, response, and recovery, as they are the first responders and the tier closest to the community. Keeping in view the significant role of DDMAs, their technical and human resource capacities need to be assessed. For calculating the technical efficiencies of the DDMAs in Punjab, three inputs (number of labourers, number of transport vehicles, and number of equipment items), two outputs (relief assistance and number of rescues), and 25 districts as decision-making units (DMUs) were selected. For this purpose, eight years of secondary data, from 2005 to 2012, were used, and the Data Envelopment Analysis (DEA) technique was applied. DEA estimates the relative efficiency of peer entities performing similar tasks. The findings show that all DMUs (districts) are inefficient in terms of technological and scale efficiency, while technically efficient in terms of pure and total factor productivity efficiency. All DMUs were found technically inefficient only in the year 2006. Labour and equipment were not used efficiently in the years 2005, 2007, 2008, 2009, and 2012. Furthermore, only three years, 2006, 2010, and 2011, show that districts could not use transportation efficiently in a disaster situation. This study suggests that, to be efficient, all districts should curtail their labour, transportation, and equipment inputs; similarly, the output targets for the number of rescues and relief assistance should be reduced accordingly.
Keywords: DEA, DMU, PDMA, DDMA
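The envelopment form of the input-oriented CCR DEA model behind such an analysis can be sketched as one small linear program per DMU: minimize θ subject to a reference technology that uses no more than θ times the DMU's inputs while producing at least its outputs. The toy inputs and outputs below are illustrative, not the study's district data:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency score per DMU.
    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.r_[1.0, np.zeros(n)]
        # inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        # outputs: -sum_j lambda_j * y_rj <= -y_ro
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(m), -Y[o]],
                      bounds=[(0, None)] * (n + 1),
                      method="highs")
        scores.append(res.fun)
    return np.array(scores)

# two toy DMUs with equal output; the second uses twice the input
scores = dea_ccr_input(np.array([[2.0], [4.0]]), np.array([[2.0], [2.0]]))
```

A score of 1 marks an efficient DMU; an inefficient one could, in principle, shrink its inputs to that fraction and still produce its outputs, which is exactly the "curtail labour, transportation and equipment" reading above.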
Procedia PDF Downloads 246
38534 Quality Assurances for an On-Board Imaging System of a Linear Accelerator: Five Months Data Analysis
Authors: Liyun Chang, Cheng-Hsiang Tsai
Abstract:
To ensure that radiation is delivered precisely to the target in cancer patients, linear accelerators are equipped with a pretreatment on-board imaging system, through which the patient setup is verified before each daily treatment. New-generation radiotherapy using beam-intensity modulation, which usually involves steep dose gradients, is claimed to achieve both a higher degree of dose conformation in the targets and a further reduction of toxicity in normal tissues. However, this benefit is counterproductive if the beam is delivered imprecisely. To avoid irradiating critical organs or normal tissues rather than the target, it is very important to carry out quality assurance (QA) of this on-board imaging system. The QA of the On-Board Imager® (OBI) system of one Varian Clinac-iX linear accelerator was performed through our procedures, modified from a relevant report and AAPM TG-142. Two image modalities of the OBI system, 2D radiography and 3D cone-beam computed tomography (CBCT), were examined. Daily and monthly QA was executed for five months in the categories of safety, geometrical accuracy, and image quality. A marker phantom and a blade calibration plate were used for the QA of geometrical accuracy, while the Leeds phantom and the Catphan 504 phantom were used for the QA of radiographic and CBCT image quality, respectively. The reference images were generated through a GE LightSpeed CT simulator with an ADAC Pinnacle treatment planning system. Finally, the image quality was analyzed via an OsiriX medical imaging system. For the geometrical accuracy test, the average deviations of the OBI isocenter in each direction are less than 0.6 mm with uncertainties less than 0.2 mm, while all the other items have displacements of less than 1 mm. For radiographic image quality, the spatial resolution is 1.6 lp/cm with contrasts less than 2.2%.
The spatial resolution, low-contrast detectability, and HU homogeneity of CBCT are larger than 6 lp/cm, less than 1%, and within 20 HU, respectively. All tests are within the criteria, except that the HU value of Teflon measured with the full-fan mode exceeded the suggested value, which could be due to its inherently high HU value and needs to be rechecked. The OBI system in our facility was thus demonstrated to be reliable, with stable image quality. QA of the OBI system is necessary to achieve the best treatment for each patient.
Keywords: CBCT, image quality, quality assurance, OBI
Procedia PDF Downloads 299
38533 IoT Continuous Monitoring Biochemical Oxygen Demand Wastewater Effluent Quality: Machine Learning Algorithms
Authors: Sergio Celaschi, Henrique Canavarro de Alencar, Claaudecir Biazoli
Abstract:
Effluent quality is of the highest priority for compliance with the permit limits of environmental protection agencies and ensures the protection of the local water system. Of the pollutants monitored, biochemical oxygen demand (BOD) poses one of the greatest challenges. Delayed BOD5 results from the laboratory, which take 7 to 8 analysis days, hinder a wastewater treatment plant's (WWTP's) ability to react to different situations and meet treatment goals; reducing BOD turnaround time from days to hours is our quest. This work presents a solution based on a system of two BOD bioreactors associated with Digital Twin (DT) and Machine Learning (ML) methodologies via an Internet of Things (IoT) platform to monitor and control a WWTP and support decision-making. A DT is a virtual and dynamic replica of a production process. It requires the ability to collect and store real-time sensor data related to the operating environment; furthermore, it integrates and organizes the data on a digital platform and applies analytical models, allowing a deeper understanding of the real process so that anomalies are caught sooner. In our system for continuous monitoring of the BOD suppressed by the effluent treatment process, the DT algorithm for analyzing the data uses ML on a parameterized chemical kinetic model. The continuous BOD monitoring system, capable of providing results in a fraction of the time required by BOD5 analysis, is composed of two thermally isolated batch bioreactors.
Each bioreactor contains input/output access for the wastewater sample (influent and effluent), hydraulic conduction tubes, pumps and valves for the batch sample and dilution water, an air supply for dissolved oxygen (DO) saturation, a cooler/heater for sample thermal stability, an optical DO sensor based on fluorescence quenching, pH, ORP, temperature, and atmospheric pressure sensors, and a local PLC/CPU with a TCP/IP data transmission interface. The dynamic BOD system monitoring range covers 2 mg/L < BOD < 2,000 mg/L. In addition to the BOD monitoring system, there are many other operational WWTP sensors. The CPU data is transmitted to/received from the digital platform, which in turn performs analyses at periodic intervals, aiming to feed the learning process. BOD bulletins and their credibility intervals are made available to web users at 12-hour intervals. The chemical kinetics ML algorithm is composed of a coupled system of four first-order ordinary differential equations for the molar masses of DO, the organic material present in the sample, biomass, and the products (CO₂ and H₂O) of the reaction. This system is solved numerically from its initial conditions: DO (saturated) and initial products of the kinetic oxidation process, CO₂ = H₂O = 0. The initial values for organic matter and biomass are estimated by minimization of the mean square deviations. A real case of continuous monitoring of BOD wastewater effluent quality is being conducted by deploying an IoT application on a large wastewater purification system located in S. Paulo, Brazil.
Keywords: effluent treatment, biochemical oxygen demand, continuous monitoring, IoT, machine learning
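A minimal sketch of a four-state kinetic model of this shape (DO, organic matter, biomass, products) can be integrated numerically; the Monod-type rate law and all parameter values below are hypothetical stand-ins, not the paper's calibrated model, and the drop in dissolved oxygen over the run serves only as a crude BOD estimate:

```python
from scipy.integrate import solve_ivp

# Hypothetical Monod-type parameters (illustrative only)
MU_MAX, K_S, Y_X = 0.25, 30.0, 0.5   # 1/h, mg/L, biomass yield fraction

def bod_kinetics(t, y):
    do, s, x, p = y                      # dissolved O2, organic matter, biomass, products
    growth = MU_MAX * s / (K_S + s) * x  # Monod biomass growth rate
    ds = -growth / Y_X                   # organic matter consumed
    dx = growth                          # biomass produced
    d_do = -(1.0 - Y_X) / Y_X * growth   # O2 spent oxidizing the remainder
    dp = -ds - dx                        # mass routed to CO2 + H2O
    return [d_do, ds, dx, dp]

# diluted sample: saturated DO, 4 mg/L organic load, small biomass seed, no products
y0 = [9.0, 4.0, 0.5, 0.0]
sol = solve_ivp(bod_kinetics, (0.0, 48.0), y0)
bod_estimate = y0[0] - sol.y[0, -1]      # O2 consumed over the 48 h run
```

In the described system the ML layer would fit such parameters against sensor data (the mean-square-deviation step above), rather than fixing them a priori.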
Procedia PDF Downloads 73
38532 Development of a Mechanical Ventilator Using A Manual Artificial Respiration Unit
Authors: Isomar Lima da Silva, Alcilene Batalha Pontes, Aristeu Jonatas Leite de Oliveira, Roberto Maia Augusto
Abstract:
Context: Mechanical ventilators are medical devices that help provide oxygen and ventilation to patients with respiratory difficulties. This equipment consists of a manual breathing unit that can be operated by a doctor or nurse and a mechanical ventilator that controls the airflow and pressure in the patient's respiratory system. This type of ventilator is commonly used in emergencies and intensive care units where it is necessary to provide breathing support to critically ill or injured patients. Objective: In this context, this work aims to develop a reliable and low-cost mechanical ventilator to meet the demand of hospitals in treating people affected by Covid-19 and other severe respiratory diseases, offering a chance of treatment as an alternative to mechanical ventilators currently available in the market. Method: The project presents the development of a low-cost auxiliary ventilator with a controlled ventilatory system assisted by integrated hardware and firmware for respiratory cycle control in non-invasive mechanical ventilation treatments using a manual artificial respiration unit. The hardware includes pressure sensors capable of identifying positive expiratory pressure, peak inspiratory flow, and injected air volume. The embedded system controls the data sent by the sensors. It ensures efficient patient breathing through the operation of the sensors, microcontroller, and actuator, providing patient data information to the healthcare professional (system operator) through the graphical interface and enabling clinical parameter adjustments as needed. 
Results: The test data of the developed mechanical ventilator showed satisfactory results in terms of performance and reliability, indicating that the equipment can be a viable alternative to commercially available mechanical ventilators, offering a low-cost solution to meet the increasing demand for respiratory support equipment.
Keywords: mechanical ventilators, breathing, medical equipment, COVID-19, intensive care units
Procedia PDF Downloads 70
38531 Queueing Modeling of M/G/1 Fault Tolerant System with Threshold Recovery and Imperfect Coverage
Authors: Madhu Jain, Rakesh Kumar Meena
Abstract:
This paper investigates a finite M/G/1 fault-tolerant multi-component machining system. The system incorporates features such as standby support, threshold recovery, and imperfect coverage, which make the study closer to real-time systems. The performance prediction of the M/G/1 fault-tolerant system is carried out using a recursive approach, treating the remaining service time as a supplementary variable. Numerical results are presented to illustrate the computational tractability of the analytical results for three different service time distributions, viz. exponential, 3-stage Erlang, and deterministic. Moreover, a cost function is constructed to determine the optimal choice of system descriptors for upgrading the system.
Keywords: fault tolerant, machine repair, threshold recovery policy, imperfect coverage, supplementary variable technique
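The paper's finite fault-tolerant model is analyzed via the supplementary variable technique, which is beyond a short sketch; however, the classic Pollaczek-Khinchine formula for a plain (infinite-source) M/G/1 queue already shows why the three service-time distributions behave differently, since mean delay grows with service-time variability. The arrival rate and mean service time below are arbitrary example values:

```python
def pk_mean_wait(lam, mean_s, scv):
    """Pollaczek-Khinchine mean waiting time in queue for M/G/1, parameterized
    by the squared coefficient of variation (SCV) of the service time."""
    rho = lam * mean_s                 # server utilization (must be < 1)
    e_s2 = mean_s ** 2 * (1.0 + scv)   # E[S^2] recovered from mean and SCV
    return lam * e_s2 / (2.0 * (1.0 - rho))

lam, mean_s = 0.8, 1.0
waits = {name: pk_mean_wait(lam, mean_s, scv)
         for name, scv in [("exponential", 1.0),      # SCV = 1
                           ("3-stage Erlang", 1/3),   # SCV = 1/k
                           ("deterministic", 0.0)]}   # SCV = 0
```

At the same utilization, deterministic service gives the shortest queue and exponential the longest, with Erlang-3 in between, mirroring the comparison made in the paper's numerical study.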
Procedia PDF Downloads 292
38530 Study on Municipal Solid Waste Management to Protect Environment
Authors: Rajesh Kumar
Abstract:
The largest issue in the current situation is managing solid waste, since it pollutes the ecosystem; even the disposal of mixed waste is a challenge. The present study highlights the consequences and drivers of managing the solid waste of urban areas in Municipal Committee Pundri through the Saksham Yuva Project, which is managed by the Haryana government. The overall goal of the Saksham Yuva Project is to mobilise the public and educate them about the dangers associated with garbage mismanagement. According to the study, there has been a 20% reduction in waste, and the cost of waste management has also gone down. Further, the study also reports an alternative use of waste in revenue generation, by producing khaad (compost) for agricultural purposes.
Keywords: solid waste management, people awareness, dry and wet waste disposal, material recovery facility
Procedia PDF Downloads 112
38529 Risk Issues for Controlling Floods through Unsafe, Dual Purpose, Gated Dams
Authors: Gregory Michael McMahon
Abstract:
Risk management for the purpose of minimizing the damages from the operations of dams has met with opposition from organisations, authorities, and their practitioners. It appears that the cause may be a misunderstanding of risk management, arising from exchanges that mix deterministic thinking with risk-centric thinking and that do not separate uncertainty from reliability or accuracy from probability. This paper sets out the misunderstandings that arose from dam operations at Wivenhoe in 2011, comparing outcomes based on the methodology and its rules with outcomes produced by applying misunderstandings of the rules. The paper addresses the performance of one risk-centric flood manual for Wivenhoe Dam in achieving a risk management outcome. A mixture of engineering, administrative, and legal factors appears to have combined to reduce the outcomes of the risk approach; these are described. The findings are that a risk-centric manual may need to assist administrations in conducting scenario training regimes, in responding to healthy audit reporting, and in developing decision-support systems. The principal assistance needed from the manual, however, is to bring engineering and the law to a good understanding of how risks are managed; do not assume that risk management is understood. The wider findings are that the critical profession for decision-making downstream of the meteorologist is not dam engineering, hydrology, or hydraulics; it is risk management. Risk management will provide the minimum flood-damage outcome where actual rainfalls match or exceed forecast rainfalls; it will therefore provide the best approach over the likely history of flooding in the life of a dam, and provisions made for worst cases may be state of the art in risk management.
The principal conclusion is the need for training both in risk management as a discipline and in the application of risk management rules to particular dam operational scenarios.
Keywords: risk management, flood control, dam operations, deterministic thinking
Procedia PDF Downloads 87
38528 Automated Tracking and Statistics of Vehicles at the Signalized Intersection
Authors: Qiang Zhang, Xiaojian Hu
Abstract:
The intersection is the place where vehicles and pedestrians must pass through, turn, and evacuate. Obtaining the motion data of vehicles near an intersection is of great significance for transportation research. Since there are usually many targets and many conflicts between targets, it is difficult to obtain vehicle motion parameters from traffic videos of intersections. According to the characteristics of traffic videos, this paper applies video technology to realize automated tracking, counting, and trajectory extraction of vehicles, collecting traffic data from roadside surveillance cameras installed near intersections. Based on the video recognition method, the vehicles in each lane near the intersection are tracked, their trajectories extracted, and counted under various degrees of occlusion and visibility. The performance is compared with currently recognized CPU-based algorithms for real-time tracking-by-detection. The presented system is faster than the others and has better real-time performance. The accuracy of direction has reached about 94.99% on average, and the accuracy of classification and statistics has reached about 75.12% on average.
Keywords: tracking and statistics, vehicle, signalized intersection, motion parameter, trajectory
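A minimal tracking-by-detection loop of the kind referenced here can be sketched as a greedy IoU matcher: each new detection is assigned to the overlapping track with the highest IoU, or starts a new track (and increments the vehicle count). The threshold and box format are illustrative assumptions; the real system's detector and occlusion handling are not reproduced:

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union

class IoUTracker:
    """Greedy IoU tracking-by-detection: each detection joins the free track
    it overlaps best, otherwise it starts (and counts) a new track."""
    def __init__(self, iou_threshold=0.3):
        self.iou_threshold = iou_threshold
        self.tracks = {}       # track id -> last known box
        self.counts = 0        # vehicles counted = tracks ever created
        self._next_id = 0

    def update(self, detections):
        assigned, free = {}, dict(self.tracks)
        for det in detections:
            best_id, best_iou = None, self.iou_threshold
            for tid, box in free.items():
                score = iou(det, box)
                if score > best_iou:
                    best_id, best_iou = tid, score
            if best_id is None:                  # unmatched -> new vehicle
                best_id, self._next_id = self._next_id, self._next_id + 1
                self.counts += 1
            else:
                free.pop(best_id)                # track taken for this frame
            assigned[best_id] = det
        self.tracks = assigned                   # unmatched old tracks expire
        return assigned

tracker = IoUTracker()
frame1 = tracker.update([(0, 0, 10, 10), (50, 0, 60, 10)])   # two vehicles appear
frame2 = tracker.update([(2, 0, 12, 10), (52, 0, 62, 10)])   # both move slightly
```

The per-frame box histories in `tracker.tracks` are the raw material for the trajectory extraction and per-lane counting described above.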
Procedia PDF Downloads 221
38527 Software Quality Measurement System for Telecommunication Industry in Malaysia
Authors: Nor Fazlina Iryani Abdul Hamid, Mohamad Khatim Hasan
Abstract:
The evolution of software quality measurement started when McCall introduced his quality model in 1977. From there, several software quality models and measurement methods emerged, but none of them focused on the telecommunication industry. In this paper, a software quality measurement system for the telecommunication industry is implemented to accommodate the industry's rapid growth. The quality value of telecommunication-related software can be calculated with this system by entering the required parameters. The system calculates the quality value of the measured software based on predefined quality metrics, aggregated by reference to the quality model, and classifies the quality level of the software based on the Net Satisfaction Index (NSI). Thus, a software quality measurement system is important to both developers and users in order to produce high-quality software products for the telecommunication industry.
Keywords: software quality, quality measurement, quality model, quality metric, net satisfaction index
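Since the paper's metrics, weights, and NSI thresholds are not reproduced here, the aggregation-and-classification idea can only be sketched with hypothetical values: metric scores are combined into one quality value by a weighted average over the quality model, and that value is banded into NSI levels:

```python
# Hypothetical metric scores in [0, 1] and weights; the paper's actual quality
# model, metric definitions, and NSI thresholds are not public.
def quality_value(metrics, weights):
    """Weighted average of metric scores, aggregated per the quality model."""
    total = sum(weights.values())
    return sum(metrics[name] * w for name, w in weights.items()) / total

def nsi_level(q):
    """Band a quality value into an NSI level (illustrative thresholds)."""
    if q >= 0.8:
        return "high"
    if q >= 0.6:
        return "medium"
    return "low"

metrics = {"reliability": 0.9, "usability": 0.7, "efficiency": 0.8}
weights = {"reliability": 3, "usability": 2, "efficiency": 1}
q = quality_value(metrics, weights)   # (0.9*3 + 0.7*2 + 0.8*1) / 6
```

In a real deployment the metric scores would come from the "required parameters" the user enters, and the weights from the chosen quality model.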
Procedia PDF Downloads 592
38526 An Automated Business Process Management for Smart Medical Records
Authors: K. Malak, A. Nourah, S.Liyakathunisa
Abstract:
Nowadays, healthcare services face many challenges, as they are becoming more complex and more in demand. Every detail of a patient’s interactions with healthcare providers is maintained in Electronic Health Records (EHR) and healthcare information systems (HIS). However, most of the existing systems are often focused on documenting what happens in the manual healthcare process rather than on providing the highest quality patient care. Healthcare business processes and stakeholders can no longer rely on manual processes; to provide better patient care and efficient utilization of resources, healthcare processes must be automated wherever possible. In this research, a detailed survey and analysis of the existing healthcare systems in Saudi Arabia is performed, and an automated smart medical healthcare business process model is proposed. Business process management methods and rules are followed in discovery, information collection, analysis, redesign, implementation, and performance improvement analysis in terms of time and cost. The simulation results show that the proposed smart medical records system can improve the quality of service by reducing time and cost and increasing efficiency.
Keywords: business process management, electronic health records, efficiency, cost, time
Procedia PDF Downloads 342
38525 Introduction of Digital Radiology to Improve the Timeliness in Availability of Radiological Diagnostic Images for Trauma Care
Authors: Anuruddha Jagoda, Samiddhi Samarakoon, Anil Jasinghe
Abstract:
In an emergency department, where every second counts for patient management, the timely availability of X-rays plays a vital role in early diagnosis and management of patients. Trauma care centers rely heavily on timely radiologic imaging for patient care, and radiology plays a crucial role in emergency department (ED) operations. A research study was carried out to assess the timeliness of X-ray availability and the total turnaround time at the Accident Service of the National Hospital of Sri Lanka, the premier trauma center in the country. A digital radiology system was implemented as an intervention to improve the timeliness of X-ray availability, and a post-implementation assessment was carried out to assess the effectiveness of the intervention. Reductions in all three aspects of waiting time, namely waiting for the initial examination by doctors, waiting until the X-ray is performed, and waiting for image availability, were observed after implementation of the intervention. However, the most significant improvement was seen in waiting time for image availability, and this reduction had an indirect impact on reducing the waiting times for the initial examination and for performing the X-ray. The most significant reduction in time for image availability was observed when performing 4-5 X-rays with the DR system. The least improvement in timeliness was seen in patients categorized as critical.
Keywords: emergency department, digital radiology, timeliness, trauma care
Procedia PDF Downloads 265
38524 A Dynamic Solution Approach for Heart Disease Prediction
Authors: Walid Moudani
Abstract:
The healthcare environment is generally perceived as being information rich yet knowledge poor, and there is a lack of effective analysis tools to discover hidden relationships and trends in the data. In fact, valuable knowledge can be discovered by applying data mining techniques in a healthcare system. In this study, a proficient methodology is presented for the extraction of significant patterns from coronary heart disease warehouses for heart attack prediction, which unfortunately continues to be a leading cause of mortality worldwide. For this purpose, we propose to enumerate dynamically the optimal subsets of the reduced features of high interest by using the rough sets technique combined with dynamic programming. We then validate the classification using a Random Forest (RF) decision tree ensemble to identify the risky heart disease cases. This work is based on a large amount of data collected from several clinical institutions, based on the medical profiles of patients. Moreover, experts’ knowledge in this field has been taken into consideration in order to define the disease and its risk factors, and to establish significant knowledge relationships among the medical factors. A computer-aided system was developed for this purpose based on a population of 525 adults. The performance of the proposed model is analyzed and evaluated against a set of benchmark techniques applied to this classification problem.
Keywords: multi-classifier decision tree, feature reduction, dynamic programming, rough sets
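The rough-sets step, enumerating minimal feature subsets (reducts) that still discriminate the decision classes, can be illustrated by brute-force search on a toy decision table; the paper's dynamic-programming enumeration over real clinical data is not reproduced here, and the toy table below is an invented example:

```python
from itertools import combinations

def is_consistent(rows, labels, features):
    """True if the chosen features never map one value pattern to two labels."""
    seen = {}
    for row, label in zip(rows, labels):
        key = tuple(row[f] for f in features)
        if seen.setdefault(key, label) != label:
            return False
    return True

def minimal_reducts(rows, labels):
    """Smallest feature subsets that keep the decision table consistent."""
    n_features = len(rows[0])
    for size in range(1, n_features + 1):
        found = [set(combo) for combo in combinations(range(n_features), size)
                 if is_consistent(rows, labels, combo)]
        if found:
            return found
    return []

# toy decision table: features 0 and 2 each determine the class alone,
# feature 1 carries no discriminating information
rows = [(0, 0, 1), (0, 1, 1), (1, 0, 0), (1, 1, 0)]
labels = [0, 0, 1, 1]
reducts = minimal_reducts(rows, labels)
```

Each reduct is a candidate reduced feature set; a classifier such as a random forest would then be trained and validated on those features rather than on the full attribute set.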
Procedia PDF Downloads 410
38523 A Proposal of Advanced Key Performance Indicators for Assessing Six Performances of Construction Projects
Authors: Wi Sung Yoo, Seung Woo Lee, Youn Kyoung Hur, Sung Hwan Kim
Abstract:
Large-scale construction projects are continuously increasing, and the need for tools to monitor and evaluate project success is emphasized. At the construction industry level, there are limitations in deriving performance evaluation factors that reflect the diversity of construction sites, and in systems that can objectively evaluate and manage performance. Additionally, there are difficulties in integrating the structured and unstructured data generated at construction sites and in deriving improvements. In this study, we propose Key Performance Indicators (KPIs) that enable performance evaluation reflecting the increased diversity of construction sites and the unstructured data generated, and we present a model for measuring performance with the derived indicators. The comprehensive performance of a unit construction site is assessed across 6 areas (Time, Cost, Quality, Safety, Environment, Productivity) and 26 indicators. We collect performance indicator information from 30 construction sites that meet legal standards and were completed successfully, and we apply data augmentation and optimization techniques to establish measurement standards for each indicator. In other words, the KPIs for construction site performance evaluation presented in this study provide standards for evaluating performance in six areas using institutional requirement data and document data. This can be expanded to establish a performance evaluation system that considers the scale and type of construction project. The indicators are also expected to serve as a comprehensive index for the construction industry and as basic data for tracking competitiveness at the national level and establishing policies.
Keywords: key performance indicator, performance measurement, structured and unstructured data, data augmentation
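A composite KPI of the kind described, aggregating indicator scores over the six areas, might be sketched as below. The indicator names, scores, and area weights are hypothetical; the paper's 26 indicators and calibrated measurement standards are not reproduced in the abstract.

```python
# Hypothetical indicator scores (0-100) grouped by the six performance areas.
site_scores = {
    "Time":         {"schedule_variance": 82, "milestone_hit_rate": 90},
    "Cost":         {"cost_variance": 75},
    "Quality":      {"rework_rate": 88, "inspection_pass_rate": 93},
    "Safety":       {"incident_rate": 97},
    "Environment":  {"waste_recycled": 70},
    "Productivity": {"output_per_crew_day": 85},
}
# Hypothetical area weights; a real KPI system would calibrate these.
weights = {"Time": 0.2, "Cost": 0.2, "Quality": 0.2,
           "Safety": 0.2, "Environment": 0.1, "Productivity": 0.1}

def area_score(indicators):
    """Average the indicator scores within one performance area."""
    return sum(indicators.values()) / len(indicators)

def composite_kpi(scores, weights):
    """Weighted sum of the six area scores: one number per construction site."""
    return sum(weights[a] * area_score(ind) for a, ind in scores.items())

print(round(composite_kpi(site_scores, weights), 1))
```

A single composite score per site makes cross-site comparison possible, while the per-area scores retain the diagnostic detail.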
Procedia PDF Downloads 42
38522 Internal Product Management: The Key to Achieving Digital Maturity and Business Agility for Manufacturing IT Organizations
Authors: Frederick Johnson
Abstract:
Product management has a long and well-established history within the consumer goods industry, despite being one of the most obscure aspects of brand management. Many global manufacturing organizations are now opting for external cloud-based Manufacturing Execution Systems (MES) to replace costly and outdated monolithic MES solutions. Other global manufacturing leaders are restructuring their organizations to support human-centered values, agile methodologies, and fluid operating principles. Still, industry-leading organizations struggle to apply an appropriate framework for managing evolving external MES solutions as internal "digital products." Product management complements these current trends in technology and philosophical thinking in the market. This paper discusses the central problems associated with adopting product management processes by analyzing its traditional theories and characteristics. Considering these ideas, it then constructs a translated internal digital product management framework by combining new and existing approaches and principles. The paper concludes by demonstrating the framework's capabilities and potential effectiveness in achieving digital maturity and business agility within a manufacturing environment.
Keywords: internal product management, digital transformation, manufacturing information technology, manufacturing execution systems
Procedia PDF Downloads 135
38521 Quantifying Individual Performance of Pakistani Cricket Players
Authors: Kasif Khan, Azlan Allahwala, Moiz Ali, Hasan Lodhi, Umer Amjad
Abstract:
The number of runs scored by batsmen and wickets taken by bowlers serves as a natural way of quantifying the performance of a cricketer. Traditionally, batsmen and bowlers are rated on their batting or bowling averages, respectively. However, in a game like cricket, it is not sufficient to evaluate performance on the basis of averages alone, and doing so introduces bias into the selection of batsmen and bowlers. The objective is to predict the best players and to compare their performance on the basis of venue, opponent, weather, and playing position. On the basis of these predictions, analyses, and comparisons, the best team is selected for Pakistan's next upcoming series. The system is built to aid analysts in finding the best possible team combination for Pakistan for a particular match and to provide them with advisories so that they can select the best possible team combination. This will also help the team management in identifying the ideal batting order and bowling order for each match.
Keywords: data analysis, Pakistan cricket players, quantifying individual performance, cricket
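The idea of moving beyond the plain batting average to a context-weighted score can be sketched as follows. The innings records, weighting factors, and thresholds are hypothetical illustrations, not the system's actual model.

```python
# Hypothetical innings records: runs scored plus two context attributes.
innings = [
    {"runs": 45, "venue": "away", "opponent_rank": 2},
    {"runs": 80, "venue": "home", "opponent_rank": 6},
    {"runs": 12, "venue": "away", "opponent_rank": 1},
]

def weight(inn):
    """Assumed context weighting: away runs and runs against
    top-3-ranked opponents count for more."""
    w = 1.0
    if inn["venue"] == "away":
        w *= 1.2
    if inn["opponent_rank"] <= 3:
        w *= 1.3
    return w

def weighted_average(innings):
    """Average of context-weighted runs across innings."""
    return sum(inn["runs"] * weight(inn) for inn in innings) / len(innings)

plain = sum(i["runs"] for i in innings) / len(innings)
print(round(plain, 1), round(weighted_average(innings), 1))
```

The two numbers diverge whenever low scores come in hard conditions, which is exactly the bias the abstract argues a plain average hides.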
Procedia PDF Downloads 297
38520 Deployment of Electronic Healthcare Records and Development of Big Data Analytics Capabilities in the Healthcare Industry: A Systematic Literature Review
Authors: Tigabu Dagne Akal
Abstract:
Electronic health records (EHRs) can help to store and maintain patient histories and support their appropriate handling for proper treatment and decision-making. Merging EHRs with big data analytics (BDA) capabilities enables healthcare stakeholders to provide effective and efficient treatments for chronic diseases. Though huge opportunities and efforts exist in the deployment of EHRs and the development of BDA, there are challenges in addressing the resources and organizational capabilities required to achieve competitive advantage and the sustainability of EHRs and BDA. The resource-based view (RBV), information systems (IS) theories, and non-IS theories should be extended to examine the organizational capabilities and resources required for successful data analytics in the healthcare industry. The main purpose of this study is to develop a conceptual framework for the development of healthcare BDA capabilities based on past works, so that researchers can extend it. A research question was formulated to guide the search strategy as the research methodology, and study selection was then carried out. Based on the selected studies, a conceptual framework for the development of BDA capabilities in healthcare settings was formulated.
Keywords: EHR, EMR, big data, big data analytics, resource-based view
Procedia PDF Downloads 131
38519 Modern Management Principles Enshrined in Ancient Vedic Texts
Authors: M. Kishore Kumar
Abstract:
The ancient Vedas and Upanishads are a treasure of knowledge gifted to the world by India. The four Vedas, a conglomerate of Hindu scriptures, contain many principles of modern management at the organisational as well as the individual level. They lay down the duties of a king, his ministers, and his citizens, and cite values for leadership. The Bhagavadgita (or ‘Gita’ in short), popularly cited as the Pancham (fifth) Veda, is said to have been delivered as a sermon by Lord Krishna about 5,000 years ago. In the midst of the Kurukshetra battle, the Gitopadesh addressed various aspects such as dharma (duties), karma (action), stithaprajna (stable mind), nishkama (detachment from results), and ethics. Arjun was steered to victory by Lord Krishna as his charioteer, and the 700-odd-verse holy text Bhagavadgita can become a valuable guide for all of us to achieve success in business management. Many parallels exist between modern-day management theories and the principles enshrined in Vedic texts.
Keywords: goal, motivation, leadership, mind, management
Procedia PDF Downloads 89
38518 Historic Fire Occurrence in Hemi-Boreal Forests: Exploring Natural and Cultural Scots Pine Multi-Cohort Fire Regimes in Lithuania
Authors: Charles Ruffner, Michael Manton, Gintautas Kibirkstis, Gediminas Brazaitas, Vitas Marozas, Ekaterine Makrickiene, Rutile Pukiene, Per Angelstam
Abstract:
In dynamic boreal forests, fire is an important natural disturbance that drives regeneration and mortality of living and dead trees, and thus successional trajectories. However, current forest management practices focused only on wood production have effectively eliminated fire as a stand-level disturbance. While this is generally well studied across much of Europe, in Lithuania little is known about the historic fire regime and the role fire plays as a tool in the sustainable management of future landscapes. Focusing on Scots pine forests, we explore: i) the relevance of fire disturbance regimes on the forestlands of Lithuania; ii) fire occurrence in the Dzukija landscape for dry upland and peatland forest sites; and iii) correlations of tree-ring data with climate variables to ascertain climatic influences on growth and fire occurrence. We sampled and cross-dated 132 Scots pine samples with fire scars from 4 dry pine forest stands and 4 peatland forest stands. The fire history of each sample was analyzed using standard dendrochronological methods and presented in FHAES format. Analyses of soil moisture and nutrient conditions revealed a strong probability (59%) of finding high fire frequency in Scots pine forests, which cover 34.5% of Lithuania’s current forestland. The fire history analysis revealed 455 fire scars and 213 fire events during the period 1742-2019. Within the Dzukija landscape, the mean fire interval was 4.3 years for the dry Scots pine forest and 8.7 years for the peatland Scots pine forest. However, our comparison of fire frequency before and after 1950 shows a marked decrease in mean fire interval. Our data suggest that the hemi-boreal forest landscapes of Lithuania provide strong evidence that fire, both human- and lightning-ignited, has been and should be a natural phenomenon, and that the examination of biological archives can be used to guide sustainable forest management into the future.
Currently, fire use is prohibited by law as a tool for forest management in Lithuania. We recommend introducing trials that use low-intensity prescribed burning of Scots pine stands as a regeneration tool to mimic natural forest disturbance regimes.
Keywords: biodiversity conservation, cultural burning, dendrochronology, forest dynamics, forest management, succession
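The mean fire interval (MFI) statistic reported above is computed from a composite fire chronology as the average gap between successive fire years. A minimal sketch, with hypothetical scar dates:

```python
def mean_fire_interval(fire_years):
    """Mean fire interval (MFI): average gap, in years, between
    successive fire events in a composite chronology."""
    years = sorted(set(fire_years))       # de-duplicate multi-scar years
    if len(years) < 2:
        raise ValueError("need at least two fire events")
    intervals = [b - a for a, b in zip(years, years[1:])]
    return sum(intervals) / len(intervals)

# Hypothetical composite fire chronology for one stand.
scars = [1742, 1751, 1755, 1764, 1768, 1777]
print(mean_fire_interval(scars))  # (1777 - 1742) / 5 intervals = 7.0
```

Computing the MFI separately for the pre-1950 and post-1950 portions of a chronology is how the before/after comparison in the abstract is made.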
Procedia PDF Downloads 200
38517 Application of Mathematical Sciences to Farm Management
Authors: Fahad Suleiman
Abstract:
Agriculture has been the mainstay of the nation’s economy in Nigeria. It provides food for the rapidly increasing population and raw materials for industry. People, especially rural dwellers, are gainfully employed on their crop farms and small-scale livestock farms for income. In farming, the availability of funds and time management are among the major factors that influence the system of farming in Nigeria, and mathematical knowledge is highly required for farms to be managed effectively. Farmers apply mathematics almost every day for a variety of tasks, ranging from measuring and weighing to land marking. This paper, therefore, explores some of the ways mathematics is used in farming. For instance, farmers use arithmetic in a variety of farm activities such as seed planting, crop harvesting, cultivation, and mulching. Mathematics is also important in helping farmers to know how much their livestock weighs, how much milk their cows produce, and their crop yield per acre, among others.
Keywords: agriculture, application, economic, farming, mathematics
Procedia PDF Downloads 249
38516 The Power of a Vulnerable State: The Rights Revolution and the Emergence of Human Resources Management Departments
Authors: Soheila Ghanbari
Abstract:
After the Civil Rights Act of 1964 was enacted, federal policy transformed employment rights. Equal employment opportunity law, occupational safety and health legislation, and fringe benefits regulations were established to ensure that employees have rights to equal protection, health and safety, and the benefits guaranteed by employers. Research analyzing data from 279 organizations over time found that these legal changes prompted organizations to establish personnel, antidiscrimination, safety, and benefits departments to ensure compliance. However, as institutionalization advanced, middle managers began to decouple these new offices from policy and to rationalize them in purely economic terms as components of the new human resources management model. This pattern is characteristic of the United States, where the Constitution frames government control of business as unlawful. It may help explain the long-standing absence of a theory of the state in organizational analysis and shed light on a puzzle pointed out by state theorists: the federal state is administratively weak but normatively strong.
Keywords: management, state, human, resources, employment
Procedia PDF Downloads 53
38515 Intelligent Earthquake Prediction System Based On Neural Network
Authors: Emad Amar, Tawfik Khattab, Fatma Zada
Abstract:
Predicting earthquakes is an important issue in the study of geophysics. Accurate prediction of earthquakes can help people take effective measures to minimize personal and economic losses, such as large casualties, destruction of buildings, and disruption of traffic, which can occur within a few seconds. The United States Geological Survey (USGS) provides reliable scientific information on earthquakes throughout history, and the preliminary database from the National Earthquake Information Center (NEIC) shows some useful factors for predicting an earthquake in a seismic area like the Aleutian Arc in the U.S. state of Alaska. The main advantage of this prediction method is that it does not require any assumptions; it makes predictions according to the future evolution of the object's time series. The article compares simulation results from trained BP and RBF neural networks against actual output results from the system calculations, focusing on the analysis of data relating to real earthquakes. Evaluation results show better accuracy and higher speed when using the radial basis function (RBF) neural network.
Keywords: BP neural network, prediction, RBF neural network, earthquake
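A minimal sketch of the RBF network idea the paper favours: a hidden layer of fixed Gaussian basis functions and a linear output layer trained by gradient descent on squared error. The centers, width, and synthetic 1-D target below are illustrative assumptions; the paper's seismic inputs are not reproduced in the abstract.

```python
import math

def gaussian(x, c, s):
    """Gaussian radial basis function centered at c with width s."""
    return math.exp(-((x - c) ** 2) / (2 * s ** 2))

class RBFNet:
    """Toy RBF network: fixed Gaussian centers, linear output layer
    trained by full-batch gradient descent on mean squared error."""
    def __init__(self, centers, sigma):
        self.centers, self.sigma = centers, sigma
        self.w = [0.0] * len(centers)

    def features(self, x):
        return [gaussian(x, c, self.sigma) for c in self.centers]

    def predict(self, x):
        return sum(wi * fi for wi, fi in zip(self.w, self.features(x)))

    def fit(self, xs, ys, lr=0.1, epochs=5000):
        for _ in range(epochs):
            grads = [0.0] * len(self.w)
            for x, y in zip(xs, ys):
                err = self.predict(x) - y
                for i, fi in enumerate(self.features(x)):
                    grads[i] += 2 * err * fi / len(xs)
            self.w = [wi - lr * g for wi, g in zip(self.w, grads)]

# Synthetic target generated from known weights, so the fit is checkable.
true_w = [1.0, -0.5, 1.0]
centers, sigma = [-1.0, 0.0, 1.0], 0.7
target = lambda x: sum(w * gaussian(x, c, sigma) for w, c in zip(true_w, centers))
xs = [i / 4 - 1 for i in range(9)]            # grid on [-1, 1]
ys = [target(x) for x in xs]

net = RBFNet(centers, sigma)
net.fit(xs, ys)
print(abs(net.predict(0.5) - target(0.5)) < 0.02)
```

Because the output layer is linear in the weights, training is a convex least-squares problem, which is one reason RBF networks often train faster than backpropagation (BP) networks with nonlinear hidden weights.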
Procedia PDF Downloads 496
38514 Present an Active Solar Energy System to Supply Heating Demands of the Teaching Staff Dormitory of Islamic Azad University of Ramhormoz
Authors: M. Talebzadegan, S. Bina , I. Riazi
Abstract:
The purpose of this paper is to present an active solar energy system to supply the heating demands of the teaching staff dormitory of the Islamic Azad University of Ramhormoz. The design takes into account the solar radiation and climate data of Ramhormoz town and is based on the daily warm water consumption for the health demands of the dormitory's 450 residents, equal to 27,000 litres of 50 °C water, and the heating requirements of a building with an area of 3,500 m² well protected by heatproof materials. First, the heating demands of the building were calculated; then a hybrid system made up of solar and fossil energies was developed; and finally, the design was economically evaluated. Since there is only roof space for 110 flat solar water heaters, the calculations were made to hybridize the solar water heating system with a heat pump system, in which solar energy contributes 67% of the heat generated. According to the calculations, the net present value (NPV) of the revenue stream exceeds the NPV of cash paid out in this project over three years, which makes it economically quite promising. The payback period of the project is 4 years. Also, the internal rate of return (IRR) of the project was 25%, which exceeds the bank interest rate in Iran and emphasizes the desirability of the project.
Keywords: solar energy, heat demand, renewable, pollution
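The economic evaluation rests on standard NPV, IRR, and payback-period calculations, which can be sketched as follows. The cashflow figures are hypothetical; the project's reported IRR (25%) and payback (4 years) come from its own data.

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] is the year-0 outlay (negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=0.0, hi=10.0, tol=1e-7):
    """Internal rate of return via bisection (assumes NPV crosses zero once
    between lo and hi, decreasing with rate)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def payback_period(cashflows):
    """First year in which cumulative cashflow turns non-negative."""
    total = 0
    for t, cf in enumerate(cashflows):
        total += cf
        if total >= 0:
            return t
    return None

# Hypothetical flows: initial equipment cost, then yearly energy savings.
flows = [-1000, 300, 300, 300, 300, 300]
print(round(irr(flows), 3), payback_period(flows))
```

The project is attractive when the IRR exceeds the relevant bank interest rate, which is exactly the comparison the abstract makes for Iran.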
Procedia PDF Downloads 421
38513 Conductivity-Depth Inversion of Large Loop Transient Electromagnetic Sounding Data over Layered Earth Models
Authors: Ravi Ande, Mousumi Hazari
Abstract:
One of the common geophysical techniques for mapping subsurface geo-electrical structures, extensive hydro-geological research, and engineering and environmental geophysics applications is the use of time-domain electromagnetic (TDEM)/transient electromagnetic (TEM) soundings. A large loop TEM system is made up of a large transmitter loop for energising the ground and a small receiver loop or magnetometer for recording the transient voltage or magnetic field in the air or on the surface of the earth, with the receiver at the center of the loop or at any point inside or outside the source loop. In general, one can acquire data using one of several configurations with a large loop source, namely with the receiver at the center point of the loop (central-loop method), at an arbitrary in-loop point (in-loop method), coincident with the transmitter loop (coincident-loop method), or at an arbitrary offset point (offset-loop method). Because of the mathematical simplicity of the expressions for the EM fields, as compared to the in-loop and offset-loop systems, the central-loop system (for ground surveys) and the coincident-loop system (for ground as well as airborne surveys) have been developed and used extensively for the exploration of mineral and geothermal resources, and for mapping contaminated groundwater caused by hazardous waste and the thickness of the permafrost layer. Because a proper analytical expression for the TEM response over a layered earth model for the large loop TEM system does not exist, the forward problem used in this inversion scheme is first formulated in the frequency domain and then transformed into the time domain using Fourier cosine or sine transforms. Using the EMLCLLER algorithm, the forward computation is initially carried out in the frequency domain.
EMLCLLER accordingly modifies the forward-calculation scheme in NLSTCI to compute frequency-domain responses before converting them to the time domain using Fourier cosine and/or sine transforms.
Keywords: time domain electromagnetic (TDEM), TEM system, geoelectrical sounding structure, Fourier cosine
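The frequency-to-time conversion step can be illustrated with a numerical Fourier cosine transform. For a causal real response, h(t) = (2/π) ∫₀^∞ Re[H(ω)] cos(ωt) dω. The sketch below checks this against the known transform pair H(ω) = τ/(1 + iωτ) ↔ h(t) = e^(−t/τ), which is an illustrative stand-in for the layered-earth kernels computed by EMLCLLER; the truncation limit and step size are arbitrary choices.

```python
import math

def cosine_transform_response(re_H, t, w_max=1000.0, dw=0.01):
    """Recover a causal time-domain response h(t) from the real part of its
    frequency-domain transfer function via h(t) = (2/pi) * int Re[H(w)] cos(wt) dw,
    evaluated with the trapezoidal rule on [0, w_max]."""
    n = int(w_max / dw)
    total = 0.5 * (re_H(0.0) + re_H(w_max) * math.cos(w_max * t))
    for k in range(1, n):
        w = k * dw
        total += re_H(w) * math.cos(w * t)
    return (2.0 / math.pi) * total * dw

# Known pair: H(w) = tau / (1 + i*w*tau)  <->  h(t) = exp(-t/tau), t > 0.
tau = 1.0
re_H = lambda w: tau / (1.0 + (w * tau) ** 2)
print(abs(cosine_transform_response(re_H, 1.0) - math.exp(-1.0)) < 1e-2)
```

In practice, production codes use optimized digital filters rather than brute-force quadrature, but the underlying cosine/sine-transform relation is the same.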
Procedia PDF Downloads 92
38512 Risk and Uncertainty in Aviation: A Thorough Analysis of System Vulnerabilities
Authors: C. V. Pietreanu, S. E. Zaharia, C. Dinu
Abstract:
Hazard assessment and risk quantification are key components for estimating the impact of existing regulations. But since regulatory compliance cannot cover all risks in aviation, the authors point out that by studying causal factors and eliminating uncertainty, an accurate analysis can be outlined. The research begins by delimiting notions, as confusion over the terms has, over time, been reflected in less rigorous analyses. Throughout this paper, it is emphasized that variation in human performance and organizational factors represents the biggest threat from an operational perspective. Therefore, the advanced risk assessment methods analyzed by the authors aim to understand vulnerabilities of the system arising from its nonlinear behavior. Ultimately, the mathematical modeling of existing hazards and risks, with uncertainty eliminated, implies establishing an optimal solution (i.e., risk minimization).
Keywords: control, human factor, optimization, risk management, uncertainty
Procedia PDF Downloads 249
38511 Breast Cancer Survivability Prediction via Classifier Ensemble
Authors: Mohamed Al-Badrashiny, Abdelghani Bellaachia
Abstract:
This paper presents a classifier ensemble approach for predicting the survivability of breast cancer patients using the latest database version of the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute. The system consists of two main components: a feature selection component and a classifier ensemble component. The feature selection component divides the features in the SEER database into four groups and then tries to find the most important features among the four groups, maximizing the weighted average F-score of a given classification algorithm. The ensemble component uses three different classifiers, each of which models a different set of features from SEER selected through the feature selection module. On top of them, another classifier gives the final decision based on the output decisions and confidence scores of each of the underlying classifiers. Different classification algorithms have been examined; the best setup found uses the decision tree, Bayesian network, and Naïve Bayes algorithms for the underlying classifiers and Naïve Bayes for the classifier ensemble step. The system outperforms all systems published to date when evaluated against the exact same SEER data (period 1973-2002), giving an 87.39% weighted average F-score compared to 85.82% and 81.34% for the other published systems. By increasing the data size to cover the whole database (period 1973-2014), the overall weighted average F-score jumps to 92.4% on a held-out unseen test set.
Keywords: classifier ensemble, breast cancer survivability, data mining, SEER
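The two-level stacking architecture described (base classifiers trained on separate feature groups, with a meta-step combining their decisions and confidence scores) can be sketched as below. Nearest-centroid base learners and a confidence-weighted vote stand in for the paper's decision tree/Bayesian network/Naïve Bayes setup, and the feature groups and records are hypothetical.

```python
import math

def nearest_centroid(train, features):
    """Build a base classifier on one feature subset: predict the class with
    the nearest centroid, with a simple distance-ratio confidence score."""
    groups = {}
    for x, label in train:
        groups.setdefault(label, []).append([x[f] for f in features])
    cents = {lab: [sum(col) / len(col) for col in zip(*pts)]
             for lab, pts in groups.items()}

    def classify(x):
        v = [x[f] for f in features]
        d = {lab: math.dist(v, c) for lab, c in cents.items()}
        best = min(d, key=d.get)
        runner_up = min(d[l] for l in d if l != best)
        conf = runner_up / (d[best] + runner_up + 1e-12)  # -> 1.0 when well separated
        return best, conf
    return classify

def stacked_predict(bases, x):
    """Meta-step: confidence-weighted vote over base predictions."""
    votes = {}
    for clf in bases:
        label, conf = clf(x)
        votes[label] = votes.get(label, 0.0) + conf
    return max(votes, key=votes.get)

# Hypothetical two-class data with three feature groups (echoing SEER's grouping).
train = [
    ({"a": 0.0, "b": 0.1, "c": 5.0}, "survived"),
    ({"a": 0.2, "b": 0.0, "c": 4.5}, "survived"),
    ({"a": 1.0, "b": 0.9, "c": 1.0}, "not"),
    ({"a": 0.9, "b": 1.1, "c": 0.5}, "not"),
]
bases = [nearest_centroid(train, fs) for fs in (["a"], ["b"], ["c"])]
print(stacked_predict(bases, {"a": 0.1, "b": 0.2, "c": 4.0}))  # -> survived
```

In the paper's full setup, the meta-classifier is itself trained (Naïve Bayes over the base outputs and confidences) rather than a fixed voting rule.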
Procedia PDF Downloads 329